Tagged: Particle Physics

  • richardmitnick 3:26 pm on October 23, 2017 Permalink | Reply
Tags: ATLAS and CMS join forces to tackle top-quark asymmetry, Particle Physics

    From CERN: “ATLAS and CMS join forces to tackle top-quark asymmetry” 


    20 Oct 2017
    Matthew Chalmers
    Henry Bennie

Event display of a tt̄ event candidate in the 2015 data (Image: ATLAS/CERN)

All matter around us is made of elementary particles called quarks and leptons. Each group consists of six particles, which are related in pairs, or “generations” – the up quark and the down quark form the first, lightest and most stable generation, followed by the charm quark and strange quark, then the top quark and bottom (or beauty) quark, the heaviest and least stable generation. (Image: Daniel Dominguez/CERN)

    In their hunt for new particles and phenomena lurking in LHC collisions, the ATLAS and CMS experiments have joined forces to investigate the top quark. As the heaviest of all elementary particles, weighing almost as much as an atom of gold, the top quark is less well understood than its lighter siblings. With the promise of finding new physics hidden amongst the top quark’s antics, ATLAS and CMS have combined their top-quark data for the first time.

    There were already hints that the top quark didn’t play by the rules in data collected at the Tevatron collider at Fermilab in the US (the same laboratory that discovered the particle in 1995).

The Tevatron at Fermilab, with its DZero and CDF detectors

    Around a decade ago, researchers found that, when produced in pairs from the Tevatron’s proton-antiproton collisions, top quarks tended to be emitted in the direction of the proton beam, while anti-tops aligned in the direction of the antiproton beam. A small forward-backward asymmetry is predicted by the Standard Model, but the data showed the measured asymmetry to be tantalisingly bigger than expected, potentially showing that new particles or forces are influencing top-quark pair production.
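In the standard notation (a textbook definition, not a formula quoted from the article), this forward-backward asymmetry counts events by the sign of the top–antitop rapidity difference along the proton direction:

A_{\mathrm{FB}} = \frac{N(\Delta y > 0) - N(\Delta y < 0)}{N(\Delta y > 0) + N(\Delta y < 0)}, \qquad \Delta y = y_t - y_{\bar t},

so a positive A_FB means top quarks preferentially follow the proton beam, as the Tevatron data suggested.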

    “As physicists, when we see something like this, we get excited,” says ATLAS researcher Frederic Deliot. If the asymmetry is much larger than predicted, it means “there could be lots of new physics to discover.”

The forward-backward asymmetry measured at the Tevatron cannot be seen at the LHC because the LHC collides protons with protons, not antiprotons. But a related charge asymmetry – top antiquarks are produced slightly more centrally in the LHC’s collisions, while top quarks end up slightly more forward – can be measured. The Standard Model predicts the effect to be small (around 1%) but, as with the forward-backward asymmetry, it could be made larger by new physics. The ATLAS and CMS experiments both measured the asymmetry by studying differences in the angular distributions of top quarks and antiquarks produced at the LHC at energies of 7 and 8 TeV.
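As a toy illustration of the counting behind such a measurement (a minimal sketch with a hypothetical event record, not the experiments’ analysis code), the charge asymmetry A_C can be estimated from reconstructed rapidities like this:

#include <cmath>
#include <iostream>
#include <vector>

// Hypothetical per-event record: reconstructed rapidities of the top and antitop.
struct TTbarEvent { double yTop, yAntitop; };

// A_C = [N(Δ|y|>0) − N(Δ|y|<0)] / [N(Δ|y|>0) + N(Δ|y|<0)], with Δ|y| = |y_t| − |y_tbar|.
// A positive A_C means top quarks sit at larger |y| (more forward) than antitops.
double chargeAsymmetry(const std::vector<TTbarEvent>& events) {
    long nPos = 0, nNeg = 0;
    for (const auto& ev : events) {
        const double dAbsY = std::fabs(ev.yTop) - std::fabs(ev.yAntitop);
        if (dAbsY > 0) ++nPos;
        else if (dAbsY < 0) ++nNeg;
    }
    const long n = nPos + nNeg;
    return n ? static_cast<double>(nPos - nNeg) / n : 0.0;
}

int main() {
    // Three mock events, for illustration only.
    const std::vector<TTbarEvent> events = {{1.2, 0.4}, {0.1, 0.9}, {2.0, 1.1}};
    std::cout << "A_C = " << chargeAsymmetry(events) << "\n";  // prints 0.333...
}

In the real measurements the counts are corrected for detector effects and quoted with systematic uncertainties; the sketch only shows the definition at work.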

    Alas, individually and combined, their results show no deviation from the latest Standard Model calculations.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    These calculations have in fact recently been improved, and show that the predicted asymmetry is slightly higher than previously thought. This, along with improvements in data analysis, even brings the earlier Tevatron result into line with the Standard Model.

ATLAS and CMS will continue to subject the heavyweight top quark to tests at energies of 13 TeV to see if it deviates from its expected behaviour, including precision measurements of its mass and interactions with other Standard Model particles. But measuring the asymmetry will get even tougher, because the effect is predicted to be half as big at the higher energy. “It’s going to be difficult,” says Deliot. “It will be possible to explore using the improved statistics at higher energy, but it is clear that the space for new physics has been severely restricted.”

    The successful combination of the charge-asymmetry measurements was achieved within the LHC top-quark physics working group, where scientists from ATLAS and CMS and theory experts work together intensively towards improving the interplay between theory and the two experiments, explains CMS collaborator Thorsten Chwalek. “Although the combination of ATLAS and CMS charge asymmetry results didn’t reveal any hints of new physics, the exercise of understanding all the correlations between the measurements was very important and paved the way for future ATLAS+CMS combinations in the top-quark sector.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 1:16 pm on October 20, 2017 Permalink | Reply
Tags: Particle Physics, Scientists make rare achievement in study of antimatter

    From Symmetry: “Scientists make rare achievement in study of antimatter” 



    10/19/17
    Kimber Price

Image: Maximilien Brice, Julien Marius Ordan, CERN

    Through hard work, ingenuity and a little cooperation from nature, scientists on the BASE experiment vastly improved their measurement of a property of protons and antiprotons.

BASE: Baryon Antibaryon Symmetry Experiment. (Image: Maximilien Brice)

    Scientists at CERN are celebrating a recent, rare achievement in precision physics: Collaborators on the BASE experiment measured a property of antimatter 350 times as precisely as it had ever been measured before.

The BASE experiment looks for undiscovered differences between protons and their antimatter counterparts, antiprotons. The result, published in the journal Nature, uncovered no such difference, but BASE scientists are hopeful that the leap in the precision of their measurement has brought them closer to a discovery.

    “According to our understanding of the Standard Model [of particle physics], the Big Bang should have created exactly the same amount of matter and antimatter, but [for the most part] only matter remains,” says BASE Spokesperson Stefan Ulmer.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    This is strange because when matter and antimatter meet, they annihilate one another. Scientists want to know how matter came to dominate our universe.

    “One strategy to try to get hints to understand the mechanisms behind this matter-antimatter symmetry is to compare the fundamental properties of matter and antimatter particles with ultra-high precision,” Ulmer says.

    Scientists on the BASE experiment study a property called the magnetic moment. The magnetic moment is an intrinsic value of particles such as protons and antiprotons that determines how they will orient in a magnetic field, like a compass. Protons and antiprotons should behave exactly the same, other than their charge and direction of orientation; any differences in how they respond to the laws of physics could help explain why our universe is made mostly of matter.
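Concretely, in a Penning trap the magnetic moment is extracted from the ratio of two measurable frequencies (the standard trap relation, stated here for orientation rather than quoted from the article):

\frac{g}{2} = \frac{\nu_L}{\nu_c}, \qquad \mu = \frac{g}{2}\,\mu_N,

where \nu_L is the spin-precession (Larmor) frequency, \nu_c the cyclotron frequency of the trapped particle and \mu_N the nuclear magneton; comparing the g-factors of the proton and the antiproton is the experiment’s test of matter–antimatter symmetry.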

    This is a challenging measurement to make with a proton. Measuring the magnetic moment of an antiproton is an even bigger task. To prevent antiprotons from coming into contact with matter and annihilating, scientists need to house them in special electromagnetic traps.

    While antiprotons generally last less than a second, the ones used in this study were placed in a unique reservoir trap in 2015 and used one by one, as needed, for experiments. The trapped antimatter survived for more than 400 days.

During the last year, Ulmer and his team worked to improve the precision of the most sophisticated techniques developed for this measurement over the last decade.

They did this by improving their cooling methods. Antiprotons at temperatures close to absolute zero move less than room-temperature ones, making them easier to measure.

    Previously, BASE scientists had cooled each individual antiproton before measuring it and moving on to the next. With the improved trap, the antiprotons stayed cool long enough for the scientists to swap an antiproton for a new one as soon as it became too hot.

    “Developing an instrument stable enough to keep the antiproton close to absolute zero for 4-5 days was the major goal,” says Christian Smorra, the first author of the study.

This let them collect data more rapidly than ever before. Combining the new instrument with a technique that measures two particles simultaneously allowed them to beat their own record from last year’s measurement by a long shot.

    “This is very rare in precision physics, where experimental efforts report on factors of greater than 100 magnitude in improvement,” Ulmer says.

    The results confirm that the two particles behave exactly the same, as the laws of physics would predict. So the mystery of the imbalance between matter and antimatter remains.

    Ulmer says that the group will continue to improve the precision of their work. He says that, in five to 10 years, they should be able to make a measurement at least twice as precise as this latest one. It could be within this range that they will be able to detect subtle differences between protons and antiprotons.

    “Antimatter is a very unique probe,” Ulmer says. “It kind of watches the universe through very different glasses than any matter experiments. With antimatter research, we may be the only ones to uncover physics treasures that would help explain why we don’t have antimatter anymore.”

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:17 pm on October 17, 2017 Permalink | Reply
Tags: HBOOK and PAW and ROOT and GEANT, Particle Physics, René Brun

    From ALICE at CERN: “40 Years of Large Scale Data Analysis in HEP: Interview with René Brun” 


    16 October 2017
    Virginia Greco

Over a 40-year career at CERN, René Brun developed a number of software packages that became widely used in High Energy Physics. For these fundamental contributions he was recently awarded a special prize of the EPS High Energy Particle Physics Division. We talked with him about the key events of this (hi)story.

René Brun giving a seminar at CERN (on October 4, 2017) about “40 Years of Large Scale Data Analysis in HEP – the HBOOK, Paw and Root Story”. [Credit: Virginia Greco]

It is hard to imagine that one and the same person can be behind many of the most important and most widely used software packages developed at CERN and in high-energy physics: HBOOK, PAW, ROOT and GEANT. This passionate and visionary person is René Brun, now an honorary member of CERN, who was recently awarded a special prize of the EPS High Energy Particle Physics Division “for his outstanding and original contributions to the software tools for data management, detector simulation, and analysis that have shaped particle and high energy physics experiments for many decades”. Over a 40-year career at CERN, he worked with various brilliant scientists, and we cannot forget that the realization of such endeavours is always the product of a collaborative effort. Nevertheless, René has had the undoubted merit of conceiving new ideas, proposing projects and working hard and enthusiastically to transform them into reality.

One of his creations, ROOT, is a data analysis tool widely used in high-energy and nuclear physics experiments, at CERN and in other laboratories. It has already spread beyond the boundaries of physics and is now being applied in other scientific fields and even in finance. GEANT is an extremely successful software package developed by René Brun, which makes it possible to simulate physics experiments and particle interactions in detectors. Its latest version, GEANT4, is currently the first choice of particle physicists dealing with detector simulations.

But before ROOT and GEANT4, which are very well known even among the youngest physicists, many other projects had been proposed and software tools developed. It is a fascinating story, which René was invited to tell in a recent colloquium organized at CERN by the EP department.

As he recounts, it all started in 1973, when he was hired into the Data Handling (DD) division at CERN to work with Carlo Rubbia on the R602 experiment at the ISR. His duty was to help develop a special hardware processor for the online reconstruction of the collision patterns. But since this development was moving slowly and was not occupying much of his work time, René was asked to write some software for the event reconstruction in multiwire proportional chambers. “At that time, I hated software,” René confesses, smiling. “I had written software during my PhD thesis, while studying in Clermont-Ferrand and working at CERN during the weekends, and I hadn’t really enjoyed it. I had joined Rubbia’s group with the ‘promise’ that I would work on hardware, but very quickly I became a software guy again…”

In a short time, René implemented in software (programming in FORTRAN IV) what they could not realize in hardware and, in addition, he developed a histogram package called HBOOK. It allowed a very basic analysis of the data: creating histograms, filling them and sending the output to a line printer. He also wrote a program called HPLOT, specialized in drawing the histograms generated by HBOOK.
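For readers who have never seen that workflow, the book–fill–output cycle HBOOK introduced survives almost unchanged in its modern descendant ROOT. A minimal sketch in ROOT’s C++ (an illustration of the pattern, not HBOOK’s Fortran):

#include "TFile.h"
#include "TH1F.h"
#include "TRandom.h"

// Book a histogram, fill it, and write it out – the cycle HBOOK pioneered.
void book_and_fill() {
    TFile out("hist.root", "RECREATE");                            // output file
    TH1F h("h1", "A booked histogram;x;entries", 100, -5.0, 5.0);  // "booking"
    for (int i = 0; i < 10000; ++i)
        h.Fill(gRandom->Gaus(0.0, 1.0));                           // "filling" with toy data
    h.Print("all");  // text dump of the bin contents, the line-printer equivalent
    h.Write();       // persist to the file
}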

    At that time, there were no graphic devices, so the only way to visualize histograms was printing them using a line printer, and programs were written in the form of punched cards.

    René remembers with affection the time spent punching cards, not for the procedure itself, which was slow and quite tedious, but for the long chats he used to have in the room where the card punchers and printers of the DD department were sitting, as well as in the cafeteria nearby. In those long hours, he could discuss ideas and comment on new technologies with colleagues.

A huge step forward was made possible by the introduction of the teletype, which replaced the card punchers. Users could write programs to a disk file and communicate with a central machine, called FOCUS, while – at the same time – seeing what they were doing on a roll of paper, as on a typewriter. “The way it worked can make people smile today,” René recounts. “To log in to FOCUS, one had to type a command which caused a red light to flash in the computer centre. Seeing the light, the operator would mount the tape of the connected person into the memory of the machine, and that person could then run a session on the disk. When the user logged out, the session was dumped back to tape. You can imagine the traffic! But this was still much faster than punching cards.”

Some time later, the teletype was in turn replaced by a Tektronix 4010 terminal, which brought a big revolution, since it made it possible to display results in graphic form. This new, very expensive device allowed René to speed up the development of his software: HBOOK first, then another package called ZBOOK and the first version of GEANT. Created in 1974 with his colleagues in the Electronic Experiments (EE) group, GEANT1 was a tool for performing simple detector simulations. Gradually they added features to this software and were able to generate collision simulations: GEANT2 was born.

In 1975 René joined the NA4 experiment, a deep inelastic muon scattering experiment in the North Area, led by Carlo Rubbia. There he collaborated on the development of new graphic tools that allowed histograms to be printed on a device called a CalComp plotter. This machine, which worked with a 10-metre-long roll of paper, offered much better resolution than line printers, but was very expensive. In 1979 a microfilm system was introduced: histograms saved on film could be inspected before being sent to the plotter, so that only the interesting ones were printed. This reduced the cost of using the CalComp.

René was then supposed to follow Rubbia to the UA1 experiment, for which he had been doing many simulations – “without knowing that I was simulating for UA1,” René highlights. But instead, at the end of 1980, he joined the OPAL experiment, where he performed all the simulations and created GEANT3.

    While working on the HBOOK system, in 1974 René had developed a memory management and I/O system called ZBOOK. This tool was an alternative to the HYDRA system, which was being developed in the bubble chambers group by the late Julius Zoll (also author of another management system called Patchy).

Thinking it pointless to have two competing systems, in 1981 the late Emilio Pagiola proposed the development of a new software package called GEM. While three people were working hard on the GEM project, René and Julius together started to run benchmarks comparing their systems, ZBOOK and HYDRA, with GEM. Through these tests, they came to the conclusion that the new system was far slower than theirs.

    In 1983 Ian Butterworth, the then Director for Computing, decided that only the ZBOOK system would be supported at CERN and that GEM had to be stopped, and HYDRA was frozen. “My group leader, Hans Grote, came to my office, shook my hand and told me: ‘Congratulations René, you won.’ But I immediately thought that this decision was not fair, because actually both systems had good features and Julius Zoll was a great software developer.”

As a consequence of this decision, René and Julius joined forces to develop a package integrating the best features of both ZBOOK and HYDRA. The new project was called ZEBRA, from the combination of the names of the two original systems. “When Julius and I announced that we were collaborating, Ian Butterworth immediately called both of us to his office and told us that, if the ZEBRA system was not functioning within six months, we would be fired from CERN. But in fact, less than two months later we were already able to show a running first version of the ZEBRA system.”

At the same time, histogram and visualization tools were under development. René put together an interactive version of HBOOK and HPLOT, called HTV, which ran on Tektronix machines. But in 1982 the advent of personal workstations marked a revolution. The first personal workstation introduced in Europe, the Apollo, represented a leap in characteristics and performance: it was faster and had more memory and a better user interface than any previous device. “I was invited by the Apollo company to go to Boston and visit them,” René recounts. “When I first saw the Apollo workstation, I was shocked. I immediately realized that it could speed up our development by a factor of 10. I set to work, and I think that in just three days I adapted some 20,000 lines of code for it.”

The work of René in adapting HTV for the Apollo workstation attracted the interest of the late Rudy Böck, Luc Pape and Jean-Pierre Revol from the UA1 collaboration, who also suggested some improvements. Therefore, in 1984 the three of them drew up a proposal for a new package, which would be based on HBOOK and ZEBRA, that they called PAW, for Physics Analysis Workstation.

The PAW team: (from the left) René Brun, Pietro Zanarini, Olivier Couet (standing) and Carlo Vandoni.

    After a first period of uncertainties, the PAW project developed quickly and many new features were introduced, thanks also to the increasing memory space of the workstations. “At a certain point, the PAW software was growing so fast that we started to receive complaints from users who could not keep up with the development,” says René smiling. “Maybe we were a bit naïve, but certainly full of enthusiasm.”

The programming language generally used for scientific computing was FORTRAN. At that time FORTRAN 77 (introduced in 1977) was widespread in the high-energy physics community; the main reason for its success was that it was well structured and quite easy to learn. Besides, very efficient implementations were available on all the machines in use at the time. As a consequence, when the new FORTRAN 90 appeared, it seemed obvious that it would replace FORTRAN 77 and be as successful as the previous version. “I remember well the leader of the computing division, Paolo Zanella, saying: ‘I don’t know what the next programming language will do, but I know its name: FORTRAN.’”

In 1990 and 1991 René, together with Mike Metcalf, a great expert in FORTRAN, worked hard to adapt the ZEBRA package to FORTRAN 90. But this effort did not lead to a satisfactory result, and discussions arose about whether to keep working with FORTRAN or move to another language. It was the period when object-oriented programming was taking its first steps, and also when Tim Berners-Lee joined René’s group.

Berners-Lee was supposed to develop a documentation system, called XFIND, to replace the previous FIND, which could run only on IBM machines; the new system had to be usable on other devices. He believed, though, that the procedure he was supposed to implement was a bit clumsy and certainly not the best approach to the problem. So he proposed a different solution with a more decentralized and adaptable approach, which first of all required a work of standardization. In this context, Berners-Lee developed the by-now-very-famous idea of World Wide Web servers and clients, implemented using an object-oriented language (Objective-C).

It was a very busy period, because the phase of design and simulation of the experiments for the new LHC accelerator had been launched. It was important to make a decision about the programming language and the software tools to use in these new projects.

At the Erice workshop organized by INFN in November 1990, and then at the Computing in High Energy Physics (CHEP) conference in Annecy (France) in September 1992, the high-energy physics “software gurus” of the world gathered to discuss programming languages and possible directions for software in HEP. Among the many languages proposed were Eiffel, Prolog, Modula-2 and others.

In 1994 two Research and Development (RD) projects were launched: RD44, with the objective of implementing a new version of GEANT in C++ (which would become GEANT4), and RD45, aiming to investigate object-oriented database solutions for the LHC experiments.

According to René, his division was split into three opinion groups: those who wanted to stay with FORTRAN 90, those who bet on C++ and those who were interested in using commercial products. “I presented a proposal to develop a package that would take PAW into the OO world. But the project, which I called ZOO, was rejected and I was even invited to take a sabbatical leave,” René admits.

This blow, though, later proved to be a stroke of luck for René. His division leader, David Williams, suggested that he join the NA49 experiment in the North Area, which needed somebody to help develop its software. At first, he refused. He had been leading both the GEANT and the PAW projects for years, doing simulations and developing software for many different groups and applications, so going back to work within a single experiment struck him as a big limitation.

But he gave it a second thought and realized that it was an opportunity to take some time to develop new software, with total freedom. He went to visit the NA49 building on the Prevessin site and, seeing pine trees and squirrels from the windows, he felt that it was indeed the kind of quiet environment he needed for his new project. He therefore moved his workstation from his office to the Prevessin site (“I did it during a weekend, without even telling David Williams”) and, while working for NA49, taught himself C++ by converting a large part of his HBOOK software into the new OO language.

At the beginning of 1995, René was joined in NA49 by Fons Rademakers, with whom he had already collaborated. The two of them worked very hard for several months and produced the first version of what became the famous ROOT system. The name comes simply from combining the starting letters of the two founders’ email addresses (René and Rdm, for Rademakers), the double O of Object Oriented and the word Technology. But the meaning of the word ‘root’ also fitted well with its being a basic framework on which more software could be developed, and with the use of tree structures in its architecture.
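The trees in question are ROOT’s TTree objects, which store event data in a branch-per-variable structure. A minimal sketch of writing one (the variables and values are illustrative only):

#include "TFile.h"
#include "TTree.h"

// Write a small event tree: one branch per variable, one Fill() per event.
void write_tree() {
    TFile f("events.root", "RECREATE");
    TTree tree("events", "toy event tree");
    double energy  = 0.0;
    int    ntracks = 0;
    tree.Branch("energy",  &energy,  "energy/D");
    tree.Branch("ntracks", &ntracks, "ntracks/I");
    for (int i = 0; i < 1000; ++i) {
        energy  = 0.1 * i;   // toy values
        ntracks = i % 20;
        tree.Fill();         // append one entry (one "event")
    }
    tree.Write();            // the tree can be browsed and analysed later
}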

    In November of the same year, René gave a seminar to present the ROOT system. “The Computing Division auditorium was unexpectedly crowded!” René recalls, “I think it was because people thought that Fons and I had disappeared from the software arena, while all of a sudden we were back again!” And actually the ROOT system generated considerable interest.

But while René and Fons were completely absorbed by the work on their new software package, the RD45 project, which had the mandate to decide what new software should be adopted by the LHC experiments, had proposed using the commercial product Objectivity, and a lot of work was under way to develop applications around it to meet HEP’s needs. According to René, there was a clear intention to obstruct the development and diffusion of ROOT. In spring 1996 the CERN director for computing, Lorenzo Foa, declared that the ROOT project was to be considered a private initiative of NA49, not supported by the CERN management, and that the official line of development was the one around Objectivity.

“I think that the LHC Computing Board didn’t have the right insight into the architecture of these software tools to be able to judge which solution was the best. Thus, they had to trust what they were told,” René comments. “It is always a problem when there is such a divide between the experts – and users – working on something and the people who have to take the important decisions.”

Nevertheless, René and Fons continued developing ROOT and implementing new features, taking advantage of the lessons learnt from the previous software packages (in particular the requests and criticisms of their users). In addition, they followed the development of the official Objectivity line closely, in order to know what the people using it were looking for and what the problems and difficulties were. “The more we looked into Objectivity, the more we realized it could not meet the needs of our community,” René adds. “We knew that the system would fail and that eventually people would realize it. This gave us even more energy and motivation to work hard and improve our product.”

They had continuous support from the NA49 and ALICE collaborations, as well as from many people in ATLAS and CMS, who saw great potential in their software package. At the time, René was collaborating with many people in both experiments, including Fabiola Gianotti and Daniel Froidevaux, in particular on detector simulations. Besides, many users trusted them because of the relationship built up over many years through the user support for PAW and GEANT.

Things started to change when interest in ROOT arose outside CERN. In 1998, the two Fermilab experiments, CDF and D0, decided to discuss the future of their software approach in view of the upcoming Run II of the Tevatron. Hence, they opened two calls for proposals of software solutions, one for data storage and one for data analysis and visualization. René submitted ROOT to both calls. The proposals were discussed during the CHEP conference in Chicago, and on the last day it was publicly announced that CDF and D0 would adopt ROOT. “I was not expecting it,” says René. “I remember that when the announcement was made, everybody turned and looked at me.” Soon after, the RHIC experiments at Brookhaven National Laboratory took the same decision. The BaBar experiment at SLAC, after years spent attempting to use Objectivity, had realized that it was not as good a system as expected, and so moved to ROOT as well.

Gradually, it became clear that the HEP community was ‘naturally’ moving towards ROOT, so the CERN management had to accept the situation and, eventually, support it. But this happened only in 2002. With more manpower allocated to the project, ROOT continued to develop fast and the number of users increased dramatically. It also started to spread to other branches of science and into the financial world. “In 2010, we had on average 12,000 downloads per month of the software package, and the ROOT website had more visitors than the CERN one.”

The logo of the ROOT software package.

René retired in 2012, but his two most important brainchildren, ROOT and GEANT, keep growing thanks to the work of many young scientists. “I think that it is essential to have a continuous stimulus that pushes you to improve your products and come up with new solutions. For this, the contribution of young people is very important,” comments René. But, as he admits, what really made him and his colleagues work hard for so many years is the fact that the software packages they were developing always had competitors and, in many cases, were challenged and even obstructed. “When you are opposed, but you know you are right, you are condemned to succeed.”

The great attention to users’ needs has also been very important, because it helped shape the software and build a relationship of trust with people. “I have always said that you have to give user support the highest priority,” René explains. “If you reply to a request in 10 minutes you get 10 points, in one hour you get 2 points, and in one day you are already at −10 points. Answering questions and comments is fundamental, because if the users are satisfied with the support you give them, they are willing to trust what you propose next.”

Now that he is retired, René still follows software development at CERN, but only as an external observer. This does not mean that he has set aside his scientific interests; on the contrary, he is now dedicating most of his energy to a more theoretical project, developing a physics model. In his spare time, he likes gardening. He loves flowers, but he cannot help looking at them with a scientific eye: “A colleague of mine, who is a mathematician, and I developed a mathematical model of the way flowers are structured and grow.”

    Brilliant minds are always at work.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 1:59 pm on October 17, 2017 Permalink | Reply
Tags: Eliane Epple, Particle Physics

    From ALICE at CERN: Women in STEM – “Focus on Eliane Epple” 


    16 October 2017
    Virginia Greco

Eliane Epple
A postdoctoral researcher at Yale University, Eliane is working on an analysis involving hard scattering events that produce direct photons and has recently done her first shift as Run Manager for ALICE.

When she started studying physics in Stuttgart, her hometown, Eliane Epple was already passionate about particle physics. But since it was not possible to specialize in this field at her university, after two years she moved to Munich to attend the Technical University of Munich (TUM). Here she followed courses for two more years before joining a research project led by Prof. Laura Fabbietti, who had just received a big grant and was starting her research group. The subject of Eliane’s Diploma thesis was the study of the interactions of kaons – and other particles containing strange quarks – with nuclear matter (protons and neutrons). In more detail, for her Diploma she analyzed the decay products of a resonance called Λ(1405), which some theories treat as a molecular bound state of an anti-kaon and a nucleon. It is in this sense a precursor of the kaonic nuclear cluster that she later studied during her PhD, still working with Prof. Fabbietti.

In particular, Eliane and colleagues were investigating the possible existence of anti-kaonic bound states formed from, for example, two nucleons and one anti-kaon. Besides Fabbietti’s team, other groups all over the world were working on this topic, since a number of theoretical physicists had hypothesized that the attraction between nucleons and anti-kaons should be strong enough to give rise to such a bound state, at least for a short time. “I analyzed data from the High Acceptance Di-Electron Spectrometer (HADES) at GSI.


In particular, I looked for particles produced in p+p collisions that could originate from the decay of this anti-kaon–nucleon bound state,” explains Eliane. “It was a very controversial topic at the time, because there were groups that, analyzing a certain set of data, could see a signal compatible with the detection of such a bound state, while others couldn’t. I didn’t find any signal proving this hypothesis, but at the same time my results set an upper limit for the existence of this bound state at the beam energy of 3.5 GeV.”

“In order to set a limit,” Eliane continues, “you compare the result of your data analysis with the outcome of a simulation performed under the hypothesis that the signal you looked for but didn’t see actually exists. In other words, you develop a model for this case and study how much signal you can introduce while keeping consistency with your data. You proceed to add more and more signal strength to your model in little steps, until you reach a threshold: beyond it, the model no longer fits the data. This threshold is the upper limit.”
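A toy numerical version of the scan she describes might look as follows (the counts, the signal shape and the simple χ² consistency test are all illustrative assumptions; a real analysis uses a proper likelihood and confidence-level prescription):

#include <cstddef>
#include <iostream>
#include <vector>

// Toy signal-injection scan: add more and more signal to a fixed background
// model until the model is no longer consistent with the data.
int main() {
    const std::vector<double> data       = {100, 102,  98, 101,  99};  // observed counts
    const std::vector<double> background = {100, 100, 100, 100, 100};  // background model
    const std::vector<double> signal     = {0.0, 0.2, 1.0, 0.2, 0.0};  // assumed signal shape

    const double chi2Max = 11.07;  // ~95% point of a chi-square with 5 bins
    double upperLimit = 0.0;
    for (double s = 0.0; s < 100.0; s += 0.1) {   // signal strength, in small steps
        double chi2 = 0.0;
        for (std::size_t i = 0; i < data.size(); ++i) {
            const double expected = background[i] + s * signal[i];
            chi2 += (data[i] - expected) * (data[i] - expected) / expected;
        }
        if (chi2 > chi2Max) break;                // the model no longer fits the data
        upperLimit = s;
    }
    std::cout << "Upper limit on signal strength: " << upperLimit << "\n";
}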

    She also combined her results with data from other experiments and showed that it was very unlikely that the signal seen by some other groups could be due to an anti-kaon-nucleon bound state. “Actually, I think that this signal exists because there are many compelling reasons from our theory colleagues, but it is very challenging to see, first of all because the production cross section of this state is probably very small, which means that it occurs rarely, so we need to take a lot of data. In addition, it might be a very broad state, so we are not going to find a narrow peak. As a consequence, understanding the background well is essential.”

When she completed her PhD in 2014, she decided to change field. “In that situation, you have two possible choices,” explains Eliane. “Either you stay on the same topic and become an expert in a very specific field, or you change and broaden your horizons. In the second case, you do not become a specialist in one topic but rather increase your ‘portfolio’. I preferred to go for this second option and do something completely new. This way is much harder, because you basically start from the beginning, but I think it benefits a researcher in the long term to look at a field, in this case QCD, from many perspectives. I thus also encourage some young researchers to give low-energy QCD research a chance and see what people do beyond the TeV scale.” She therefore joined the research group led by John Harris and Helen Caines at Yale University, in New Haven (US), where she has been working for two and a half years now, and entered the ALICE collaboration.

Her present research activities focus on hard probes in high-energy collisions. “The proton is a very fascinating object; there is a lot going on in it,” Eliane comments. “When you scatter two protons at low energy (an energy range I previously worked in), you see how the ‘entire’ proton behaves; you are not able to resolve its internal structure. On the contrary, at the high energies of the LHC, when you collide two protons you start seeing what happens inside – you can observe how partons collide with each other.”

In these hard scattering events, particles with a high transverse momentum are present in the final state. Eliane is analyzing Pb-Pb events in which a parton and a photon (a gamma) are produced. Photons do not interact via the strong force; hence, when the Quark Gluon Plasma (QGP) is created in ALICE by smashing lead nuclei, a photon produced in the collision can traverse this medium and escape unaffected. In the opposite direction, a parton moves away from the collision vertex and fragments into a shower of particles. The sum of the momenta of the particles in this shower has to balance the momentum of the photon (combining these fragments with the gamma on the other side is called gamma-hadron correlation), and together they carry the total momentum of the mother parton.

The objective of this research is to measure the fragmentation function, which describes the correlation between the momentum of the mother parton and the momenta of the particles in the shower. Normally, most of the daughter particles carry a small fraction (less than 20%) of the momentum of the mother, whereas very few of them carry a high fraction of it. “By studying the behaviour of the particle shower in Pb-Pb collisions, in comparison with pp and p-Pb collisions, we can understand how the QGP medium modifies it,” explains Eliane. “We may have, for example, fewer of these very high momentum fragments and therefore more of the low momentum ones, or the shower might be broader. This study gives information about the properties of the medium that is created.”
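In the usual notation (standard definitions, not quoted from Eliane), each hadron in the shower carries a momentum fraction

z = \frac{p_{\mathrm{hadron}}}{p_{\mathrm{parton}}}, \qquad z_T \equiv \frac{p_T^{\mathrm{hadron}}}{p_T^{\gamma}},

and the fragmentation function D(z) is the distribution of these fractions. Because the photon balances the parton, the measurable ratio z_T serves as a proxy for z.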

Measurements of gamma-hadron correlations performed by PHENIX at RHIC (Brookhaven National Laboratory) at 200 GeV show that in gold-gold collisions the fragmentation function changes, giving fewer particles with high momentum fractions and many more particles with a low momentum fraction. ALICE is investigating what happens at higher energies.

Eliane is now working in collaboration with a graduate student at her institute and other colleagues in Berkeley. “We are performing a very complex analysis. In our events, we have to identify gammas on one side and the hadron showers on the other. But gammas can also be decay products of other particles, such as pions and other mesons. Thus, it is important to reject this background signal and take into consideration only events in which the gamma is produced at the primary vertex. This is not easy and requires a number of successive steps.”
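One standard ingredient of such a chain (a generic sketch with assumed numbers, not the group’s actual code) is vetoing photon candidates that, paired with another cluster, reconstruct the π0 mass:

#include <algorithm>
#include <cmath>
#include <vector>

// A photon candidate, described by its energy and momentum components (GeV).
struct Photon { double E, px, py, pz; };

// Invariant mass of a photon pair: m^2 = E^2 − |p|^2 for the summed four-vector.
double invariantMass(const Photon& a, const Photon& b) {
    const double E  = a.E  + b.E;
    const double px = a.px + b.px, py = a.py + b.py, pz = a.pz + b.pz;
    return std::sqrt(std::max(0.0, E * E - px * px - py * py - pz * pz));
}

// Flag a candidate as a likely pi0 decay photon if pairing it with any other
// cluster gives a mass near m(pi0) = 0.135 GeV; the ±20 MeV window is assumed.
bool looksLikePi0Daughter(const Photon& g, const std::vector<Photon>& others) {
    const double mPi0 = 0.135, window = 0.020;
    for (const auto& o : others)
        if (std::fabs(invariantMass(g, o) - mPi0) < window) return true;
    return false;
}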

    Eliane will continue working at Yale for some time. Then, she will either look for another post-doctoral position in ALICE or will directly apply for some grants, most likely in Europe. “There are various opportunities in Germany to get research funding to start your own research group.”

Even though she likes her present topic of analysis, in the future she might change to something more basic: the substructure and dynamics of the proton. “The proton is a very complex and fascinating object in its own right; we still do not know much about its internal dynamics,” she highlights. In any case, the most important thing for her is to settle on a research topic that will give her deeper insight into QCD properties — something she is very intrigued by.

In addition to doing data analysis, Eliane coordinates the activities of the EMCal calibration group and the EMCal photon object group and has lately been Run Manager for data taking. With so much work and a four-year-old daughter, there is not much time left. Nevertheless, when she can, she attends modern dance classes to de-stress and relax.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 11:45 am on October 17, 2017 Permalink | Reply
Tags: LZ-LUX-ZEPLIN experiment, Particle Physics

    From SURF: “LZ team installs detector in water tank” 


    October 16, 2017
    Constance Walter

Sally Shaw, a post-doc with the University of California Santa Barbara, poses next to the sodium iodide detector recently installed inside the water tank. Courtesy photo.

    The huge water tank that for four years housed the Large Underground Xenon (LUX) dark matter detector now stands empty. A small sign over the opening that reads, “Danger! Confined space,” bars physical entry, but a solitary note sung by Michael Gaylor, a science professor from Dakota State University, once jumped that barrier and reverberated for 35.4 seconds.

    Starting this week, the tank will be filled with the sounds of collaboration members installing a small detector that will be used to measure radioactivity in the cavern. It’s all part of the plan to build and install the much larger, second-generation dark matter detector, LUX-ZEPLIN (LZ).


“We need to pin down the background event rate to better shield our experiment,” said Sally Shaw, a postdoc from the University of California, Santa Barbara (UCSB).

    The detector, a 5-inch by 5-inch cylinder of sodium iodide, will be placed inside the water tank and surrounded by 8 inches of lead bricks. The crystal will be covered on all sides except one, which will be left bare to measure the gamma rays that are produced when things like thorium, uranium and potassium decay. Over the next two weeks, the team will change the position of the detector five times to determine the directionality of the gamma rays.

    Scott Haselschwardt, a graduate student at UCSB, said this is especially important because there is a rhyolite intrusion that runs below the tank and up the west wall of the cavern.

“This rock is more radioactive than other types of rock, so it can create more backgrounds,” he said. This wasn’t a problem for LUX, Haselschwardt said, because LUX was smaller than LZ and therefore surrounded by more ultra-pure water.

    But LZ is 10 times larger and still must fit inside the same tank, potentially exposing it to more of the radiation that naturally occurs within the rock cavern. And while this radiation is harmless to humans, it can wreak havoc on highly sensitive experiments like LZ.

    “Because it is so much closer to the edges of the water tank, there was a proposal to put in extra shielding—perhaps a lead ring at the bottom of the tank to shield the experiment,” Shaw said.

    Like its much smaller cousin, LZ hopes to find WIMPs, weakly interacting massive particles. Every component must be tested to ensure it is free of any backgrounds, including more than 500 photomultiplier tubes, the titanium for the cryostat and the liquid scintillator that will surround the xenon container. But if the backgrounds emanating from the walls of the cavern are too high, it won’t matter.

“The whole point is to see whether the lead needs to be used in the design of the shield,” said Umit Utku, a graduate student at University College London. “Maybe we will realize we don’t need it.”

    Shaw, who created a design for lead shielding within the tank, said it’s critical to fully understand the backgrounds now.

    “If we do need extra shielding, we must adjust the plans before installation of the experiment begins,” she said.

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

Another major experiment, the Long Baseline Neutrino Experiment (LBNE)—a collaboration with Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab—is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.


     
  • richardmitnick 8:09 pm on October 13, 2017 Permalink | Reply
Tags: Baby MIND, Particle Physics

    From CERN: “Baby MIND born at CERN now ready to move to Japan” 


    13 Oct 2017
    Stefania Pandolfi

Baby MIND under test on the T9 beamline at the Proton Synchrotron experimental hall in the East Area, summer 2017 (Image: Alain Blondel/University of Geneva)

Baby MIND, a member of the CERN Neutrino Platform family of neutrino detectors, is now ready to be shipped from CERN to Japan in four containers to start the experimental endeavour it was designed and built for. The containers are being loaded on 17 and 18 October and are scheduled to arrive by mid-December.

Baby MIND is a 75-tonne neutrino detector prototype for a Magnetised Iron Neutrino Detector (MIND). Its goal is to precisely identify and track the positively or negatively charged muons – the product of muon neutrinos from the T2K (Tokai to Kamioka) beam line interacting with matter in the WAGASCI neutrino detector in Japan.

    T2K map, T2K Experiment, Tokai to Kamioka, Japan

The more detailed the identification of the muons that cross the Baby MIND detector, the more we can learn about the original neutrinos, contributing to a more precise understanding of the neutrino oscillation phenomenon*.

    The journey of these muon neutrinos starts from the Japan Proton Accelerator Research Complex (J-PARC) in Tokai. They travel all the way to the Super-Kamiokande Detector in Kamioka, some 295 km away.

    Super-Kamiokande Detector, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

On their journey, the neutrinos pass through the near detector complex, located 280 m downstream of the neutrino production target at J-PARC, where the WAGASCI + Baby MIND suite of detectors sits. Baby MIND aims to measure the velocity and charge of muons produced by the neutrino interactions with matter in the WAGASCI detector. Precise tracking of the muons will help test our ability to reconstruct important characteristics of their parent neutrinos. This, in turn, is important because in studying muon neutrino oscillations on the journey from Tokai to Kamioka, it is crucial to know how strongly and how often the neutrinos interact with matter.

Born from prototyping activities launched within the AIDA project, and since its approval in December 2015 by the CERN Research Board, the Baby MIND collaboration – comprising CERN, the University of Geneva, the Institute for Nuclear Research in Moscow, and the Universities of Glasgow, Kyoto, Sofia, Tokyo, Uppsala and Valencia – has been busy designing, prototyping, constructing and testing this detector. The magnet construction phase, which lasted six months, was completed in mid-February 2017, two weeks ahead of schedule.

The fully assembled Baby MIND detector was tested on a beam line in the experimental zone of the Proton Synchrotron in the East Hall during summer 2017. These tests showed that the detector is working as expected and is, therefore, ready to go.

Baby MIND under test on the T9 beamline at the Proton Synchrotron experimental hall in the East Area, summer 2017 (Image: Alain Blondel/University of Geneva)

    *Neutrino oscillations

    Neutrinos are everywhere. Each second, several billion of these particles coming from the Sun, the Earth and our galaxy, pass through our bodies. And yet, they fly past unnoticed. Indeed, despite their cosmic abundance and ubiquity, neutrinos are extremely difficult to study because they hardly interact with matter. For this reason, they are among the least understood particles in the Standard Model (SM) of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

What we know is that they come in three types or ‘flavours’ – electron neutrino, muon neutrino and tau neutrino. From their first detection in 1956 until the late 1990s, neutrinos were thought to be massless, in line with the SM predictions. Then the Super-Kamiokande experiment in Japan, and a few years later the Sudbury Neutrino Observatory in Canada, independently demonstrated that neutrinos can change (oscillate) from one flavour to another spontaneously.

Sudbury Neutrino Observatory, no longer operating

This is only possible if neutrinos have masses, however small; the probability of changing flavour depends on the differences between the squared masses of the neutrinos and on the distance they travel relative to their energy. This ground-breaking discovery was recognised with the 2015 Nobel Prize in Physics.
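For two flavours this is captured by the standard oscillation formula (textbook form, not taken from the article):

P(\nu_\alpha \to \nu_\beta) = \sin^2(2\theta)\,\sin^2\!\left( 1.27\, \frac{\Delta m^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} \right),

so what experiments like T2K actually constrain is the squared-mass difference Δm² and the mixing angle θ, by choosing the baseline L and the neutrino energy E.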

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 5:03 pm on October 13, 2017 Permalink | Reply
Tags: Particle Physics

    From CERN Courier: “Birth of a symmetry” 



    Oct 13, 2017
    Frank Close

Model of Leptons

    Half a century ago, Steven Weinberg spent the summer at Cape Cod, working on a new theory of the strong interaction of pions.

Steven Weinberg

    By October 1967, the idea had morphed into a theory of the weak and electromagnetic interactions, and the following month he published a paper that would revolutionise our understanding of the fundamental forces.

Weinberg’s paper “A Model of Leptons”, published in Physical Review Letters (PRL) on 20 November 1967, determined the direction of high-energy particle physics through the final decades of the 20th century. Just two and a half pages long, it is one of the most highly cited papers in the history of theoretical physics. Its contents are the core of the Standard Model of particle physics, now almost half a century old and still passing every experimental test.

    Most particle physicists today have grown up with the Standard Model’s orderly account of the fundamental particles and interactions, but things were very different in the 1960s.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Quantum electrodynamics (QED) had been well established as the description of the electromagnetic interaction, but there were no mature theories of the strong and weak nuclear forces. By the 1960s, experimental discoveries showed that the weak force exhibits some common features with QED, in particular that it might be mediated by a vector boson analogous to the photon. Theoretical arguments also suggested that QED’s underlying “U(1)” group structure could be generalised to the larger group SU(2), but there was a serious problem with such a scheme: the W boson suspected to mediate the weak force would have to be very massive empirically, whereas the mathematical symmetry of the theory required it to be massless like the photon.

    The importance of symmetries in understanding the fundamental forces was already becoming clear at the time, in particular how nature might hide its symmetries. Could “hidden symmetry” lead to a massive W boson while preserving the mathematical consistency of the theory? It was arguably Weinberg’s developments, in 1967, that brought this concept to life.
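In modern notation (a compact hindsight summary, not how the 1967 paper was phrased), hidden symmetry does exactly that: with a Higgs field of vacuum expectation value v and gauge couplings g and g′, the physical boson masses come out as

m_W = \tfrac{1}{2} g v, \qquad m_Z = \tfrac{1}{2} \sqrt{g^2 + g'^2}\, v, \qquad m_\gamma = 0,

a massive W and Z alongside a massless photon, with the underlying gauge symmetry intact but hidden.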

    Strong inspiration

    Weinberg’s inspiration was an earlier idea of [Yoichiro] Nambu in which fermions – such as the proton or neutron – can behave like a left- or right-handed screw as they move. If mass is ignored, these two “chiral” states act independently and the theory leads to the existence of a particle with properties similar to those of the pion – specifically a pseudoscalar, which means that it has no spin and its wavefunction changes sign under mirror symmetry. Nambu’s original investigations, however, had not examined how the three versions of the pion, with positive, negative or zero charge, shared their common “pion-ness” when interacting with one another. This commonality, or symmetry, is mathematically expressed by the group SU(2), which had been known in nuclear physics since the 1930s and in mathematics for much longer.

    It was this symmetry that Weinberg used as his point of departure in building a theory of the strong force, where nucleons interact with pions of all charges and the proton and neutron themselves form two “faces” of the underlying SU(2) structure. Empirical observations of the interactions between pions and nucleons showed that the underlying symmetry of SU(2) tended to act on the left- or right-handed chiral possibilities independently. The mathematical structure of the resulting equations to describe this behaviour, as Weinberg discovered, is called SU(2)×SU(2).

Original manuscript

    However, in nature this symmetry is not perfect because nucleons have mass. Had they been massless, they would have travelled at the speed of light, the left- and right-handed possibilities acting truly independently of one another and the symmetry left intact. That nucleons have a mass, so that the left and right states get mixed up when perceived by observers in different inertial frames, breaks the chiral symmetry. Nambu had investigated this effect as far back as 1959, but without the added richness of the SU(2)×SU(2) mathematical structure that Weinberg brought to the problem. Weinberg had been investigating this more sophisticated theory in around 1965, initially with considerable success. He derived theorems that explained the observed interactions of pions and nucleons at low energies, such as in nuclear physics. He was able to predict how pions behaved when they scattered from one another and, with a few well-defined assumptions, paved the way for a whole theory of hadronic physics at low energies.
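
    In symbols (a standard textbook statement of the point, in modern notation rather than Weinberg’s own), the two chiralities of a nucleon field ψ are projected out by γ5, and a mass term is exactly the piece of the Lagrangian that couples them:

    \psi_{L,R} = \tfrac{1}{2}\,(1 \mp \gamma_5)\,\psi, \qquad m\,\bar\psi\psi = m\,(\bar\psi_L \psi_R + \bar\psi_R \psi_L).

    Once m ≠ 0, independent SU(2)_L and SU(2)_R rotations no longer leave the theory invariant, which is the sense in which the nucleon mass breaks the chiral SU(2)×SU(2) symmetry.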

    Meanwhile, in 1964, Brout and Englert, Higgs, Kibble, Guralnik and Hagen had demonstrated that the vector bosons of a Yang–Mills theory (one that is like QED but where attributes such as electric charge can be exchanged by the vector bosons themselves) put forward a decade earlier could become massive without spoiling the fundamental gauge symmetry. This “mass-generating mechanism” suggested that a complete Yang–Mills theory of the strong interaction might be possible. In addition to the well-known pion, examples of massive vector particles that feel the strong force had already been found, notably the rho-meson. Like the pion, this too occurs in three charged varieties: positive, negative and zero. Superficially these rho-mesons had the hallmarks of being the gauge bosons of the strong interactions, but they also have mass. Was the strong interaction the theatre for applying the mass-generating mechanism?

Despite at first seeming so promising, the idea failed to fit the data. For some phenomena the SU(2)×SU(2) symmetry is empirically broken, yet for others, where spin didn’t matter, it worked perfectly. When these patterns were incorporated into the maths, the rho-meson stubbornly remained massless, contrary to reality.

    Epiphany on the road

In the middle of September 1967, while driving his red Camaro to work at MIT, Weinberg realised that he had been applying the right ideas to the wrong problem. Instead of the strong interactions, for which the SU(2)×SU(2) idea refused to work, the massless photon and the hypothetical massive W boson of the electromagnetic and weak interactions fitted perfectly with this picture. To call this possibility “hypothetical” hardly does justice to the time: the W boson was not discovered until 1983, and in 1967 was so disregarded as to receive at best a passing mention, if any, in textbooks.

    Weinberg needed a concrete model to illustrate his general idea. The numerous strongly interacting hadrons that had been discovered in the 1950s and 1960s were, for him, a quagmire, so he restricted his attention to the electron and neutrino. Here too it is worth recalling the state of knowledge at the time. The constituent quark model with three flavours – up, down and strange – had been formulated in 1964, but was widely disregarded. The experiments at SLAC that would help establish these constituents were a year away from announcing their results, and Bjorken’s ideas of a quark model, articulated at conferences that summer, were not yet widely accepted either. Finally, with only three flavours of quark, Weinberg’s ideas would lead to empirically unwanted “strangeness-changing neutral currents”. All these problems would eventually be solved, but in 1967 Weinberg made a wise choice to focus on leptons and leave quarks well alone.

Proving validity

    Following the discovery of parity violation in the 1950s, it was clear that the electron can spin like a left- or right-handed screw, whereas the massless neutrino is only left-handed. The left–right symmetry, which had been a feature of the strong interaction, was gone. Instead of two SU(2), the mathematics now only needed one, the second being replaced by the unitary group U(1). So Weinberg set up the equations of SU(2)×U(1) – the same structure that, unknown to him, had been proposed by Sheldon Glashow in 1961 and by Abdus Salam and John Ward in 1964 in attempts to marry the electromagnetic and weak interactions. His theory, like theirs, required two massive electrically charged bosons – the W+ and W– carriers of the weak force – and two neutral bosons: the massless photon and a massive Z0. If correct, it would show that the electromagnetic and weak forces are unified, taking physics a step closer to the goal of a single theory of all fundamental interactions.
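
    In today’s textbook notation (not the symbols of the original papers), the two neutral gauge fields of SU(2)×U(1) mix through the weak angle θW to give the physical photon and Z0:

    A_\mu = \cos\theta_W\, B_\mu + \sin\theta_W\, W^3_\mu \quad (\text{massless photon}), \qquad Z_\mu = -\sin\theta_W\, B_\mu + \cos\theta_W\, W^3_\mu \quad (\text{massive } Z^0),

    with the charged combinations W^\pm_\mu = (W^1_\mu \mp i\,W^2_\mu)/\sqrt{2} supplying the carriers of the charged weak current.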

    “The history of attempts to unify weak and electromagnetic interactions is very long, and will not be reviewed here.” So began the first footnote in Steven Weinberg’s seminal November 1967 paper, which led to him being awarded the 1979 Nobel Prize in Physics with Salam and Glashow. Weinberg’s footnote mentioned Fermi’s primitive idea for unification in 1934, and also the model that Glashow proposed in 1961.

    Clarity of thought

Weinberg started his paper by articulating the challenge of unifying the weak and electromagnetic forces as both an opportunity and a threat. He focused on the leptons – those fermions, such as the electron and neutrino, which do not feel the strong force. “Leptons interact only with photons, and with the [weak] bosons that presumably mediate weak interactions. What could be more natural than to unite these spin-one bosons [the photon and the weak bosons] into a multiplet,” he pondered. That was the opportunity. The threat was that “standing in the way of this synthesis are the obvious differences in the masses of the photon and [weak] boson.”

    Weinberg then suggests a solution: perhaps “the symmetries relating the weak and electromagnetic interactions are exact [at a fundamental level] but are [hidden in practice]”. He then draws attention to the ideas of Higgs, Brout, Englert, Guralnik, Hagen and Kibble, and uses these to give masses to the W and Z in his model. In a further important insight, Weinberg shows how this symmetry-breaking mechanism leaves the photon massless.
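
    In modern conventions (again, not Weinberg’s original symbols), the symmetry-breaking step gives the gauge bosons their masses through the vacuum expectation value v of the scalar doublet, while the photon is kept massless by the unbroken electromagnetic U(1):

    m_W = \tfrac{1}{2}\, g\, v, \qquad m_Z = \tfrac{1}{2}\,\sqrt{g^2 + g'^2}\;v, \qquad m_\gamma = 0,

    where g and g′ are the SU(2) and U(1) couplings and v ≈ 246 GeV. Combined with the measured strength of the weak interaction, these relations imply lower bounds of roughly 37 GeV on the W mass and 75 GeV on the Z mass, comfortably consistent with the masses eventually measured in 1983.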

His opening paragraph ended with the prescient observation that: “The model may be renormalisable.” The argument upon which this remark is based appears at the very end of the paper, although with somewhat less confidence than the opening had promised. He begins the final paragraph with a question: “Is this model renormalisable?” The extent of his intuition is revealed in his argument: although the presence of a massive vector boson had hitherto been a scourge, the theory with which he had begun had no such mass and, as such, was “probably renormalisable”. So, he pondered: “The question is whether this renormalisability is lost [by the spontaneous breaking of the symmetry].” And the conclusion: “If this model is renormalisable, what happens when we extend it…to the hadrons?”

    By speculating that his model may be renormalisable, Weinberg was hugely prescient, as ’t Hooft and Veltman would prove four years later. And perhaps it was a chance encounter at the Solvay Congress in Belgium two weeks before his paper was submitted that helped convince Weinberg that he was on the right track.

    Solvay secrets

    By the end of September 1967, Weinberg had his ideas in place as he set off to Belgium to attend the 14th Solvay Congress on Fundamental Problems in Elementary Particle Physics, held in Brussels from 2 to 7 October. He did not speak about his forthcoming paper, but did make some remarks after other talks, in particular following a presentation by Hans Peter Durr about a theorem of Jeffrey Goldstone and spontaneous symmetry breaking. During a general discussion session following Durr’s talk, Weinberg mused: “This raises a question I can’t answer: are such models renormalisable?” He continued with a similar argument to that which later appeared in his paper, ending with: “I hope someone will be able to find out whether or not [this] is a renormalisable theory of weak and electromagnetic interactions.”

    There was remarkably little reaction to Weinberg’s remarks, and he himself has recalled “a general lack of interest”. The only recorded statement came from François Englert, who insisted that the theory is renormalisable; then, remarkably, there is no further discussion. Englert and Robert Brout, then relatively junior scientists, had both attended the same Brussels meeting.

Nobel prize

    At some point during the Solvay conference, Weinberg presented a hand-written draft of his paper to Durr, and 40 years later I obtained a copy by a roundabout route. Weinberg himself had not seen it in all that time, and thought that all record of his Nobel-winning manuscript had been lost. The original manuscript is notable for there being no sign of second thoughts, or editing, which suggests that it was a provisional final draft of an idea that had been worked through in the preceding days. The only hint of modification after the first draft had been written is a memo squeezed in at the end of a reference to Higgs, to include references to Brout and Englert, and to Guralnik, Hagen and Kibble, for the idea of spontaneous symmetry breaking, on which the paper was based. Weinberg’s intuition about the renormalisability of the model is already present in this manuscript, and is identical to what appears in his PRL paper. There is no mention of Glashow’s SU(2)×U(1) model in the draft, but this is included in the version that was published in PRL the following month. This is the only substantial difference. This manuscript was submitted to the editors of PRL on Weinberg’s return to the US, and received by them on 17 October. It appeared in print on 20 November.

    Lasting impact

Weinberg’s genius was to assemble the various pieces of a jigsaw and display the whole picture. The basic idea of mass generation was due to the assorted theorists mentioned above, in the summer of 1964. However, a crucial feature of Weinberg’s model was the trick of being able to give masses to the W and Z while leaving the photon massless. This extension of the mass-generating mechanism was due to Tom Kibble, in 1967, which Weinberg recognises and credits.

As was the case with his comments in Brussels the previous month, Weinberg’s paper appeared in November 1967 to a deafening silence. “Rarely has so great an accomplishment been so widely ignored,” wrote Sidney Coleman in Science in 1979. Today, Weinberg’s paper has been cited more than 10,000 times. Cited just twice in the four years from 1967 to 1971, it then became so important that researchers have cited it, on average, three times every week for half a century. There is no parallel for this in the history of particle physics. The reason is that in 1971 an event took place that has defined the direction of the field ever since: Gerard ’t Hooft made his debut, and he and Martinus Veltman demonstrated the renormalisability of spontaneously broken Yang–Mills theories. A decade later the W and Z bosons were discovered by experiments at CERN’s Super Proton Synchrotron.

    CERN Super Proton Synchrotron

    A further 30 years were to pass before the discovery of the Higgs boson at the Large Hadron Collider completed the electroweak menu.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    And in the meantime, completing the Standard Model, quantum chromodynamics was established as the theory of the strong interactions, based on the group SU(3).

    This episode in particle physics is not only one of the seminal breakthroughs in our understanding of the physical world, but touches on the profound link between mathematics and nature. On one hand it shows how it is easier to be Beethoven or Shakespeare than to be Steven Weinberg: change a few notes in a symphony or a phrase in a play, and you can still have a wonderful work of art; change a few symbols in Weinberg’s equations and the edifice falls apart – for if nature does not read your creation, however beautiful it might be, its use for science is diminished. Like all great theorists, Weinberg revealed a new aspect of reality by writing symbols on a sheet of paper and manipulating them according to the logic of mathematics. It took decades of technological progress to enable the discoveries of W and Higgs bosons and other entities that were already “known” to mathematics 50 years ago.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 4:16 pm on October 13, 2017 Permalink | Reply
    Tags: , , CLEAR, , , Particle Physics   

    From CERN Courier: “CLEAR prospects for accelerator research” 


    CERN Courier

    Oct 13, 2017

CLEAR’s plasma-lens experiment will test ways to drive strong currents through a plasma for particle-beam transverse focusing.
Image credit: M Volpi.

    A new user facility for accelerator R&D, the CERN Linear Electron Accelerator for Research (CLEAR), started operation in August and is ready to provide beam for experiments. CLEAR evolved from the former CTF3 test facility for the Compact Linear Collider (CLIC), which ended a successful programme in December 2016. Following approval of the CLEAR proposal, the necessary hardware modifications started in January and the facility is now able to host and test a broad range of ideas in the accelerator field.

CLEAR’s primary goal is to enhance and complement the existing accelerator R&D programme at CERN, as well as to offer a training infrastructure for future accelerator physicists and engineers. The focus is on general accelerator R&D and component studies for existing and possible future accelerator applications. This includes studies of high-gradient acceleration methods, such as CLIC X-band and plasma technologies, as well as prototyping and validation of accelerator components for the high-luminosity LHC upgrade.

The scientific programme for 2017 includes: a combined test of critical CLIC technologies, continuing previous tests performed at CTF3; measurements of radiation effects on electronic components destined for space missions in a Jovian environment, along with dosimetry tests aimed at medical applications; beam-instrumentation R&D; and the use of plasma for beam focusing. Further experiments, such as those exploring THz radiation for accelerator applications and direct impedance measurements of equipment to be installed in CERN accelerators, are also planned.

    The experimental programme for 2018 and beyond is still open to new and challenging proposals. An international scientific committee is currently being formed to prioritise proposals, and a user request form is available at the CLEAR website: http://clear.web.cern.ch/.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:53 pm on October 12, 2017 Permalink | Reply
    Tags: , , , , , Particle Physics, , Xenon is a heavy noble gas that exists in trace quantities in the air, Xenon takes a turn in the LHC   

    From Symmetry: “Xenon takes a turn in the LHC” 

    Symmetry Mag
    Symmetry

    10/12/17
    Sarah Charley

For the first time, the Large Hadron Collider is accelerating xenon nuclei for experiments.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Most of the year, the Large Hadron Collider at CERN collides protons. LHC scientists have also accelerated lead nuclei stripped of their electrons. Today, for just about eight hours, they are experimenting with a different kind of nucleus: xenon.

Xenon is a heavy noble gas that exists in trace quantities in the air. Xenon nuclei (129 nucleons, against 208 for the lead isotope used at the LHC) are about 40 percent lighter than lead nuclei, so xenon-xenon collisions have a different geometry and energy distribution than lead-lead collisions.

    “When two high-energy nuclei collide, they can momentarily form a droplet of quark gluon plasma, the primordial matter that filled our universe just after the big bang,” says Peter Steinberg, a physicist at the US Department of Energy’s Brookhaven National Laboratory and a heavy-ion coordinator for the ATLAS experiment at CERN. “The shape of the colliding nuclei influences the initial shape of this droplet, which in turn influences how the plasma flows and finally shows up in the angles of the particles we measure. We’re hoping that these smaller droplets from xenon-xenon collisions give us deeper insight into how this still-mysterious process works at truly subatomic length scales.”

    Not all particles that travel through CERN’s long chain of interconnected accelerators wind up in the LHC. Earlier this year, scientists were loading xenon ions into the accelerator and firing them at a fixed-target experiment instead.

    “We can have particles from two different sources feeding into CERN’s accelerator complex,” says Michaela Schaumann, a physicist in LHC operation working on the heavy-ion program. “The LHC’s injectors are so flexible that, once everything is set up properly, they can alternate between accelerating protons and accelerating ions a few times a minute.”

Having the xenon beam already available provided an opportunity to send xenon into the LHC for the first (and potentially only) time. It took some serious additional work to bring the beam quality up to collider levels, Schaumann says, but today it was ready to go.

    “We are keeping the intensities very low in order to fulfil machine protection requirements and be able to use the same accelerator configuration we apply during the proton-proton runs with xenon beams,” Schaumann says. “We needed to adjust the frequency of the accelerator cavities [because more massive xenon ions circulate more slowly than protons], but many of the other machine settings stayed roughly the same.”
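
    The size of that frequency adjustment follows from simple relativistic kinematics. Here is a back-of-envelope sketch in Python, assuming round numbers (26,659 m circumference, harmonic number 35,640, injection at 450 GeV per unit charge, fully stripped 129Xe54+ with mass about 129 × 0.9315 GeV) rather than official machine settings:

    import math

    C = 26_659.0        # ring circumference in metres (approximate)
    c = 299_792_458.0   # speed of light in m/s
    h = 35_640          # RF harmonic number (assumed)

    def revolution_frequency(mass_gev, charge):
        # Same dipole field means same magnetic rigidity, so momentum scales with charge.
        p = 450.0 * charge                # momentum in GeV at injection
        energy = math.hypot(p, mass_gev)  # total energy in GeV
        beta = p / energy                 # velocity as a fraction of c
        return beta * c / C               # revolutions per second

    f_proton = revolution_frequency(0.938, 1)
    f_xenon = revolution_frequency(129 * 0.9315, 54)

    print(f"proton: {f_proton:.2f} Hz, xenon: {f_xenon:.2f} Hz")
    print(f"RF retune: about {h * (f_proton - f_xenon):.0f} Hz lower")

    With these assumed numbers the xenon beam circulates about 0.1 Hz slower per turn, which, multiplied by the harmonic number, is a downshift of roughly 4 kHz on the roughly 400 MHz RF system: a change of one part in 100,000, small but essential to keep the cavities synchronised with the slower ions.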

    This novel run tests scientists’ knowledge of beam physics and shows the flexibility of the LHC. Scientists say they are hopeful it could reveal something new.

    “We can learn a lot about the properties of the hot, dense matter from smaller collision systems,” Steinberg says. “They are a valuable bridge to connect what we observe in lead-lead collisions to strikingly similar observations in proton-proton interactions.”

The LHC screen during the xenon-ion run. (Image: CERN)

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 8:58 pm on October 12, 2017 Permalink | Reply
    Tags: "The spin property of Majoranas distinguishes them from other types of quasi-particles that emerge in materials", An elusive particle notable for behaving simultaneously like matter and antimatter, , , Particle Physics, ,   

    From Research at Princeton Blog: “Spotting the spin of the Majorana fermion under the microscope” 

    Princeton University
    Research at Princeton Blog

    October 12, 2017
    Catherine Zandonella, Office of the Dean for Research

The figure shows a schematic of the experiment. A magnetized scanning tunneling microscope tip was used to probe the spin property of the quantum wave function of the Majorana fermion at the end of a chain of iron atoms on the surface of a superconductor made of lead. Image courtesy of Yazdani Lab, Princeton University.

    Researchers at Princeton University have detected a unique quantum property of an elusive particle notable for behaving simultaneously like matter and antimatter. The particle, known as the Majorana fermion, is prized by researchers for its potential to open the doors to new quantum computing possibilities.

    In the study published this week in the journal Science, the research team described how they enhanced an existing imaging technique, called scanning tunneling microscopy, to capture signals from the Majorana particle at both ends of an atomically thin iron wire stretched on the surface of a crystal of lead. Their method involved detecting a distinctive quantum property known as spin, which has been proposed for transmitting quantum information in circuits that contain the Majorana particle.

    “The spin property of Majoranas distinguishes them from other types of quasi-particles that emerge in materials,” said Ali Yazdani, Princeton’s Class of 1909 Professor of Physics. “The experimental detection of this property provides a unique signature of this exotic particle.”

    The finding builds on the team’s 2014 discovery, also published in Science, of the Majorana fermion in a single atom-wide chain of iron atoms atop a lead substrate. In that study, the scanning tunneling microscope was used to visualize Majoranas for the first time, but provided no other measurements of their properties.

“Our aim has been to probe some of the specific quantum properties of Majoranas. Such experiments provide not only further confirmation of their existence in our chains, but open up possible ways of using them,” Yazdani said.

    First theorized in the late 1930s by the Italian physicist Ettore Majorana, the particle is fascinating because it acts as its own antiparticle. In the last few years, scientists have realized that they can engineer one-dimensional wires, such as the chains of atoms on the superconducting surface in the current study, to make Majorana fermions emerge in solids. In these wires, Majoranas occur as pairs at either end of the chains, provided the chains are long enough for the Majoranas to stay far enough apart that they do not annihilate each other. In a quantum computing system, information could be simultaneously stored at both ends of the wire, providing a robustness against outside disruptions to the inherently fragile quantum states.
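
    A minimal way to see this end-localised pair numerically is the Kitaev chain, a standard toy model of a one-dimensional topological superconductor (not the iron-on-lead platform of the experiment itself). The sketch below, with assumed parameters t = Δ = 1 and μ = 0.5, builds the Bogoliubov–de Gennes matrix and shows two eigenvalues exponentially close to zero whose wavefunctions concentrate at the two ends of the chain:

    import numpy as np

    # Toy Kitaev chain: N sites, hopping t, p-wave pairing delta, chemical
    # potential mu. For |mu| < 2t the chain is topological and hosts one
    # Majorana mode at each end.
    N, t, delta, mu = 40, 1.0, 1.0, 0.5

    H = np.zeros((2 * N, 2 * N))  # Bogoliubov-de Gennes matrix, (particle|hole) blocks
    for i in range(N):
        H[i, i] = -mu             # particle-block on-site term
        H[N + i, N + i] = mu      # hole block is minus the transpose
    for i in range(N - 1):
        H[i, i + 1] = H[i + 1, i] = -t              # hopping (particle block)
        H[N + i, N + i + 1] = H[N + i + 1, N + i] = t
        H[i, N + i + 1] = H[N + i + 1, i] = delta   # antisymmetric pairing
        H[i + 1, N + i] = H[N + i, i + 1] = -delta

    energies, states = np.linalg.eigh(H)
    lowest = np.argsort(np.abs(energies))[:2]
    print("two smallest |E|:", np.abs(energies[lowest]))  # essentially zero

    for k in lowest:
        weight = states[:N, k] ** 2 + states[N:, k] ** 2  # probability per site
        print("weight on the three outermost sites of each end:",
              weight[:3].sum() + weight[-3:].sum())

    Making the chain shorter, or pushing μ beyond 2t, splits the two levels away from zero energy: the numerical counterpart of the requirement that the chains be long enough for the end Majoranas not to annihilate each other.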

Previous experimental efforts to detect Majoranas have used the fact that they behave as both particle and antiparticle. The telltale signature is called a zero-bias peak in a quantum tunneling measurement. But studies have shown that such signals could also occur due to a pair of ordinary quasiparticles that can emerge in superconductors. Professor of Physics Andrei Bernevig and his team, who with Yazdani’s group proposed the atomic-chain platform, developed the theory showing that spin-polarized measurements made using a scanning tunneling microscope can distinguish between the presence of a pair of ordinary quasiparticles and a Majorana.

    Typically, scanning tunneling microscopy (STM) involves dragging a fine-tipped electrode over a structure, in this case the chain of iron atoms, and detecting its electronic properties, from which an image can be constructed. To perform spin-sensitive measurements, the researchers create electrodes that are magnetized in different orientations. These “spin-polarized” STM measurements revealed signatures that agree with the theoretical calculations by Bernevig and his team.
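
    For orientation, the usual description of spin-polarized STM (a general textbook relation, not the specific analysis of the Princeton paper) writes the tunnelling current as a spin-averaged part plus a term that depends on the relative alignment of the tip and sample polarizations:

    I(\mathbf{r}, V) \;\propto\; \rho_t\,\rho_s(\mathbf{r}, E_F + eV)\,\bigl(1 + P_t P_s \cos\theta\bigr),

    where P_t and P_s are the tip and sample spin polarizations and θ is the angle between them. Comparing maps taken with tips magnetized along different directions isolates the spin-dependent term, which is what makes a spin signature like the Majorana’s visible.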

    “It turns out that, unlike in the case of a conventional quasi-particle, the spin of the Majorana cannot be screened out by the background. In this sense it is a litmus test for the presence of the Majorana state,” Bernevig said.

The quantum spin property of Majoranas may also make them more useful for applications in quantum information. For example, wires with Majoranas at either end can be used to transfer information between distant quantum bits that rely on the spin of electrons. Entanglement of the spins of electrons and Majoranas may be the next step in harnessing their properties for quantum information transfer.

    The STM studies were conducted by three co-first authors in the Yazdani group: scientist Sangjun Jeon, graduate student Yonglong Xie, and former postdoctoral research associate Jian Li (now a professor at Westlake University in Hangzhou, China). The research also included contributions from postdoctoral research associate Zhijun Wang in Bernevig’s group.

This work has been supported by the Gordon and Betty Moore Foundation as part of the EPiQS initiative (grant GBMF4530), the U.S. Office of Naval Research (grants ONR-N00014-14-1-0330, ONR-N00014-11-1-0635 and ONR-N00014-13-1-0661), the National Science Foundation through the NSF-MRSEC program (grants DMR-142054 and DMR-1608848) and an EAGER Award (grant NOA-AWD1004957), the U.S. Army Research Office MURI program (grant W911NF-12-1-046), the U.S. Department of Energy Office of Basic Energy Sciences, the Simons Foundation, the David and Lucile Packard Foundation, and the Eric and Wendy Schmidt Transformative Technology Fund at Princeton.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    Princeton University Campus

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield

     