Tagged: Accelerator Science

  • richardmitnick 7:24 am on August 27, 2016 Permalink | Reply
    Tags: Accelerator Science

    From Symmetry: “Winners declared in SUSY bet” 


    08/26/16
    Kathryn Jepsen

    Peter Munch Andersen

    Physicists exchanged cognac in Copenhagen at the conclusion of a bet about supersymmetry and the LHC.

    As a general rule, theorist Nima Arkani-Hamed does not get involved in physics bets.

    “Theoretical physicists like to take bets on all kinds of things,” he says. “I’ve always taken the moral high ground… Nature decides. We’re all in pursuit of the truth. We’re all on the same side.”

    But sometimes you’re in Copenhagen for a conference, and you’re sitting in a delightfully unusual restaurant—one that sort of reminds you of a cave—and a fellow physicist gives you the opportunity to get in on a decade-old wager about supersymmetry and the Large Hadron Collider. Sometimes then, you decide to bend your rule. “It was just such a jovial atmosphere, I figured, why not?”

    That’s how Arkani-Hamed found himself back in Copenhagen this week, passing a 1000-Krone bottle of cognac to one of the winners of the bet, Director of the Niels Bohr International Academy Poul Damgaard.

    Arkani-Hamed had wagered that experiments at the LHC would find evidence of supersymmetry by the arbitrary date of June 16, 2016. Supersymmetry, SUSY for short, is a theory that predicts the existence of partner particles for the members of the Standard Model of particle physics.

    Standard model of Supersymmetry (Image: DESY)

    The deadline was not met. But in a talk at the Niels Bohr Institute, Arkani-Hamed pointed out that the end of the gamble does not equal the end of the theory.

    “I was not a good student in school,” Arkani-Hamed explained. “One of my big problems was not getting homework done on time. It was a constant battle with my teachers… Just give me another week! It’s kind of like the bet.”

    He pointed out that so far the LHC has gathered just 1 percent of the total amount of data it aims to collect.

    With that data, scientists can indeed rule out the most vanilla form of supersymmetry. But that’s not the version of supersymmetry Arkani-Hamed would expect the LHC to find anyway, he said.

    It is still possible LHC experiments will find evidence of other SUSY models—including the one Arkani-Hamed prefers, called split SUSY, which adds superpartners to just half of the Standard Model’s particles. And if LHC scientists don’t find evidence of SUSY, Arkani-Hamed pointed out, the theoretical problems it aimed to solve will remain an exciting challenge for the next generation of theorists to figure out.

    “I think Winston Churchill said that in victory you should be magnanimous,” Damgaard said after Arkani-Hamed’s talk. “I know also he said that in defeat you should be defiant. And that’s certainly Nima.”

    Arkani-Hamed shrugged. But it turned out he was not the only optimist in the room. Panelist Yonit Hochberg of the University of California, Berkeley conducted an informal poll of attendees. She found that the majority still think that in the next 20 years, as data continues to accumulate, experiments at the LHC will discover something new.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:34 pm on August 24, 2016 Permalink | Reply
    Tags: Accelerator Science

    From CERN: “LHC pushes limits of performance” 

    CERN

    19 Aug 2016
    Harriet Kim Jarlett


    The Large Hadron Collider’s (LHC) performance continues to surpass expectations: this week the machine achieved 2220 proton bunches in each of its counter-rotating beams – the most it will reach this year.

    This is not the maximum the machine is capable of holding (at full intensity each beam will carry nearly 2800 bunches); the bunch count is currently limited by a technical issue in the Super Proton Synchrotron (SPS).

    CERN Super Proton Synchrotron

    “Performance is excellent, given this limitation,” says Mike Lamont, head of the Operations team. “We’re 10% above design luminosity (which we surpassed in June), we have these really long fills (where the beam is circulating for up to 20 hours or so) and very good collision rates. 2220 bunches is just us squeezing as much in as we can, given the restrictions, to maximize delivery to the experiments.”

    As an example of the machine’s excellent performance: with almost two months left in this year’s run, the LHC has already reached an integrated luminosity of 22 fb-1 – very close to the 2016 goal of 25 fb-1 (up from 4 fb-1 last year).

    Luminosity is an essential indicator of an accelerator’s performance: it measures the potential number of collisions that can occur in a given amount of time, and integrated luminosity (measured in inverse femtobarns, fb-1) is the accumulated number of potential collisions. At its peak, the LHC’s proton-proton collision rate reaches about 1 billion collisions per second, giving a chance that even the rarest processes at the highest energies could occur.
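    The arithmetic linking these figures can be checked directly: the number of interactions is the integrated luminosity times the interaction cross-section, N = L_int × σ. A minimal sketch in Python, assuming an inelastic proton-proton cross-section of roughly 80 mb (a typical LHC-energy value, not stated in the article):

```python
# N = L_int * sigma. Integrated luminosity is quoted in inverse femtobarns;
# 1 fb^-1 = 1e12 mb^-1, so a cross-section given in millibarns converts directly.
def collisions(lumi_fb: float, sigma_mb: float = 80.0) -> float:
    """Approximate number of pp interactions for a given integrated luminosity."""
    return lumi_fb * 1e12 * sigma_mb

# The 22 fb^-1 delivered so far corresponds to roughly 1.8e15 interactions.
print(f"{collisions(22):.2e}")
```

    For comparison, 15 fb-1 under the same assumption gives about 1.2e15 interactions, matching CERN’s own description of 15 fb-1 as “around a million billion collisions” later in this page.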

    The SPS is currently experiencing a small fault that could be exacerbated by high beam intensity – hence the number of proton bunches sent to the LHC per injection is limited to 96, compared to the normal 288.

    “Once this issue is fixed in the coming year-end technical stop, we’ll be able to push up the number of bunches even further. Next year we should be able to go to new record levels,” says Lamont with a wry grin.

    See the full article here.


    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 10:01 am on August 17, 2016 Permalink | Reply
    Tags: Accelerator Science, CERN: Facts & Figures

    From CERN: “Facts & Figures” 

    CERN

    The Large Hadron Collider (LHC) is the most powerful particle accelerator ever built. The accelerator sits in a tunnel 100 metres underground at CERN, the European Organization for Nuclear Research, on the Franco-Swiss border near Geneva, Switzerland.

    What is the LHC?

    The LHC is a particle accelerator that pushes protons or ions to near the speed of light. It consists of a 27-kilometre ring of superconducting magnets with a number of accelerating structures that boost the energy of the particles along the way.

    Why is it called the “Large Hadron Collider”?

    “Large” refers to its size, approximately 27km in circumference
    “Hadron” because it accelerates protons or ions, which belong to the group of particles called hadrons
    “Collider” because the particles form two beams travelling in opposite directions, which are made to collide at four points around the machine

    How does the LHC work?

    The CERN accelerator complex is a succession of machines with increasingly higher energies. Each machine accelerates a beam of particles to a given energy before injecting the beam into the next machine in the chain. This next machine brings the beam to an even higher energy and so on. The LHC is the last element of this chain, in which the beams reach their highest energies.
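    The chain described above can be made concrete with the nominal proton energy of each injector. The Run 2 values below are added here for illustration and are not part of the original text:

```python
# Nominal per-machine proton beam energies in GeV (Run 2 era, for illustration).
# Each machine accelerates the beam to its top energy, then injects into the next.
chain = [
    ("Linac 2", 0.05),                          # 50 MeV
    ("PS Booster", 1.4),
    ("Proton Synchrotron (PS)", 25.0),
    ("Super Proton Synchrotron (SPS)", 450.0),
    ("LHC", 6500.0),                            # 6.5 TeV per beam in Run 2
]
for (name, e), (_, e_next) in zip(chain, chain[1:]):
    print(f"{name:32s} {e:8.2f} GeV  -> next machine boosts ~x{e_next / e:.0f}")
```

    Each stage raises the energy by roughly one to two orders of magnitude before handing the beam on.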

    The CERN accelerator complex (Image: CERN)

    Inside the LHC, two particle beams travel at close to the speed of light before they are made to collide. The beams travel in opposite directions in separate beam pipes – two tubes kept at ultrahigh vacuum. They are guided around the accelerator ring by a strong magnetic field maintained by superconducting electromagnets. Below a certain characteristic temperature, some materials enter a superconducting state and offer no resistance to the passage of electrical current. The electromagnets in the LHC are therefore chilled to ‑271.3°C (1.9K) – a temperature colder than outer space – to take advantage of this effect. The accelerator is connected to a vast distribution system of liquid helium, which cools the magnets, as well as to other supply services.

    What are the main goals of the LHC?

    The Standard Model of particle physics – a theory developed in the early 1970s that describes the fundamental particles and their interactions – has precisely predicted a wide variety of phenomena and so far successfully explained almost all experimental results in particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But the Standard Model is incomplete. It leaves many questions open, which the LHC will help to answer.

    What is the origin of mass? The Standard Model does not explain the origins of mass, nor why some particles are very heavy while others have no mass at all. However, theorists Robert Brout, François Englert and Peter Higgs made a proposal that was to solve this problem. The Brout-Englert-Higgs mechanism gives a mass to particles when they interact with an invisible field, now called the “Higgs field”, which pervades the universe.
    Particles that interact intensely with the Higgs field are heavy, while those that have feeble interactions are light. In the late 1980s, physicists started the search for the Higgs boson, the particle associated with the Higgs field. In July 2012, CERN announced the discovery of the Higgs boson, which confirmed the Brout-Englert-Higgs mechanism.

    CERN ATLAS Higgs Event

    CERN CMS Higgs Event

    However, finding it is not the end of the story, and researchers have to study the Higgs boson in detail to measure its properties and pin down its rarer decays.

    Will we discover evidence for supersymmetry? The Standard Model does not offer a unified description of all the fundamental forces, as it remains difficult to construct a theory of gravity similar to those for the other forces. Supersymmetry – a theory that hypothesises the existence of more massive partners of the standard particles we know – could facilitate the unification of fundamental forces.

    Standard model of Supersymmetry DESY

    What are dark matter and dark energy? The matter we know, which makes up all stars and galaxies, accounts for only 4% of the content of the universe. The search is still on for the particles or phenomena responsible for dark matter (23%) and dark energy (73%).

    Why is there far more matter than antimatter in the universe? Matter and antimatter must have been produced in the same amounts at the time of the Big Bang, but from what we have observed so far, our Universe is made only of matter.

    How does the quark-gluon plasma give rise to the particles that constitute the matter of our Universe?

    Quark gluon plasma (Image: Duke University)

    For part of each year, the LHC provides collisions between lead ions, recreating conditions similar to those just after the Big Bang. When heavy ions collide at high energies they form for an instant the quark-gluon plasma, a “fireball” of hot and dense matter that can be studied by the experiments.

    How was the LHC designed?

    Scientists started thinking about the LHC in the early 1980s, when the previous accelerator, the Large Electron-Positron collider (LEP), was not yet running. In December 1994, the CERN Council voted to approve the construction of the LHC, and in October 1995 the LHC technical design report was published.

    Contributions from Japan, the USA, India and other non-Member States accelerated the process and between 1996 and 1998, four experiments (ALICE, ATLAS, CMS and LHCb) received official approval and construction work started on the four sites.


    What are the detectors at the LHC?

    There are seven experiments installed at the LHC: ALICE, ATLAS, CMS, LHCb, LHCf, TOTEM and MoEDAL. They use detectors to analyse the myriad of particles produced by collisions in the accelerator. These experiments are run by collaborations of scientists from institutes all over the world. Each experiment is distinct, and characterized by its detectors.

    What is the data flow from the LHC experiments?

    The CERN Data Centre stores more than 30 petabytes of data per year from the LHC experiments, enough to fill about 1.2 million Blu-ray discs, i.e. 250 years of HD video. Over 100 petabytes of data are permanently archived, on tape.
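    The disc comparison follows from simple division; a quick check, assuming the 25 GB capacity of a single-layer Blu-ray disc:

```python
PETABYTE = 1e15  # bytes (decimal convention)
BLURAY = 25e9    # single-layer Blu-ray capacity in bytes (assumed)

# 30 PB of new LHC data per year, expressed as equivalent discs.
discs = 30 * PETABYTE / BLURAY
print(f"{discs:,.0f} Blu-ray discs per year")  # 1,200,000
```

    The same arithmetic puts the 100 PB tape archive at about 4 million discs.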

    Costs for Run 1
    Exploitation costs of the LHC when running (direct and indirect costs) represent about 80% of the CERN annual budget for operation, maintenance, technical stops, repairs and consolidation work in personnel and materials (for machine, injectors, computing, experiments).
    The directly allocated resources for the years 2009-2012 were about 1.1 billion CHF.

    Costs for LS1
    The cost of the Long Shutdown 1 (22 months) is estimated at 150 Million CHF. The maintenance and upgrade works represent about 100 MCHF for the LHC and 50 MCHF for the accelerator complex without the LHC.

    What is the LHC power consumption?

    The total power consumption of the LHC (and experiments) is equivalent to 600 GWh per year, with a maximum of 650 GWh in 2012 when the LHC was running at 4 TeV. For Run 2, the estimated power consumption is 750 GWh per year.
    The total CERN energy consumption is 1.3 TWh per year while the total electrical energy production in the world is around 20000 TWh, in the European Union 3400 TWh, in France around 500 TWh, and in Geneva canton 3 TWh.
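    These totals are easier to compare as percentages. A small illustrative calculation using only the figures from the paragraph above (note that it compares CERN’s consumption against each region’s production):

```python
cern_twh = 1.3  # total CERN electricity consumption per year, from the text

# Annual electricity production in TWh, from the text.
production_twh = {
    "the world": 20_000,
    "the European Union": 3_400,
    "France": 500,
    "the Geneva canton": 3,
}

shares = {region: 100 * cern_twh / total for region, total in production_twh.items()}
for region, pct in shares.items():
    print(f"CERN consumes ~{pct:.2f}% of {region}'s annual production")
```

    The contrast is striking: a negligible fraction of French output, but over 40% of what the Geneva canton itself produces.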

    What are the main achievements of the LHC so far?

    10 September 2008: LHC first beam (see press release)

    23 November 2009: LHC first collisions (see press release)

    30 November 2009: world record with beam energy of 1.18 TeV (see press release)

    16 December 2009: world record with collisions at 2.36 TeV and significant quantities of data recorded (see press release)

    March 2010: first beams at 3.5 TeV (19 March) and first high energy collisions at 7 TeV (30 March) (see press release)

    8 November 2010: LHC first lead-ion beams (see press release)

    22 April 2011: LHC sets new world record beam intensity (see press release)

    5 April 2012: First collisions at 8 TeV (see press release)

    4 July 2012: Announcement of the discovery of a Higgs-like particle at CERN (see press release)

    For more information about the Higgs boson:
    The Higgs boson
    CERN and the Higgs boson
    The Basics of the Higgs boson
    How standard is the Higgs boson discovered in 2012?
    Higgs update 4 July

    28 September 2012: Tweet from CERN: “The LHC has reached its target for 2012 by delivering 15 fb-1 (around a million billion collisions) to ATLAS and CMS ”

    14 February 2013: At 7:24 a.m., the last beams for physics were absorbed into the LHC, marking the end of Run 1 and the beginning of Long Shutdown 1 (see press release)

    8 October 2013: Physics Nobel prize to François Englert and Peter Higgs “for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider” (see press release)

    See LHC Milestones.

    See the full article here.



     
  • richardmitnick 1:45 pm on August 16, 2016 Permalink | Reply
    Tags: Accelerator Science, Big PanDA

    From BNL: “Big PanDA Tackles Big Data for Physics and Other Future Extreme Scale Scientific Applications” 

    Brookhaven Lab

    August 16, 2016
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350
    Peter Genzer
    (631) 344-3174
    genzer@bnl.gov

    A workload management system developed by a team including physicists from Brookhaven National Laboratory taps into unused processing time on the Titan supercomputer at the Oak Ridge Leadership Computing Facility to tackle complex physics problems. New funding will help the group extend this approach, giving scientists in other data-intensive fields access to valuable supercomputing resources.

    A billion times per second, particles zooming through the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, smash into one another at nearly the speed of light, emitting subatomic debris that could help unravel the secrets of the universe.


    Collecting the data from those collisions and making it accessible to more than 6000 scientists in 45 countries, each potentially wanting to slice and analyze it in their own unique ways, is a monumental challenge that pushes the limits of the Worldwide LHC Computing Grid (WLCG), the current infrastructure for handling the LHC’s computing needs. With the move to higher collision energies at the LHC, the demand just keeps growing.

    To help meet this unprecedented demand and supplement the WLCG, a group of scientists working at U.S. Department of Energy (DOE) national laboratories and collaborating universities has developed a way to fit some of the LHC simulations that demand high computing power into untapped pockets of available computing time on one of the nation’s most powerful supercomputers—similar to the way tiny pebbles can fill the empty spaces between larger rocks in a jar. The group—from DOE’s Brookhaven National Laboratory, Oak Ridge National Laboratory (ORNL), University of Texas at Arlington, Rutgers University, and University of Tennessee, Knoxville—just received $2.1 million in funding for 2016-2017 from DOE’s Advanced Scientific Computing Research (ASCR) program to enhance this “workload management system,” known as Big PanDA, so it can help handle the LHC data demands and be used as a general workload management service at DOE’s Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility at ORNL.

    “The implementation of these ideas in an operational-scale demonstration project at OLCF could potentially increase the use of available resources at this Leadership Computing Facility by five to ten percent,” said Brookhaven physicist Alexei Klimentov, a leader on the project. “Mobilizing these previously unusable supercomputing capabilities, valued at millions of dollars per year, could quickly and effectively enable cutting-edge science in many data-intensive fields.”

    Proof-of-concept tests using the Titan supercomputer at Oak Ridge National Laboratory have been highly successful. This Leadership Computing Facility typically handles large jobs that are fit together to maximize its use. But even when fully subscribed, some 10 percent of Titan’s computing capacity might be sitting idle—too small to take on another substantial “leadership class” job, but just right for handling smaller chunks of number crunching. The Big PanDA (for Production and Distributed Analysis) system takes advantage of these unused pockets by breaking up complex data analysis jobs and simulations for the LHC’s ATLAS and ALICE experiments and “feeding” them into the “spaces” between the leadership computing jobs.

    CERN/ATLAS detector

    CERN/ALICE detector

    When enough capacity is available to run a new big job, the smaller chunks get kicked out and reinserted to fill in any remaining idle time.

    “Our team has managed to access opportunistic cycles available on Titan with no measurable negative effect on the supercomputer’s ability to handle its usual workload,” Klimentov said. He and his collaborators estimate that up to 30 million core hours or more per month may be harvested using the Big PanDA approach. From January through July of 2016, ATLAS detector simulation jobs ran for 32.7 million core hours on Titan, using only opportunistic, backfill resources. The results of the supercomputing calculations are shipped to and stored at the RHIC & ATLAS Computing Facility, a Tier 1 center for the WLCG located at Brookhaven Lab, so they can be made available to ATLAS researchers across the U.S. and around the globe.
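    The scale of the quoted figures can be sanity-checked with a back-of-envelope estimate. Assuming Titan’s publicly stated CPU core count (a number not given in the article), a ~10% idle fraction yields monthly core-hours of the same order as the “up to 30 million” cited:

```python
TITAN_CPU_CORES = 299_008  # 18,688 nodes x 16 CPU cores (public Titan spec, assumed)
IDLE_FRACTION = 0.10       # idle capacity quoted in the article
HOURS_PER_MONTH = 730      # average hours in a month

# Core-hours potentially harvestable per month from backfill alone.
core_hours = TITAN_CPU_CORES * IDLE_FRACTION * HOURS_PER_MONTH
print(f"~{core_hours / 1e6:.0f} million core-hours per month")
```

    That ~22 million core-hours per month from CPUs alone is consistent with both the 30-million-per-month estimate and the 32.7 million core-hours ATLAS consumed over seven months.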

    The goal now is to translate the success of the Big PanDA project into operational advances that will enhance how the OLCF handles all of its data-intensive computing jobs. This approach will provide an important model for future exascale computing, increasing the coherence between the technology base used for high-performance, scalable modeling and simulation and that used for data-analytic computing.

    “This is a novel and unique approach to workload management that could run on all current and future leadership computing facilities,” Klimentov said.

    Specifically, the new funding will help the team develop a production scale operational demonstration of the PanDA workflow within the OLCF computational and data resources; integrate OLCF and other leadership facilities with the Grid and Clouds; and help high-energy and nuclear physicists at ATLAS and ALICE—experiments that expect to collect 10 to 100 times more data during the next 3 to 5 years—achieve scientific breakthroughs at times of peak LHC demand.

    As a unifying workload management system, Big PanDA will also help integrate Grid, leadership-class supercomputers, and Cloud computing into a heterogeneous computing architecture accessible to scientists all over the world as a step toward a global cyberinfrastructure.

    “The integration of heterogeneous computing centers into a single federated distributed cyberinfrastructure will allow more efficient utilization of computing and disk resources for a wide range of scientific applications,” said Klimentov, noting how the idea mirrors Aristotle’s assertion that “the whole is greater than the sum of its parts.”

    This project is supported by the DOE Office of Science.

    See the full article here.


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 3:14 pm on August 15, 2016 Permalink | Reply
    Tags: Accelerator magnets, Accelerator Science, Niobium-tin (Nb3Sn), Rutherford cable

    From CERN: “Once upon a time, there was a superconducting niobium-tin…” 

    CERN

    25 Jul 2016
    Stefania Pandolfi


    A Rutherford cabling machine is used to assemble the high-performance cables, made from state-of-the-art Nb3Sn conductor, for the LHC High-Luminosity upgrade. (Photo: Max Brice/CERN)

    Extraordinary research needs extraordinary machines: the LHC’s upgrade project, the High-Luminosity LHC (HL-LHC), has the goal of achieving instantaneous luminosities a factor of five larger than the LHC’s nominal value, and it relies on magnetic fields reaching 12 tesla. The superconducting niobium-titanium (NbTi) used in the LHC magnets can only sustain magnetic fields of up to 9-10 tesla, so an alternative superconducting material was needed. The key technology for superconducting magnets beyond 10 tesla has been found in the niobium-tin (Nb3Sn) compound.

    This compound was actually discovered in 1954, eight years before NbTi, but when the LHC was built, the greater availability and ductility of the NbTi alloy and its excellent electrical and mechanical properties led scientists to choose it over Nb3Sn.

    The renewed interest in Nb3Sn relies on the fact that it can produce stronger magnetic fields. In the HL-LHC, it will be used in the form of cables to produce strong 11 T main dipole magnets and the inner triplet quadrupole magnets that will be located at the ATLAS (Point 1) and CMS (Point 5) interaction points.

    The Nb3Sn wires that will be used in the coils of the HL-LHC magnets are made up of a copper matrix containing several filaments of about 0.05 mm in diameter. These filaments are not initially superconducting: in their reacted, superconducting form they would be too brittle to withstand the cabling process. Therefore, the unreacted, not-yet-superconducting Nb3Sn wires must first be assembled into cables and the cables then wound into a coil. Finally, the coil must be heat-treated at about 650 °C for several days to make it superconducting via a complex reaction and diffusion process.

    The cabling of the strands is done in the superconducting laboratory in Building 163, using a machine that cables together 40 unreacted strands of Nb3Sn into what is known as a Rutherford cable. The Rutherford cable is so far the only type of superconducting cable used in accelerator magnets. It consists of several wires that are highly compacted into a trapezoidal cross-section to obtain high current density.

    “The Nb3Sn cables for the 11 T dipole magnet series and for the insertion quadrupole magnets have been developed by our section here at CERN,” says Amalia Ballarino, head of the Superconductor and Superconducting Devices (SCD) section of the Magnets, Superconductors and Cryostats (MSC) group in the Technology department. “In the superconducting laboratory, in Building 163, we are now producing the series of cables for the new magnets that will be part of the HL-LHC.”

    There are several challenges connected to the cabling of the wires. First of all, the mechanical deformation due to the cabling must have a negligible influence on the shape, and therefore on the electrical performance, of the internal filaments. The deformed wire must be able to cope with the heat treatment without its performance deteriorating. To assure field quality, all the wires must be cabled, with the same tension, into a precise geometry across the whole cable length.

    “With the HL-LHC, for the first time there will be Nb3Sn magnets in an accelerator – it’s a big responsibility,” adds Ballarino. “For the HL-LHC, we are no longer in an R&D phase, and this means that we have reached the highest possible level of performance associated with the present state-of-the-art generation of Nb3Sn wires. Future higher-energy accelerators will require fundamental research on Nb3Sn wire to produce even stronger magnetic fields,” she concludes.

    See the full article here.



     
  • richardmitnick 12:00 pm on August 13, 2016 Permalink | Reply
    Tags: Accelerator Science, Light sources

    From CERN Courier: “MAX IV paves the way for ultimate X-ray microscope” 

    CERN Courier

    Sweden’s MAX IV facility is the first storage ring to employ a multi-bend achromat. Mikael Eriksson and Dieter Einfeld describe how this will produce smaller and more stable X-ray beams, taking synchrotron science closer to the X-ray diffraction limit.

    Aug 12, 2016

    Mikael Eriksson, Maxlab, Lund, Sweden,
    Dieter Einfeld, ESRF, Grenoble, France.

    http://www.lightsources.org/facility/maxiv

    Since the discovery of X-rays by Wilhelm Röntgen more than a century ago, researchers have striven to produce smaller and more intense X-ray beams. With a wavelength similar to interatomic spacings, X-rays have proved to be an invaluable tool for probing the microstructure of materials. But a higher spectral power density (or brilliance) enables a deeper study of the structural, physical and chemical properties of materials, in addition to studies of their dynamics and atomic composition.

    For the first few decades following Röntgen’s discovery, the brilliance of X-rays remained fairly constant due to technical limitations of X-ray tubes. Significant improvements came with rotating-anode sources, in which the heat generated by electrons striking an anode could be distributed over a larger area. But it was the advent of particle accelerators in the mid-1900s that gave birth to modern X-ray science. A relativistic electron beam traversing a circular storage ring emits X-rays in a tangential direction. First observed in 1947 by researchers at General Electric in the US, such synchrotron radiation has taken X-ray science into new territory by providing smaller and more intense beams.

    Generation game

    First-generation synchrotron X-ray sources were accelerators built for high-energy physics experiments, which were used “parasitically” by the nascent synchrotron X-ray community. As this community started to grow, stimulated by the increased flux and brilliance at storage rings, the need for dedicated X-ray sources with different electron-beam characteristics resulted in several second-generation X-ray sources. As with previous machines, however, the source of the X-rays was the bending magnets of the storage ring.

    The advent of special “insertion devices” led to present-day third-generation storage rings – the first being the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, and the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory in Berkeley, California, which began operation in the early 1990s.

    ESRF, Grenoble, France

    LBL/ALS

    Instead of using only the bending magnets as X-ray emitters, third-generation storage rings have straight sections that allow periodic magnet structures called undulators and wigglers to be introduced. These devices consist of rows of short magnets with alternating field directions so that the net beam deflection cancels out. Undulators can house 100 or so permanent short magnets, each emitting X-rays in the same direction, which boosts the intensity of the emitted X-rays by two orders of magnitude. Furthermore, interference effects between the emitting magnets can concentrate X-rays of a given energy by another two orders of magnitude.
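    The two two-order-of-magnitude gains mentioned above both follow from the number of undulator periods. An idealized sketch (real gains depend on electron-beam emittance and energy spread):

```python
N_PERIODS = 100  # short magnets in a typical undulator, from the text

# Each period radiates into the same narrow cone, so the incoherent flux
# grows linearly with N: ~two orders of magnitude for 100 periods.
flux_gain = N_PERIODS

# Interference between periods concentrates emission at chosen harmonics,
# contributing roughly another factor of N at those photon energies.
coherence_gain = N_PERIODS

total_gain = flux_gain * coherence_gain
print(f"combined brilliance gain ~ x{total_gain:,}")  # x10,000 over a bend magnet
```

    This idealized N-squared scaling is why undulators, rather than bending magnets, dominate modern light-source beamlines.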

    Third-generation light sources have been a major success story, thanks in part to the development of excellent modelling tools that allow accelerator physicists to produce precise lattice designs. Today, there are around 50 third-generation light sources worldwide, serving some 50,000 users in total. Each offers a number of X-ray beamlines (up to 40 at the largest facilities) that fan out from the storage ring: X-rays pass through a series of focusing and conditioning elements before reaching a sample at the end station, and the longest beamlines (measuring 150 m or more) at the largest light sources can deliver X-ray spot sizes of a few tens of nanometres. Facilities typically operate around the clock, with teams of users spending anywhere from a few hours to a few days on experimental shifts before returning to their home institutes with the data.

    Although the corresponding storage-ring technology for third-generation light sources has been regarded as mature, a revolutionary new lattice design has led to another step up in brightness. The MAX IV facility at Maxlab in Lund, Sweden, which was inaugurated in June, is the first such facility to demonstrate the new lattice. Six years in construction, the facility has demanded numerous cutting-edge technologies – including vacuum systems developed in conjunction with CERN – to become the most brilliant source of X-rays in the world.

    Iron-block magnets

    Initial ideas for the MAX IV project started at the end of the 20th century. Although the flagship of the Maxlab laboratory, the low-budget MAX II storage ring, was one of the first third-generation synchrotron radiation sources, it was soon outcompeted by several larger and more powerful sources entering operation. Something had to be done to maintain Maxlab’s accelerator programme.

    The dominant magnetic lattice at third-generation light sources consists of double-bend achromats (DBAs), which have been around since the 1970s.

    DBAs
    MAX IV undulator

    A typical storage ring contains 10–30 achromats, each consisting of two dipole magnets and a number of magnet lenses: quadrupoles for focusing and sextupoles for chromaticity correction (at MAX IV we also added octupoles to compensate for amplitude-dependent tune shifts). The achromats are flanked by straight sections housing the insertion devices, and the dimensions of the electron beam in these sections are minimised by adjusting the dispersion of the beam (which describes the dependence of an electron’s transverse position on its energy) to zero. Other storage-ring improvements, such as faster correction of the beam orbit, have also helped to boost the brightness of modern synchrotrons. The key quantity underpinning these advances is the electron-beam emittance, defined as the product of the electron-beam size and its divergence.

    Despite such improvements, however, today’s third-generation storage rings have typical electron-beam emittances of 2–5 nm rad, several hundred times larger than the diffraction limit of the X-ray beam itself. This is the point at which the size and spread of the electron beam approach the diffraction properties of the X-rays, analogous to the Abbe diffraction limit for visible light. Models of machine lattices with even smaller electron-beam emittances predict instabilities and/or short beam lifetimes that make the goal of reaching the diffraction limit at hard X-ray energies appear very distant.
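
    The diffraction limit itself is easy to quantify: for a photon wavelength λ, the smallest useful emittance is roughly λ/4π. A quick back-of-envelope check shows why a 2–5 nm rad ring falls several hundred times short at hard X-ray energies (the 3 nm rad figure below is just a representative third-generation value):

```python
import math

def diffraction_limited_emittance(wavelength_m):
    """Emittance (rad m) at which the electron beam matches the photon
    diffraction limit, approximately lambda / (4 pi)."""
    return wavelength_m / (4.0 * math.pi)

eps_dl = diffraction_limited_emittance(1e-10)  # 1 angstrom hard X-rays
eps_ring = 3e-9                                # representative third-generation value
print(f"diffraction limit = {eps_dl * 1e12:.1f} pm rad")        # ~8 pm rad
print(f"ring emittance is ~{eps_ring / eps_dl:.0f}x larger")    # a few hundred
```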

    Although it had long been known that a larger number of bends decreases the emittance (and therefore increases the brilliance) of a storage ring, in the early 1990s one of the present authors (DE) and others recognised that this could be achieved by incorporating a higher number of bends into the achromats. Such a multi-bend achromat (MBA) guides electrons around corners more smoothly, reducing the growth of the horizontal emittance. A few synchrotrons already employ triple-bend achromats, and the design has also been used in several particle-physics machines, including PETRA at DESY, PEP at SLAC and LEP at CERN, proving that a storage ring with an energy of a few GeV can produce a very low emittance.
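
    The payoff of the MBA can be estimated from a well-known rule of thumb: for lattices of a similar style, the natural emittance scales roughly as E²/N³, where E is the beam energy and N the total number of dipoles. A hedged sketch, in which the reference numbers are illustrative rather than exact machine parameters:

```python
def scaled_emittance(e_gev, n_dipoles, ref):
    """Rule-of-thumb emittance scaling, eps ~ E^2 / N^3, relative to a
    reference ring given as (E_ref_GeV, N_ref, eps_ref_m_rad). Ignores
    lattice-specific constants, so expect order-of-magnitude accuracy only."""
    e_ref, n_ref, eps_ref = ref
    return eps_ref * (e_gev / e_ref)**2 * (n_ref / n_dipoles)**3

# Illustrative reference: a DBA ring at 6 GeV with 64 dipoles and 4 nm rad.
# A 3 GeV seven-bend-achromat ring with 20 achromats has 140 dipoles:
eps = scaled_emittance(3.0, 140, (6.0, 64, 4e-9))
print(f"estimated emittance = {eps * 1e9:.2f} nm rad")  # ~0.1 nm rad
```

    The crude estimate lands within a factor of a few of MAX IV’s 0.31 nm rad design value, which is as much as a scaling law can promise.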

    DESY Petra III interior
    DESY Petra III

    PEP II at SLAC. http://www.sciencephoto.com/media/613/view

    CERN LEP

    To avoid prohibitively large machines, however, the MBA demands much smaller magnets than are currently employed at third-generation synchrotrons.

    In 1995, our calculations showed that a seven-bend achromat could yield an emittance of 0.4 nm rad for a 400 m-circumference machine – 10 times lower than the ESRF’s value at the time. The accelerator community also considered a six-bend achromat for the Swiss Light Source and a five-bend achromat for a Canadian light source, but the small number of achromats in these lattices meant that it was difficult to make significant progress towards a diffraction-limited source. One of us (ME) took the seven-bend achromat idea and turned it into a real engineering proposal for the design of MAX IV. But the design then went through a number of evolutions. In 2002, the first layout of a potential new source was presented: a 277 m-circumference, seven-bend lattice that would reach an emittance of 1 nm rad for a 3 GeV electron beam. By 2008, we had settled on an improved design: a 520 m-circumference, seven-bend lattice with an emittance of 0.31 nm rad, which will be reduced by a factor of two once the storage ring is fully equipped with undulators. This is more or less the design of the final MAX IV storage ring.

    In total, the team at Maxlab spent almost a decade finding ways to keep the lattice circumference at a financially realistic value, and even constructed a 36 m-circumference storage ring called MAX III to develop the necessary compact magnet technology. There were dozens of problems to overcome. For example, because the electron density was so high, we had to elongate the electron bunches by a factor of four using a second radio-frequency (RF) cavity system.

    Block concept

    MAX IV stands out in that it contains two storage rings, operated at 1.5 and 3 GeV. Because the rings run at different energies and share an injector and other infrastructure, high-quality undulator radiation can be produced over a wide spectral range at marginal additional cost. The storage rings are fed electrons by a 3 GeV S-band linac made up of 18 accelerator units, each comprising one SLAC Energy Doubler RF station. To optimise economy over a potential three-decade operational lifetime, and to favour redundancy, a low accelerating gradient is used.

    The 1.5 GeV ring at MAX IV consists of 12 DBAs, each comprising one solid-steel block that houses all the DBA magnets (bends and lenses). This magnet-block concept, which is also used in the 3 GeV ring, has several advantages. First, it enables the magnets to be machined with high precision and aligned to a tolerance of less than 10 μm without investment in alignment laboratories. Second, blocks with a handful of individual magnets arrive wired and plumbed directly from the supplier, and no special girders are needed because the magnet blocks are rigidly self-supporting. Last, the magnet-block concept is a low-cost solution.

    We also needed to build a different vacuum system, because the small vacuum tube dimensions (2 cm in diameter) yield a very poor vacuum conductance. Rather than try to implement closely spaced pumps in such a compact geometry, our solution was to build 100% NEG-coated vacuum systems in the achromats. NEG (non-evaporable getter) technology, which was pioneered at CERN and other laboratories, uses metallic surface sorption to achieve extreme vacuum conditions. The construction of the MAX IV vacuum system raised some interesting challenges, but fortunately CERN had already developed the NEG coating technology to perfection. We therefore entered a collaboration that saw CERN coat the most intricate parts of the system, and licences were granted to companies who manufactured the bulk of the vacuum system. Later, vacuum specialists from the Budker Institute in Novosibirsk, Russia, mounted the linac and 3 GeV-ring vacuum systems.

    Due to the small beam size and high beam current, intra-beam scattering and “Touschek” lifetime effects must also be addressed. Both arise from the high electron density in small-emittance, high-current rings, in which electrons within a bunch scatter off one another. Large energy changes push some electrons outside the energy acceptance of the ring, while smaller energy deviations cause the beam size to grow too much. For these reasons, a low-frequency (100 MHz) RF system with bunch-elongating harmonic cavities was introduced to decrease the electron density and stabilise the beam. This RF system also allows powerful commercial solid-state FM transmitters to be used as RF sources.

    When we first presented the plans for the radical MAX IV storage ring in around 2005, people working at other light sources thought we were crazy. The new lattice promised a factor of 10–100 increase in brightness over existing facilities at the time, offering users unprecedented spatial resolutions and taking storage rings within reach of the diffraction limit. Construction of MAX IV began in 2010 and commissioning began in August 2014, with regular user operation scheduled for early 2017.

    On 25 August 2015, an amazed accelerator staff sat looking at the beam-position monitor read-outs at MAX IV’s 3 GeV ring. With just the calculated magnetic settings plugged in, and the precisely CNC-machined magnet blocks, each containing a handful of integrated magnets, the beam went around turn after turn with proper behaviour. For the 3 GeV ring, a number of problems remained to be solved. These included dynamic issues – such as betatron tunes, dispersion, chromaticity and emittance – in addition to more trivial technical problems such as sparking RF cavities and faulty power supplies.

    As of MAX IV’s inauguration on 21 June, the injector linac and the 3 GeV ring are operational, with the linac also delivering X-rays to the Short Pulse Facility. A circulating current of 180 mA can be stored in the 3 GeV ring with a lifetime of around 10 h, and we have verified the design emittance with a value in the region of 300 pm rad. Beamline commissioning is also well under way, with some 14 beamlines under construction and a goal to increase that number to more than 20.

    Sweden has a well-established synchrotron-radiation user community, although around half of MAX IV users will come from other countries. A variety of disciplines and techniques are represented nationally, which must be mirrored by MAX IV’s beamline portfolio. Detailed discussions between universities, industry and the MAX IV laboratory therefore take place prior to any major beamline decisions. The high brilliance of the MAX IV 3 GeV ring and the temporal characteristics of the Short Pulse Facility are a prerequisite for the most advanced beamlines, with imaging being one promising application.

    Towards the diffraction limit

    MAX IV could not have reached its goals without a dedicated staff and help from other institutes. CERN helped us with the intricate NEG-coated vacuum system; the Budker Institute assisted with the installation of the linac and ring vacuum systems; the brand-new Solaris light source in Krakow, Poland (an exact copy of the MAX IV 1.5 GeV ring) has helped with operations; and many other labs have offered advice. The MAX IV facility has also been noted for its environmental credentials: its energy consumption is reduced by high-efficiency RF amplifiers and small magnets with low power consumption. Even the water-cooling system of MAX IV transfers heat to the nearby city of Lund to warm houses.

    The MAX IV ring is the first of the MBA kind, but several MBA rings are now under construction at other facilities, including the ESRF, Sirius in Brazil and the Advanced Photon Source (APS) at Argonne National Laboratory (ANL) in the US.

    ANL/APS

    The ESRF is developing a hybrid MBA lattice that would enter operation in 2019 and achieve a horizontal emittance of 0.15 nm rad. The APS has decided to pursue a similar design that could enter operation by the end of the decade and, being larger than the ESRF, the APS can strive for an even lower emittance of around 0.07 nm rad. Meanwhile, the ALS in California is moving towards a conceptual design report, and Spring-8 in Japan is pursuing a hybrid MBA that will enter operation on a similar timescale.

    Indeed, some 10 such rings are currently under construction or planned. We can therefore look forward to a new generation of synchrotron storage rings delivering X-rays with very high transverse coherence. We will then have witnessed an increase of 13–14 orders of magnitude in the brightness of synchrotron X-ray sources over a period of seven decades, putting the diffraction limit at high X-ray energies firmly within reach.
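
    To put that growth in perspective, 13–14 orders of magnitude in seven decades corresponds to a doubling time of roughly a year and a half, faster than the roughly two-year cadence usually quoted for Moore’s law. Simple arithmetic on the figures quoted above:

```python
import math

orders_of_magnitude = 13.5   # mid-point of the 13-14 quoted above
years = 70.0                 # "seven decades"

doublings = orders_of_magnitude * math.log2(10)   # ~45 doublings
doubling_time = years / doublings
print(f"~{doublings:.0f} doublings, one every {doubling_time:.1f} years")
```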

    One proposal would see such a diffraction-limited X-ray source installed in the 6.3 km-circumference tunnel that once housed the Tevatron collider at Fermilab, Chicago. Perhaps a more plausible scenario is PETRA IV at DESY in Hamburg, Germany. Currently the PETRA III ring is one of the brightest in the world, but this upgrade (if it is funded) could result in an emittance of 0.007 nm rad (7 pm rad) or even lower. Storage rings will then have reached the diffraction limit at an X-ray wavelength of 1 Å. This is the Holy Grail of X-ray science, providing the highest resolution and signal-to-noise ratio possible, in addition to the lowest radiation damage and the fastest data collection. Such an X-ray microscope will allow the study of ultrafast chemical reactions and other processes, taking us to the next chapter in synchrotron X-ray science.

    Further reading

    E Al-Dmour et al. 2014 J. Synchrotron Rad. 21 878.
    D Einfeld et al. 1995 Proceedings: PAC p177.
    M Eriksson et al. 2008 NIM-A 587 221.
    M Eriksson et al. 2016 IPAC 2016, MOYAA01, Busan, Korea.
    MAX IV Detailed Design Report http://www.maxlab.lu.se/maxlab/max4/index.html.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 7:17 am on August 13, 2016 Permalink | Reply
    Tags: Accelerator Science, , , , , ,   

    From Quanta: “What No New Particles Means for Physics” 

    Quanta Magazine

    August 9, 2016
    Natalie Wolchover

    Olena Shmahalo/Quanta Magazine

    Physicists at the Large Hadron Collider (LHC) in Europe have explored the properties of nature at higher energies than ever before, and they have found something profound: nothing new.

    It’s perhaps the one thing that no one predicted 30 years ago when the project was first conceived.

    The infamous “diphoton bump” that arose in data plots in December has disappeared, indicating that it was a fleeting statistical fluctuation rather than a revolutionary new fundamental particle. And in fact, the machine’s collisions have so far conjured up no particles at all beyond those catalogued in the long-reigning but incomplete “Standard Model” of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    In the collision debris, physicists have found no particles that could comprise dark matter, no siblings or cousins of the Higgs boson, no sign of extra dimensions, no leptoquarks — and above all, none of the desperately sought supersymmetry particles that would round out equations and satisfy “naturalness,” a deep principle about how the laws of nature ought to work.

    CERN ATLAS Higgs Event

    CERN CMS Higgs Event

    “It’s striking that we’ve thought about these things for 30 years and we have not made one correct prediction that they have seen,” said Nima Arkani-Hamed, a professor of physics at the Institute for Advanced Study in Princeton, N.J.

    The news has emerged at the International Conference on High Energy Physics in Chicago over the past few days in presentations by the ATLAS and CMS experiments, whose cathedral-like detectors sit at 6 and 12 o’clock on the LHC’s 17-mile ring.

    CERN/ATLAS detector

    CERN/CMS Detector

    Both teams, each with over 3,000 members, have been working feverishly for the past three months analyzing a glut of data from a machine that is finally running at full throttle after being upgraded to nearly double its previous operating energy. It now collides protons with 13 trillion electron volts (TeV) of energy — more than 13,000 times the protons’ individual masses — providing enough raw material to beget gargantuan elementary particles, should any exist.

    Lucy Reading-Ikkanda for Quanta Magazine

    So far, none have materialized. Especially heartbreaking for many is the loss of the diphoton bump, an excess of pairs of photons that cropped up in last year’s teaser batch of 13-TeV data, and whose origin has been the speculation of some 500 papers by theorists. Rumors about the bump’s disappearance in this year’s data began leaking in June, triggering a community-wide “diphoton hangover.”

    “It would have single-handedly pointed to a very exciting future for particle experiments,” said Raman Sundrum, a theoretical physicist at the University of Maryland. “Its absence puts us back to where we were.”

    The lack of new physics deepens a crisis that started in 2012 during the LHC’s first run, when it became clear that its 8-TeV collisions would not generate any new physics beyond the Standard Model. (The Higgs boson, discovered that year, was the Standard Model’s final puzzle piece, rather than an extension of it.) A white-knight particle could still show up later this year or next year, or, as statistics accrue over a longer time scale, subtle surprises in the behavior of the known particles could indirectly hint at new physics. But theorists are increasingly bracing themselves for their “nightmare scenario,” in which the LHC offers no path at all toward a more complete theory of nature.

    Some theorists argue that the time has already come for the whole field to start reckoning with the message of the null results. The absence of new particles almost certainly means that the laws of physics are not natural in the way physicists long assumed they are. “Naturalness is so well-motivated,” Sundrum said, “that its actual absence is a major discovery.”

    Missing Pieces

    The main reason physicists felt sure that the Standard Model could not be the whole story is that its linchpin, the Higgs boson, has a highly unnatural-seeming mass. In the equations of the Standard Model, the Higgs is coupled to many other particles. This coupling endows those particles with mass, allowing them in turn to drive the value of the Higgs mass to and fro, like competitors in a tug-of-war. Some of the competitors are extremely strong — hypothetical particles associated with gravity might contribute (or deduct) as much as 10 million billion TeV to the Higgs mass — yet somehow its mass ends up as 0.125 TeV, as if the competitors in the tug-of-war finish in a near-perfect tie. This seems absurd — unless there is some reasonable explanation for why the competing teams are so evenly matched.
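
    The scale of that near-perfect tie is worth making explicit. Using the figures in the paragraph above, a toy calculation shows the degree of cancellation required:

```python
# Figures from the article: hypothetical gravitational contributions of up to
# 10 million billion TeV pulling on a Higgs mass observed at 0.125 TeV.
contribution_tev = 1e16   # 10 million billion TeV
higgs_mass_tev = 0.125    # observed Higgs mass

tuning = contribution_tev / higgs_mass_tev
print(f"contributions must cancel to ~1 part in {tuning:.0e}")  # i.e. ~10^17
```

    A tug-of-war that finishes within one part in 10^17 of a perfect tie is exactly what physicists mean when they call the Higgs mass unnatural.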

    Maria Spiropulu of the California Institute of Technology, pictured in the LHC’s CMS control room, brushed aside talk of a nightmare scenario, saying, “Experimentalists have no religion.” Courtesy of Maria Spiropulu

    Supersymmetry, as theorists realized in the early 1980s, does the trick. It says that for every “fermion” that exists in nature — a particle of matter, such as an electron or quark, that adds to the Higgs mass — there is a supersymmetric “boson,” or force-carrying particle, that subtracts from the Higgs mass. This way, every participant in the tug-of-war game has a rival of equal strength, and the Higgs is naturally stabilized. Theorists devised alternative proposals for how naturalness might be achieved, but supersymmetry had additional arguments in its favor: It caused the strengths of the three quantum forces to exactly converge at high energies, suggesting they were unified at the beginning of the universe. And it supplied an inert, stable particle of just the right mass to be dark matter.

    “We had figured it all out,” said Maria Spiropulu, a particle physicist at the California Institute of Technology and a member of CMS. “If you ask people of my generation, we were almost taught that supersymmetry is there even if we haven’t discovered it. We believed it.”

    Standard model of Supersymmetry DESY

    Hence the surprise when the supersymmetric partners of the known particles didn’t show up — first at the Large Electron-Positron Collider in the 1990s, then at the Tevatron in the 1990s and early 2000s, and now at the LHC. As the colliders have searched ever-higher energies, the gap has widened between the known particles and their hypothetical superpartners, which must be much heavier in order to have avoided detection. Ultimately, supersymmetry becomes so “broken” that the effects of the particles and their superpartners on the Higgs mass no longer cancel out, and supersymmetry fails as a solution to the naturalness problem. Some experts argue that we’ve passed that point already. Others, allowing for more freedom in how certain factors are arranged, say it is happening right now, with ATLAS and CMS excluding the stop quark — the hypothetical superpartner of the 0.173-TeV top quark — up to a mass of 1 TeV. That’s already a nearly sixfold imbalance between the top and the stop in the Higgs tug-of-war. Even if a stop heavier than 1 TeV exists, it would be pulling too hard on the Higgs to solve the problem it was invented to address.

    “I think 1 TeV is a psychological limit,” said Albert de Roeck, a senior research scientist at CERN, the laboratory that houses the LHC, and a professor at the University of Antwerp in Belgium.

    Some will say that enough is enough, but for others there are still loopholes to cling to. Among the myriad supersymmetric extensions of the Standard Model, there are more complicated versions in which stop quarks heavier than 1 TeV conspire with additional supersymmetric particles to counterbalance the top quark, tuning the Higgs mass. The theory has so many variants, or individual “models,” that killing it outright is almost impossible. Joe Incandela, a physicist at the University of California, Santa Barbara, who announced the discovery of the Higgs boson on behalf of the CMS collaboration in 2012, and who now leads one of the stop-quark searches, said, “If you see something, you can make a model-independent statement that you see something. Seeing nothing is a little more complicated.”

    Particles can hide in nooks and crannies. If, for example, the stop quark and the lightest neutralino (supersymmetry’s candidate for dark matter) happen to have nearly the same mass, they might have stayed hidden so far. The reason for this is that, when a stop quark is created in a collision and decays, producing a neutralino, very little energy will be freed up to take the form of motion. “When the stop decays, there’s a dark-matter particle just kind of sitting there,” explained Kyle Cranmer of New York University, a member of ATLAS. “You don’t see it. So in those regions it’s very difficult to look for.” In that case, a stop quark with a mass as low as 0.6 TeV could still be hiding in the data.

    Experimentalists will strive to close these loopholes in the coming years, or to dig out the hidden particles. Meanwhile, theorists who are ready to move on face the fact that they have no signposts from nature about which way to go. “It’s a very muddled and uncertain situation,” Arkani-Hamed said.

    New Hope

    Many particle theorists now acknowledge a long-looming possibility: that the mass of the Higgs boson is simply unnatural — its small value resulting from an accidental, fine-tuned cancellation in a cosmic game of tug-of-war — and that we observe such a peculiar property because our lives depend on it. In this scenario, there are many, many universes, each shaped by different chance combinations of effects. Out of all these universes, only the ones with accidentally lightweight Higgs bosons will allow atoms to form and thus give rise to living beings. But this “anthropic” argument is widely disliked for being seemingly untestable.

    In the past two years, some theoretical physicists have started to devise totally new natural explanations for the Higgs mass that avoid the fatalism of anthropic reasoning and do not rely on new particles showing up at the LHC. Last week at CERN, while their experimental colleagues elsewhere in the building busily crunched data in search of such particles, theorists held a workshop to discuss nascent ideas such as the relaxion hypothesis — which supposes that the Higgs mass, rather than being shaped by symmetry, was sculpted dynamically by the birth of the cosmos — and possible ways to test these ideas. Nathaniel Craig of the University of California, Santa Barbara, who works on an idea called neutral naturalness, said in a phone call from the CERN workshop, “Now that everyone is past their diphoton hangover, we’re going back to these questions that are really aimed at coping with the lack of apparent new physics at the LHC.”

    Arkani-Hamed, who, along with several colleagues, recently proposed another new approach called Nnaturalness, said, “There are many theorists, myself included, who feel that we’re in a totally unique time, where the questions on the table are the really huge, structural ones, not the details of the next particle. We’re very lucky to get to live in a period like this — even if there may not be major, verified progress in our lifetimes.”

    As theorists return to their blackboards, the 6,000 experimentalists with CMS and ATLAS are reveling in their exploration of a previously uncharted realm. “Nightmare, what does it mean?” said Spiropulu, referring to theorists’ angst about the nightmare scenario. “We are exploring nature. Maybe we don’t have time to think about nightmares like that, because we are being flooded in data and we are extremely excited.”

    There’s still hope that new physics will show up. But discovering nothing, in Spiropulu’s view, is a discovery all the same — especially when it heralds the death of cherished ideas. “Experimentalists have no religion,” she said.

    Some theorists agree. Talk of disappointment is “crazy talk,” Arkani-Hamed said. “It’s actually nature! We’re learning the answer! These 6,000 people are busting their butts and you’re pouting like a little kid because you didn’t get the lollipop you wanted?”

    See the full article here.


    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 12:59 pm on July 26, 2016 Permalink | Reply
    Tags: Accelerator Science, , inSPIRE - HEP High-Energy Physics Literature Database, , ,   

    From Symmetry: “The most important website in particle physics” 

    Symmetry Mag

    Symmetry

    07/26/16
    Matthew R. Francis

    The first website to be hosted in the US has grown to be an invaluable hub for open science.

    Sandbox Studio, Chicago with Lexi Fodor

    With tens of thousands of particle physicists working in the world today, the biggest challenge a researcher can have is keeping track of what everyone else is doing. The articles they write, the collaborations they form, the experiments they run—all of those things are part of being current. After all, high-energy particle physics is a big enterprise, not the province of a few isolated people working out of basement laboratories.

    Particle physicists have a tool that helps them with that. The INSPIRE database allows scientists to search for published papers by topic, author, scholarly journal, what previous papers the authors cited and which newer papers have used it as a reference.


    “I don’t know any other discipline with such a central tool as INSPIRE,” says Sünje Dallmeier-Tiessen, an information scientist at CERN who manages INSPIRE’s open-access initiative. If you’re a high-energy physicist, “everything that relates to your daily work-life, you can find there.”

    Researchers in high-energy physics and related fields use INSPIRE for their professional profiles, job-hunting and promotional materials. They use it to keep track of other people’s research in their disciplines and for finding good resources to cite in their own papers.

    INSPIRE has been around in one form or another since 1969, says Bernard Hecker, who is in charge of SLAC’s portion of INSPIRE. “So we have a high level of credibility with people who use the service.” It is the successor of the Stanford Public Information Retrieval System (SPIRES) database, the main literature database for high-energy physics since the 1970s.

    INSPIRE contains up-to-date information about over a million papers, including those published in the major journals. INSPIRE’s database also interacts with the arXiv, a free-access site that hosts papers independently of whether they’re published in journals or not. “We text-mine everything [on the arXiv], and then provide search to the content, and search based on specific algorithms we run,” Dallmeier-Tiessen says.
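
    For readers who want to script against the database, INSPIRE also exposes a REST interface. The endpoint and parameter names below reflect the public literature API at inspirehep.net, which postdates this article; treat them as assumptions and check the current documentation before relying on them:

```python
from urllib.parse import urlencode

BASE = "https://inspirehep.net/api/literature"  # assumed public endpoint

def literature_query_url(query, size=5, sort="mostrecent"):
    """Build a search URL against INSPIRE's literature records.
    `query` uses INSPIRE's search syntax, e.g. 'a Einstein' for an author."""
    return f"{BASE}?{urlencode({'q': query, 'size': size, 'sort': sort})}"

# Example: recent papers by Arkani-Hamed with 'supersymmetry' in the title
url = literature_query_url("a Arkani-Hamed and t supersymmetry")
print(url)
```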

    In that way, INSPIRE is a powerful addition to the arXiv, which itself provides access to many articles that would otherwise require expensive journal subscriptions or exorbitant one-time fees.

    A lot of human labor is involved. The arXiv, for example, doesn’t distinguish between two people with the same last name and same first initial. “We have a strong interest in keeping dynamic profiles and disambiguating different researchers with similar names,” Hecker says.

    To that end, the INSPIRE team looks at author lists on published papers to match individual researchers with their correct institutions. This includes collaborating with the Institute of High Energy Physics in China, as well as cross-checking other databases.

    The goal, Hecker says, is “trying to find the stuff that’s directly relevant and not stuff that’s not relevant.” After all, researchers will only use the site if it’s useful, a complicated challenge that INSPIRE has met consistently. “We’re trying to optimize the time researchers spend on the site.”

    Now That’s What I Call Physics

    Every January, the INSPIRE team releases a list of the top 40 most cited articles in high-energy physics that year.

    Looking over the list for 2015, you might be forgiven for thinking it was a slow year. The most commonly referenced articles were papers from previous years, some just a few years old, a few going back several decades.

    But even in years without a blockbuster discovery such as the Higgs boson or gravitational waves, INSPIRE’s list is still a useful snapshot of where the minds of the research community are focused.

    In 2015, researchers prioritized studying the Higgs boson. The two most widely referenced articles of 2015 were the papers announcing its discovery by researchers at the ATLAS and CMS detectors at the Large Hadron Collider. The INSPIRE “top 40” for 2015 also includes the original 1964 theoretical papers by Peter Higgs, François Englert, and Robert Brout predicting the existence of the Higgs.

    Another topic that stood out in 2015 was the cosmic microwave background, a pattern of light that could tell us about conditions in the universe just after the Big Bang. Four highly cited papers, including the third most-referenced, came from the Planck cosmic microwave background experiment, with a fifth devoted to the final WMAP cosmic microwave background data.

    It seems that cosmology was on physicists’ minds. Two more top papers were the first measurements of dark energy from the late ’90s, and another two described results from the dark matter experiments LUX and XENON100.
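Conceptually, a year-end ranking like INSPIRE's top 40 is just a citation count restricted to one calendar year. A toy version in Python, with fabricated citation data (the real system, of course, works over millions of extracted references):

```python
# Count how often each paper is cited during a given year, then take
# the top N. The citation tuples below are invented for illustration.
from collections import Counter

# (citing paper, cited paper, year the citation appeared)
citations = [
    ("p4", "atlas-higgs", 2015),
    ("p5", "atlas-higgs", 2015),
    ("p5", "cms-higgs", 2015),
    ("p6", "planck-2015", 2015),
    ("p7", "planck-2015", 2014),   # outside the window, ignored
]

def top_cited(citations, year, n=40):
    counts = Counter(cited for _, cited, y in citations if y == year)
    return counts.most_common(n)

print(top_cited(citations, 2015, n=3))
# [('atlas-higgs', 2), ('cms-higgs', 1), ('planck-2015', 1)]
```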

    Open science, open data, open code

    INSPIRE grew out of the Stanford Physics Information Retrieval System (SPIRES), a database started at SLAC National Accelerator Laboratory in 1969 when the internet was in its infancy.

    After Tim Berners-Lee developed the World Wide Web at CERN, the SLAC server providing access to SPIRES became the first website hosted in the United States.

    Like high-energy physics itself, the database is international and cooperative. SLAC joined with Fermi National Accelerator Laboratory in the United States, DESY in Germany, and CERN in Switzerland, which now hosts the site, to create the modern version of INSPIRE. The newest member of the collaboration is IHEP Beijing in China. Institutions in France and Japan also collaborate on particular projects.

    INSPIRE has changed a lot since its inception, and a new version is coming out soon. The biggest change will extend INSPIRE’s database to include repositories for data and computer code.

    Starting later this year, INSPIRE will integrate with the HEPData open-data archive and the GitHub code-collaboration platform to increase visibility for both data and the code that scientists write. The INSPIRE team will also roll out a new interface, so it looks “less like something from 1995,” Hecker says.

    From its inception as a way to share printed articles by mail, INSPIRE continues to be a valuable resource for the community. With more papers coming out every year and no sign of a decrease in the number of working particle physicists, the need to build on past research, and to form collaborations, is more important than ever.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:05 pm on July 26, 2016 Permalink | Reply
    Tags: Accelerator Science, FNAL Maintenance and upgrades: the 2016 summer shutdown

    From FNAL: “Maintenance and upgrades: the 2016 summer shutdown” 

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    July 22, 2016
    Sergei Nagaitsev

    Every year, the summer shutdown of the accelerator complex provides a short break in Fermilab accelerator operations. It allows for a brief time to reflect on the successful operation of the past year. With the achievement of performance metrics and internal goals fresh in our minds, the summer shutdown is the period where we look ahead to next year’s goals. The machine improvements and installation of new beamlines, components and systems look good on paper, but the real test follows during machine operation.

    The summer shutdown also provides a chance for some much needed system maintenance. Many systems have been running all year with minimal upkeep. Some systems have been patched up to keep them operational until the anticipated shutdown. Power supply, vacuum, sump, lighting, cooling, ventilation, water supply and many more systems – all need to be maintained.

    The accelerator operators have an opportunity to work on projects, training and procedures. Some operators will help out support groups with various tasks and expand their system knowledge. The addition of the new interlock region in the Main Injector could allow the Booster Neutrino Beamline to start up several weeks early. This will give the operators some time to focus on Linac and Booster operation while work continues in the Main Injector.

    For some operators, a multiweek break from shiftwork and a chance to renormalize to working days is a welcome change. In any case, the machine startup in November, with new challenges and goals, will test even the experienced operators.

    The 2016 summer shutdown will start Aug. 1 and will last 15 weeks. Here’s a brief summary of what we’ll be doing:

    Accelerator shutdown summary

    The Proton Source Group will install the first of five full (56-cell) Marx modulators, or high-voltage pulse generators. These solid-state modulators will replace the aging tube-driven system that was originally installed in the late 1960s.

    The Booster Group will focus on installing two new radio-frequency power stations. This work also involves substantial gallery modification to make space for the two new driver and modulator stations.

    Three major jobs in the Recycler Ring will take place during the shutdown. One is to upgrade the Recycler vacuum pumps along approximately one-third of the ring circumference. A second is to install the Recycler collimators, the cornerstone of providing regular 700-kilowatt beam operations to the complex. The collimators are designed to absorb off-momentum beam in a controlled manner, containing losses in a designated area of the machine. Third, we will install a new 2.5-megahertz RF cavity, which will rebunch the beam destined for the Muon Campus.

    The NuMI Group will replace the existing beam target and perform general, shutdown-period maintenance.

    A new beamline connection from the Recycler to the Muon Campus was installed in previous shutdowns. The remaining work in the Delivery Ring and the new Muon Campus beamlines does not require a shutdown; the beamlines will be ready to commission beam to the Muon g-2 experiment next spring.

    In one sector of the switchyard, we will rework the P3 line vacuum system and install some additional diagnostic equipment in the beamline.

    In the first few weeks of the shutdown, we will also test the new Booster Neutrino Beam horn in one of the service buildings. The testing needs to be done with the operational power supply since it’s the only supply that can generate the proper pulse form to commission the horn.

    With technical help from the Particle Physics Division, the Technical Division and ESH&Q, the 15-week shutdown will be a safe and productive one.

    See the full article here.



    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. It is America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 10:14 am on July 26, 2016 Permalink | Reply
    Tags: Accelerator Science

    From Physics Today: “High-energy lab has high-energy director” 

    Physics Today

    21 July 2016
    Toni Feder

    CERN director general Fabiola Gianotti looks at what lies ahead for particle physics.

    Fabiola Gianotti in December 2015, just before she became CERN’s director general. Credit: CERN

    Fabiola Gianotti shot to prominence on 4 July 2012, with the announcement of the discovery of the Higgs boson. At the time, she was the spokesperson of ATLAS, which along with the Compact Muon Solenoid (CMS) experiment spotted the Higgs at the Large Hadron Collider (LHC) at CERN.

    CERN ATLAS Higgs Event; CERN/ATLAS detector

    CERN CMS Higgs Event; CERN/CMS Detector

    In the excitement over the Higgs discovery, Gianotti was on the cover of Time. She was hailed as among the most influential and the most inspirational women of our time. She was listed among the “leading global thinkers of 2013” by Foreign Policy magazine.

    “I am not very comfortable in the limelight,” says Gianotti. “Particle physics is a truly collaborative field. The discovery of the Higgs boson is the result of the work of thousands of physicists over more than 20 years.”

    Gianotti first went to CERN in 1987 as a graduate student at the University of Milan. She has been there ever since. And she seems comfortable at the helm, a job she has held since the beginning of this year.

    “The main challenge is to cope with so many different aspects, and switching my brain instantly from one problem to another one,” she says. “There are many challenges—human challenges, scientific challenges, technological challenges, budget challenges. But the challenges are interesting and engaging.”

    As of this summer, the LHC is in the middle of its second run, known as Run 2. In June the collider reached a record luminosity of 10³⁴ cm⁻²s⁻¹. It produces proton–proton collisions at an energy of 13 TeV. A further push to the design energy of 14 TeV may be made later in Run 2 or in Run 3, which is planned for 2021–23. An upgrade following the third run will increase the LHC’s luminosity by an order of magnitude.

    Physics Today’s Toni Feder caught up with Gianotti in June, about six months into her five-year appointment in CERN’s top job.

    PT: Last fall the ATLAS and CMS experiments both reported hints of a signal at 750 GeV. What would the implications be of finding a particle at that energy?

    GIANOTTI: At the moment, we don’t know if what the experiments observed last year is the first hint of a signal or just a fluctuation. But if the bump turns into a signal, then the implications are extraordinary. Its presumed features would not be something we can classify within the best-known scenarios for physics beyond the standard model. So it would be something unexpected, and for researchers there is nothing more exciting than a surprise.

    The experiments are analyzing the data from this year’s run and will release results in the coming weeks. We can expect them on the time scale of ICHEP in Chicago at the beginning of August. [ICHEP is the International Conference on High Energy Physics.]

    PT: The LHC is up to nearly the originally planned collision energy. The next step is to increase the luminosity. How will that be done?

    GIANOTTI: To increase the luminosity, we will have to replace components of the accelerator—for example, the magnets sitting on each side of the ATLAS and CMS collision regions. These are quadrupoles that squeeze the beams and therefore increase the interaction probability. We will replace them with higher-field, larger-aperture magnets. There are also other things we have to do to upgrade the accelerator. The present schedule for the installation of the hardware components is at the end of Run 3—that is, during the 2024–26 shutdown. The operation of the high-luminosity LHC will start after this installation, so on the time scale of 2027.

    The high-luminosity LHC will allow the experiments to collect 10 times as much data. Improving the precision will be extremely important, in particular for the interaction strength—so-called couplings—of the Higgs boson with other particles. New physics can alter these couplings from the standard-model expectation. Hence the Higgs boson is a door to new physics.

    The high-luminosity LHC will also increase the discovery potential for new physics: Experiments will be able to detect particles with masses 20% to 30% larger than before the upgrade.

    And third, if new physics is discovered at the LHC in Run 2 or Run 3, the high-luminosity LHC will allow the first precise measurements of the new physics to be performed with a very well-known accelerator and very well-known experiments. So it would provide powerful constraints on the underlying theory.
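Gianotti's point about the final-focus quadrupoles can be made quantitative. For head-on Gaussian beams, luminosity scales as L = N² n_b f / (4π σx σy), so squeezing the beam spot at the interaction point raises it directly. A sketch with approximate, LHC-like parameters (illustrative orders of magnitude, not official machine values):

```python
# Head-on luminosity of Gaussian beams and the effect of squeezing.
import math

def luminosity(n_per_bunch, n_bunches, f_rev, sigma_x, sigma_y):
    """L = N^2 n_b f / (4 pi sigma_x sigma_y), in cm^-2 s^-1."""
    return n_per_bunch**2 * n_bunches * f_rev / (4 * math.pi * sigma_x * sigma_y)

# Illustrative LHC-like parameters (order of magnitude only):
N = 1.15e11        # protons per bunch
n_b = 2808         # bunches per beam
f = 11245          # revolution frequency, Hz
sx = sy = 17e-4    # ~17 micrometre beam size, in cm

L0 = luminosity(N, n_b, f, sx, sy)            # ~1e34 cm^-2 s^-1
L_squeezed = luminosity(N, n_b, f, sx / 2, sy / 2)
print(L_squeezed / L0)   # halving both beam sizes quadruples L
```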

    PT: What are some of the activities at CERN aside from the LHC?

    GIANOTTI: I have spent my scientific career working on high-energy colliders, which are very close to my heart. However, the open questions today in particle physics are difficult and crucial, and there is no single way to attack them. We can’t say today that a high-energy collider is the way to go and let’s forget about other approaches. Or underground experiments are the way to go. Or neutrino experiments are the way to go. There is no exclusive way. I think we have to be very inclusive, and we have to address the outstanding questions with all the approaches that our discipline has developed over the decades.

    In this vein, at CERN we have a scientific diversity program. It includes the study of antimatter through a dedicated facility, the Antiproton Decelerator; precise measurements of rare decays; and many other projects. We also participate in accelerator-based neutrino programs, mainly in the US. And we are doing R&D and design studies for the future high-energy colliders: an electron–positron collider in the multi-TeV region [the Compact Linear Collider] and future circular colliders.

    PT: Japan is the most likely host for a future International Linear Collider, an electron–positron collider (see Physics Today, March 2013, page 23). What’s your sense about whether the ILC will go ahead and whether it’s the best next step for high-energy physics?

    GIANOTTI: Japan is consulting with international partners to see if a global collaboration can be built. It’s a difficult decision to be taken, and it has to be taken by the worldwide community.

    Europe will produce a new road map, the European Strategy for Particle Physics, on the time scale of 2019–20. That will be a good opportunity to think about the future of the discipline, based also on the results from the LHC Run 2 and other facilities in the world.

    PT: How is CERN affected by tight financial situations in member countries?

    GIANOTTI: CERN has been running for many years with a constant budget, with constant revenues from member states, at a level of CHF 1.2 billion [$1.2 billion] per year. We strive to squeeze the operation of the most powerful accelerator in the world, its upgrade, and other interesting projects within this budget.

    PT: Will Brexit affect CERN?

    GIANOTTI: We are not directly affected because CERN membership is not related to being members of the European Union.

    PT: You have said you have four areas that you want to maintain and expand at CERN: science, technology and innovation, education, and peaceful collaboration. Please elaborate.

    GIANOTTI: Science first. We do research in fundamental physics, with the aim of understanding the elementary particles and their interactions, which also gives us very important indications about the structure and evolution of the universe.

    In order to accomplish these scientific goals, we have to develop cutting-edge technologies in many domains, from superconducting magnets to vacuum technology, cryogenics, electronics, computing, et cetera.

    These technologies are transferred to society and find applications in many other sectors—for example, in the medical fields with imaging and cancer therapy, but also solar panels, not to mention the World Wide Web. Fundamental research requires very sophisticated instruments and is a driver of innovation.

    Another component of our mission is education and training. The CERN population is very young: The age distribution of the 12 000 citizens from all over the world working on our experiments peaks at 27 years, and almost 50% are below 35. About half of our PhD students remain in academia or research, and about half go to industry. It is our duty to prepare them to be tomorrow’s scientists or tomorrow’s employees of industry—and in any case, good citizens.

    How do we prepare them to be good citizens? CERN was created in the early 1950s to promote fundamental research and to foster peaceful collaboration among European countries after the war. Today we have scientists of more than 110 nationalities, some from countries that are in conflict with each other, some from countries that do not even recognize each other’s right to exist. And yet they work together in a peaceful way, animated by the same passion for knowledge.

    PT: You are the first woman to head CERN. What do you see as the significance of this?

    GIANOTTI: The CERN director general should be appointed on the basis of his or her capabilities to run the laboratory and not on the basis of gender arguments. This being said, I hope that my being a woman can be useful as an encouragement to girls and young women who would like to do fundamental research but might hesitate. It shows them they have the same opportunities as their male colleagues.

    See the full article here.


    “Our mission

    The mission of Physics Today is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

     