Tagged: CERN LHC

  • richardmitnick 10:01 am on August 17, 2016
    Tags: CERN LHC, CERN: Facts & Figures

    From CERN: “Facts & Figures” 


    The Large Hadron Collider (LHC) is the most powerful particle accelerator ever built. The accelerator sits in a tunnel 100 metres underground at CERN, the European Organization for Nuclear Research, on the Franco-Swiss border near Geneva, Switzerland.

    What is the LHC?

    The LHC is a particle accelerator that pushes protons or ions to near the speed of light. It consists of a 27-kilometre ring of superconducting magnets with a number of accelerating structures that boost the energy of the particles along the way.

    Why is it called the “Large Hadron Collider”?

    “Large” refers to its size, approximately 27km in circumference
    “Hadron” because it accelerates protons or ions, which belong to the group of particles called hadrons
    “Collider” because the particles form two beams travelling in opposite directions, which are made to collide at four points around the machine

    How does the LHC work?

    The CERN accelerator complex is a succession of machines with increasingly higher energies. Each machine accelerates a beam of particles to a given energy before injecting the beam into the next machine in the chain. This next machine brings the beam to an even higher energy and so on. The LHC is the last element of this chain, in which the beams reach their highest energies.

    The CERN accelerator complex (Image: CERN)

    Inside the LHC, two particle beams travel at close to the speed of light before they are made to collide. The beams travel in opposite directions in separate beam pipes – two tubes kept at ultrahigh vacuum. They are guided around the accelerator ring by a strong magnetic field maintained by superconducting electromagnets. Below a certain characteristic temperature, some materials enter a superconducting state and offer no resistance to the passage of electrical current. The electromagnets in the LHC are therefore chilled to ‑271.3°C (1.9K) – a temperature colder than outer space – to take advantage of this effect. The accelerator is connected to a vast distribution system of liquid helium, which cools the magnets, as well as to other supply services.

    What are the main goals of the LHC?

    The Standard Model of particle physics – a theory developed in the early 1970s that describes the fundamental particles and their interactions – has precisely predicted a wide variety of phenomena and so far successfully explained almost all experimental results in particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But the Standard Model is incomplete. It leaves many questions open, which the LHC will help to answer.

    What is the origin of mass? The Standard Model does not explain the origins of mass, nor why some particles are very heavy while others have no mass at all. However, theorists Robert Brout, François Englert and Peter Higgs made a proposal that was to solve this problem. The Brout-Englert-Higgs mechanism gives a mass to particles when they interact with an invisible field, now called the “Higgs field”, which pervades the universe.
    Particles that interact intensely with the Higgs field are heavy, while those that have feeble interactions are light. In the late 1980s, physicists started the search for the Higgs boson, the particle associated with the Higgs field. In July 2012, CERN announced the discovery of the Higgs boson, which confirmed the Brout-Englert-Higgs mechanism.

    CERN ATLAS Higgs Event

    CERN CMS Higgs Event

    However, finding it is not the end of the story, and researchers have to study the Higgs boson in detail to measure its properties and pin down its rarer decays.

    Will we discover evidence for supersymmetry? The Standard Model does not offer a unified description of all the fundamental forces, as it remains difficult to construct a theory of gravity similar to those for the other forces. Supersymmetry – a theory that hypothesises the existence of more massive partners of the standard particles we know – could facilitate the unification of fundamental forces.

    Standard model of Supersymmetry DESY

    What are dark matter and dark energy? The matter we know and that makes up all stars and galaxies only accounts for 4% of the content of the universe. The search is then still open for particles or phenomena responsible for dark matter (23%) and dark energy (73%).

    Why is there far more matter than antimatter in the universe? Matter and antimatter must have been produced in the same amounts at the time of the Big Bang, but from what we have observed so far, our Universe is made only of matter.

    How does the quark-gluon plasma give rise to the particles that constitute the matter of our Universe?

    Quark gluon plasma. Duke University

    For part of each year, the LHC provides collisions between lead ions, recreating conditions similar to those just after the Big Bang. When heavy ions collide at high energies they form for an instant the quark-gluon plasma, a “fireball” of hot and dense matter that can be studied by the experiments.

    How was the LHC designed?

    Scientists started thinking about the LHC in the early 1980s, when the previous accelerator, the Large Electron-Positron Collider (LEP), was not yet running. In December 1994, the CERN Council voted to approve the construction of the LHC, and in October 1995 the LHC technical design report was published.

    Contributions from Japan, the USA, India and other non-Member States accelerated the process and between 1996 and 1998, four experiments (ALICE, ATLAS, CMS and LHCb) received official approval and construction work started on the four sites.

    LHC Run 2

    What are the detectors at the LHC?

    There are seven experiments installed at the LHC: ALICE, ATLAS, CMS, LHCb, LHCf, TOTEM and MoEDAL. They use detectors to analyse the myriad of particles produced by collisions in the accelerator. These experiments are run by collaborations of scientists from institutes all over the world. Each experiment is distinct, and characterized by its detectors.

    What is the data flow from the LHC experiments?

    The CERN Data Centre stores more than 30 petabytes of data per year from the LHC experiments, enough to fill about 1.2 million Blu-ray discs, or roughly 250 years of HD video. Over 100 petabytes of data are permanently archived on tape.
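
    The disc and video equivalences above are easy to reproduce. A minimal sketch, assuming a 25 GB single-layer Blu-ray disc and an HD video bitrate of roughly 30 Mbit/s (both values are my assumptions, chosen to match the comparison, not figures given by CERN):

    ```python
    # Back-of-the-envelope check of the comparisons above (not CERN's official
    # accounting). Assumptions: a single-layer Blu-ray disc holds 25 GB and
    # "HD video" streams at roughly 30 Mbit/s.
    PETABYTE = 1e15                      # bytes, decimal convention
    annual_data = 30 * PETABYTE

    bluray_capacity = 25e9               # bytes per disc (assumption)
    print(f"Blu-ray discs: {annual_data / bluray_capacity:,.0f}")   # ~1,200,000

    hd_rate = 30e6 / 8                   # bytes per second (assumption)
    seconds_per_year = 365.25 * 24 * 3600
    print(f"Years of HD video: {annual_data / (hd_rate * seconds_per_year):.0f}")  # ~250
    ```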

    Costs for Run 1
    When the LHC is running, its exploitation costs (direct and indirect) represent about 80% of CERN's annual budget for operation, maintenance, technical stops, repairs and consolidation work, in personnel and materials (for the machine, injectors, computing and experiments).
    The directly allocated resources for the years 2009-2012 were about 1.1 billion CHF.

    Costs for LS1
    The cost of Long Shutdown 1 (22 months) is estimated at 150 million CHF. The maintenance and upgrade works represent about 100 MCHF for the LHC and 50 MCHF for the rest of the accelerator complex.

    What is the LHC power consumption?

    The total power consumption of the LHC (and experiments) is equivalent to 600 GWh per year, with a maximum of 650 GWh in 2012 when the LHC was running at 4 TeV. For Run 2, the estimated power consumption is 750 GWh per year.
    The total CERN energy consumption is 1.3 TWh per year while the total electrical energy production in the world is around 20000 TWh, in the European Union 3400 TWh, in France around 500 TWh, and in Geneva canton 3 TWh.
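
    For scale, here are the shares implied by these round figures; this is simple arithmetic on the numbers quoted above, nothing more:

    ```python
    # Shares implied by the round consumption figures quoted above (TWh per year).
    lhc, cern = 0.6, 1.3
    geneva, france, eu, world = 3.0, 500.0, 3400.0, 20000.0

    print(f"LHC as a share of CERN:     {lhc / cern:.0%}")     # ~46%
    print(f"CERN as a share of Geneva:  {cern / geneva:.0%}")  # ~43%
    print(f"CERN as a share of France:  {cern / france:.2%}")  # ~0.26%
    print(f"CERN as a share of the EU:  {cern / eu:.3%}")      # ~0.038%
    print(f"CERN as a share of world:   {cern / world:.4%}")   # ~0.0065%
    ```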

    What are the main achievements of the LHC so far?

    10 September 2008: LHC first beam (see press release)

    23 November 2009: LHC first collisions (see press release)

    30 November 2009: world record with beam energy of 1.18 TeV (see press release)

    16 December 2009: world record with collisions at 2.36 TeV and significant quantities of data recorded (see press release)

    March 2010: first beams at 3.5 TeV (19 March) and first high energy collisions at 7 TeV (30 March) (see press release)

    8 November 2010: LHC first lead-ion beams (see press release)

    22 April 2011: LHC sets new world record beam intensity (see press release)

    5 April 2012: First collisions at 8 TeV (see press release)

    4 July 2012: Announcement of the discovery of a Higgs-like particle at CERN (see press release)

    For more information about the Higgs boson:
    The Higgs boson
    CERN and the Higgs boson
    The Basics of the Higgs boson
    How standard is the Higgs boson discovered in 2012?
    Higgs update 4 July

    28 September 2012: Tweet from CERN: “The LHC has reached its target for 2012 by delivering 15 fb-1 (around a million billion collisions) to ATLAS and CMS” (a rough cross-check of this figure follows the final milestone below)

    14 February 2013: At 7.24 a.m., the last beams for physics were absorbed into the LHC, marking the end of Run 1 and the beginning of the Long Shutdown 1 (see press release)

    8 October 2013: Physics Nobel prize to François Englert and Peter Higgs “for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider” (see press release)
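
    The 28 September 2012 entry above equates 15 fb-1 of delivered luminosity with roughly a million billion collisions. A minimal cross-check, assuming an inelastic proton-proton cross-section of about 70 mb at 8 TeV (an approximate, commonly quoted value, not a figure from this article):

    ```python
    # Expected number of collisions: N = sigma_inelastic * integrated luminosity.
    # Assumption: sigma_inelastic ~ 70 mb for proton-proton collisions at 8 TeV.
    MB_PER_FB = 1e12                  # 1 mb = 1e12 fb
    sigma_fb = 70 * MB_PER_FB         # cross-section in femtobarns
    lumi_fb_inv = 15                  # delivered luminosity in fb^-1
    print(f"{sigma_fb * lumi_fb_inv:.1e} collisions")   # ~1e15, about a million billion
    ```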

    See LHC Milestones.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 1:45 pm on August 16, 2016
    Tags: Big PanDA, CERN LHC

    From BNL: “Big PanDA Tackles Big Data for Physics and Other Future Extreme Scale Scientific Applications” 

    Brookhaven Lab

    August 16, 2016
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350
    Peter Genzer
    (631) 344-3174
    genzer@bnl.gov

    A workload management system developed by a team including physicists from Brookhaven National Laboratory taps into unused processing time on the Titan supercomputer at the Oak Ridge Leadership Computing Facility to tackle complex physics problems. New funding will help the group extend this approach, giving scientists in other data-intensive fields access to valuable supercomputing resources.

    A billion times per second, particles zooming through the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, smash into one another at nearly the speed of light, emitting subatomic debris that could help unravel the secrets of the universe.

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    Collecting the data from those collisions and making it accessible to more than 6000 scientists in 45 countries, each potentially wanting to slice and analyze it in their own unique ways, is a monumental challenge that pushes the limits of the Worldwide LHC Computing Grid (WLCG), the current infrastructure for handling the LHC’s computing needs. With the move to higher collision energies at the LHC, the demand just keeps growing.

    To help meet this unprecedented demand and supplement the WLCG, a group of scientists working at U.S. Department of Energy (DOE) national laboratories and collaborating universities has developed a way to fit some of the LHC simulations that demand high computing power into untapped pockets of available computing time on one of the nation’s most powerful supercomputers—similar to the way tiny pebbles can fill the empty spaces between larger rocks in a jar. The group—from DOE’s Brookhaven National Laboratory, Oak Ridge National Laboratory (ORNL), University of Texas at Arlington, Rutgers University, and University of Tennessee, Knoxville—just received $2.1 million in funding for 2016-2017 from DOE’s Advanced Scientific Computing Research (ASCR) program to enhance this “workload management system,” known as Big PanDA, so it can help handle the LHC data demands and be used as a general workload management service at DOE’s Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility at ORNL.

    “The implementation of these ideas in an operational-scale demonstration project at OLCF could potentially increase the use of available resources at this Leadership Computing Facility by five to ten percent,” said Brookhaven physicist Alexei Klimentov, a leader on the project. “Mobilizing these previously unusable supercomputing capabilities, valued at millions of dollars per year, could quickly and effectively enable cutting-edge science in many data-intensive fields.”

    Proof-of-concept tests using the Titan supercomputer at Oak Ridge National Laboratory have been highly successful. This Leadership Computing Facility typically handles large jobs that are fit together to maximize its use. But even when fully subscribed, some 10 percent of Titan’s computing capacity might be sitting idle—too small to take on another substantial “leadership class” job, but just right for handling smaller chunks of number crunching. The Big PanDA (for Production and Distributed Analysis) system takes advantage of these unused pockets by breaking up complex data analysis jobs and simulations for the LHC’s ATLAS and ALICE experiments and “feeding” them into the “spaces” between the leadership computing jobs.

    CERN/ATLAS detector

    CERN/Alice Detector
    When enough capacity is available to run a new big job, the smaller chunks get kicked out and reinserted to fill in any remaining idle time.
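
    The “pebbles between larger rocks” picture above is essentially backfill scheduling. The toy sketch below illustrates that policy in miniature; it is not the actual PanDA software or the Titan/OLCF batch-system API, and every name and number in it is hypothetical:

    ```python
    # Toy illustration of backfill scheduling: small, preemptible simulation
    # chunks are slotted into whatever cores the big "leadership-class" jobs
    # leave idle, and are evicted again when a big job needs the space.
    TOTAL_CORES = 100

    class Cluster:
        def __init__(self):
            self.leadership = {}    # big job name -> cores it occupies
            self.backfill = []      # core counts of small chunks currently running

        def idle_cores(self):
            return TOTAL_CORES - sum(self.leadership.values()) - sum(self.backfill)

        def submit_leadership(self, name, cores):
            # Evict backfill chunks (they would simply be re-queued) until it fits.
            while self.idle_cores() < cores and self.backfill:
                self.backfill.pop()
            if self.idle_cores() >= cores:
                self.leadership[name] = cores

        def fill_with_chunks(self, chunk_cores=2):
            # Keep feeding small chunks into the remaining idle pockets.
            while self.idle_cores() >= chunk_cores:
                self.backfill.append(chunk_cores)

    cluster = Cluster()
    cluster.submit_leadership("big_job_A", 90)
    cluster.fill_with_chunks()                  # 5 small chunks fill the 10 idle cores
    cluster.submit_leadership("big_job_B", 8)   # evicts chunks to make room
    print(cluster.idle_cores(), len(cluster.backfill))   # 0 idle cores, 1 chunk left
    ```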

    “Our team has managed to access opportunistic cycles available on Titan with no measurable negative effect on the supercomputer’s ability to handle its usual workload,” Klimentov said. He and his collaborators estimate that up to 30 million core hours or more per month may be harvested using the Big PanDA approach. From January through July of 2016, ATLAS detector simulation jobs ran for 32.7 million core hours on Titan, using only opportunistic, backfill resources. The results of the supercomputing calculations are shipped to and stored at the RHIC & ATLAS Computing Facility, a Tier 1 center for the WLCG located at Brookhaven Lab, so they can be made available to ATLAS researchers across the U.S. and around the globe.

    The goal now is to translate the success of the Big PanDA project into operational advances that will enhance how the OLCF handles all of its data-intensive computing jobs. This approach will provide an important model for future exascale computing, increasing the coherence between the technology base used for high-performance, scalable modeling and simulation and that used for data-analytic computing.

    “This is a novel and unique approach to workload management that could run on all current and future leadership computing facilities,” Klimentov said.

    Specifically, the new funding will help the team develop a production scale operational demonstration of the PanDA workflow within the OLCF computational and data resources; integrate OLCF and other leadership facilities with the Grid and Clouds; and help high-energy and nuclear physicists at ATLAS and ALICE—experiments that expect to collect 10 to 100 times more data during the next 3 to 5 years—achieve scientific breakthroughs at times of peak LHC demand.

    As a unifying workload management system, Big PanDA will also help integrate Grid, leadership-class supercomputers, and Cloud computing into a heterogeneous computing architecture accessible to scientists all over the world as a step toward a global cyberinfrastructure.

    “The integration of heterogeneous computing centers into a single federated distributed cyberinfrastructure will allow more efficient utilization of computing and disk resources for a wide range of scientific applications,” said Klimentov, noting how the idea mirrors Aristotle’s assertion that “the whole is greater than the sum of its parts.”

    This project is supported by the DOE Office of Science.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 3:14 pm on August 15, 2016
    Tags: Accelerator magnets, CERN LHC, Niobium-tin (Nb3Sn), Rutherford cable

    From CERN: “Once upon a time, there was a superconducting niobium-tin…” 


    25 Jul 2016 [Just now in social media.]
    Stefania Pandolfi

    CERN HL-LHC bloc

    A Rutherford cabling machine is used to assemble the high-performance cables, made from state-of-the-art Nb3Sn conductor, for the LHC High Luminosity upgrade. (Photo: Max Brice/CERN)

    Extraordinary research needs extraordinary machines: the upgrade project of the LHC, the High-Luminosity LHC (HL-LHC), has the goal of achieving instantaneous luminosities a factor of five larger than the LHC nominal value, and it relies on magnetic fields reaching 12 tesla. The superconducting niobium-titanium (NbTi) used in the LHC magnets can only sustain magnetic fields of up to 9-10 tesla, so an alternative superconducting material was needed for the new magnets. The key innovative technology for developing superconducting magnets beyond 10 tesla has been found in the niobium-tin (Nb3Sn) compound.

    This compound was actually discovered in 1954, eight years before NbTi, but when the LHC was built, the greater availability and ductility of the NbTi alloy and its excellent electrical and mechanical properties led scientists to choose it over Nb3Sn.

    The renewed interest in Nb3Sn relies on the fact that it can produce stronger magnetic fields. In the HL-LHC, it will be used in the form of cables to produce strong 11 T main dipole magnets and the inner triplet quadrupole magnets that will be located at the ATLAS (Point 1) and CMS (Point 5) interaction points.

    The Nb3Sn wires that will be used in the coils of the HL-LHC magnets are made up of a copper matrix containing several filaments about 0.05 mm in diameter. The filaments are not superconducting to begin with: once reacted, Nb3Sn is too brittle to withstand the cabling process and its superconducting properties would be degraded. Therefore, the unreacted, not-yet-superconducting Nb3Sn wires must first be assembled into cables and the cables then wound into a coil. Finally, the coil is heat-treated at about 650 °C for several days to make it superconducting via a complex reaction and diffusion process.

    The cabling of the strands is done in the superconducting laboratory in Building 163 using a machine that cables together 40 unreacted strands of Nb3Sn into what is known as a Rutherford cable. The Rutherford cable is so far the only type of superconducting cable used in accelerator magnets. It consists of several wires that are highly compacted into a trapezoidal cross section to obtain high current density.

    “The Nb3Sn cables for the 11 T dipole magnet series and for the insertion quadrupole magnets have been developed by our section here at CERN,” says Amalia Ballarino, head of the Superconductor and Superconducting Devices (SCD) section of the Magnets, Superconductors and Cryostats (MSC) group in the Technology department. “In the superconducting laboratory, in Building 163, we are now producing the series of cables for the new magnets that will be part of the HL-LHC.”

    There are several challenges connected to the cabling of the wires. First of all, the mechanical deformation due to the cabling must have a negligible influence on the shape, and therefore on the electrical performance, of the internal filaments. The deformed wire must be able to cope with the heat treatment without its performance deteriorating. To assure field quality, all the wires must be cabled, with the same tension, into a precise geometry across the whole cable length.

    “With the HL-LHC, for the first time there will be Nb3Sn magnets in an accelerator, it’s a big responsibility”, adds Ballarino. “For HL-LHC, we are not in an R&D phase anymore, and this means that we have reached the highest possible level of performance associated with the present state-of-the-art generation of Nb3Sn wires,” points out Ballarino. “Future higher-energy accelerators will require fundamental research on Nb3Sn wire to produce even stronger magnetic fields,” she concludes.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

     
  • richardmitnick 7:17 am on August 13, 2016
    Tags: CERN LHC

    From Quanta: “What No New Particles Means for Physics” 

    Quanta Magazine

    August 9, 2016
    Natalie Wolchover

    Olena Shmahalo/Quanta Magazine

    Physicists at the Large Hadron Collider (LHC) in Europe have explored the properties of nature at higher energies than ever before, and they have found something profound: nothing new.

    It’s perhaps the one thing that no one predicted 30 years ago when the project was first conceived.

    The infamous “diphoton bump” that arose in data plots in December has disappeared, indicating that it was a fleeting statistical fluctuation rather than a revolutionary new fundamental particle. And in fact, the machine’s collisions have so far conjured up no particles at all beyond those catalogued in the long-reigning but incomplete “Standard Model” of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    In the collision debris, physicists have found no particles that could comprise dark matter, no siblings or cousins of the Higgs boson, no sign of extra dimensions, no leptoquarks — and above all, none of the desperately sought supersymmetry particles that would round out equations and satisfy “naturalness,” a deep principle about how the laws of nature ought to work.

    CERN ATLAS Higgs Event

    CERN CMS Higgs Event

    “It’s striking that we’ve thought about these things for 30 years and we have not made one correct prediction that they have seen,” said Nima Arkani-Hamed, a professor of physics at the Institute for Advanced Study in Princeton, N.J.

    The news has emerged at the International Conference on High Energy Physics in Chicago over the past few days in presentations by the ATLAS and CMS experiments, whose cathedral-like detectors sit at 6 and 12 o’clock on the LHC’s 17-mile ring.

    CERN/ATLAS detector

    CERN/CMS Detector

    Both teams, each with over 3,000 members, have been working feverishly for the past three months analyzing a glut of data from a machine that is finally running at full throttle after being upgraded to nearly double its previous operating energy. It now collides protons with 13 trillion electron volts (TeV) of energy — more than 13,000 times the protons’ individual masses — providing enough raw material to beget gargantuan elementary particles, should any exist.

    Lucy Reading-Ikkanda for Quanta Magazine

    So far, none have materialized. Especially heartbreaking for many is the loss of the diphoton bump, an excess of pairs of photons that cropped up in last year’s teaser batch of 13-TeV data, and whose origin has been the speculation of some 500 papers by theorists. Rumors about the bump’s disappearance in this year’s data began leaking in June, triggering a community-wide “diphoton hangover.”

    “It would have single-handedly pointed to a very exciting future for particle experiments,” said Raman Sundrum, a theoretical physicist at the University of Maryland. “Its absence puts us back to where we were.”

    The lack of new physics deepens a crisis that started in 2012 during the LHC’s first run, when it became clear that its 8-TeV collisions would not generate any new physics beyond the Standard Model. (The Higgs boson, discovered that year, was the Standard Model’s final puzzle piece, rather than an extension of it.) A white-knight particle could still show up later this year or next year, or, as statistics accrue over a longer time scale, subtle surprises in the behavior of the known particles could indirectly hint at new physics. But theorists are increasingly bracing themselves for their “nightmare scenario,” in which the LHC offers no path at all toward a more complete theory of nature.

    Some theorists argue that the time has already come for the whole field to start reckoning with the message of the null results. The absence of new particles almost certainly means that the laws of physics are not natural in the way physicists long assumed they are. “Naturalness is so well-motivated,” Sundrum said, “that its actual absence is a major discovery.”

    Missing Pieces

    The main reason physicists felt sure that the Standard Model could not be the whole story is that its linchpin, the Higgs boson, has a highly unnatural-seeming mass. In the equations of the Standard Model, the Higgs is coupled to many other particles. This coupling endows those particles with mass, allowing them in turn to drive the value of the Higgs mass to and fro, like competitors in a tug-of-war. Some of the competitors are extremely strong — hypothetical particles associated with gravity might contribute (or deduct) as much as 10 million billion TeV to the Higgs mass — yet somehow its mass ends up as 0.125 TeV, as if the competitors in the tug-of-war finish in a near-perfect tie. This seems absurd — unless there is some reasonable explanation for why the competing teams are so evenly matched.
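
    The tug-of-war can be written schematically. In the Standard Model, the physical Higgs mass-squared is a bare value plus quantum corrections from every particle the Higgs couples to; the largest correction, from the top quark, grows with the square of the energy scale Λ up to which the theory is assumed to hold. This is a standard textbook estimate, shown here only to indicate the size of the cancellation (it is not taken from the article):

    ```latex
    m_H^2 \;\simeq\; m_{H,\mathrm{bare}}^2 \;+\; \delta m_H^2 ,
    \qquad
    \delta m_H^2 \;\sim\; -\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2 \;+\; \cdots
    ```

    Here y_t is the top-quark Yukawa coupling. For Λ near the Planck scale this correction is of the order of the 10 million billion TeV quoted above, while the measured Higgs mass is 0.125 TeV, so the bare term has to cancel the corrections almost exactly unless new particles, for example superpartners, enter the sum with the opposite sign.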

    Maria Spiropulu of the California Institute of Technology, pictured in the LHC’s CMS control room, brushed aside talk of a nightmare scenario, saying, “Experimentalists have no religion.” Courtesy of Maria Spiropulu

    Supersymmetry, as theorists realized in the early 1980s, does the trick. It says that for every “fermion” that exists in nature — a particle of matter, such as an electron or quark, that adds to the Higgs mass — there is a supersymmetric “boson,” or force-carrying particle, that subtracts from the Higgs mass. This way, every participant in the tug-of-war game has a rival of equal strength, and the Higgs is naturally stabilized. Theorists devised alternative proposals for how naturalness might be achieved, but supersymmetry had additional arguments in its favor: It caused the strengths of the three quantum forces to exactly converge at high energies, suggesting they were unified at the beginning of the universe. And it supplied an inert, stable particle of just the right mass to be dark matter.

    “We had figured it all out,” said Maria Spiropulu, a particle physicist at the California Institute of Technology and a member of CMS. “If you ask people of my generation, we were almost taught that supersymmetry is there even if we haven’t discovered it. We believed it.”

    Standard model of Supersymmetry DESY

    Hence the surprise when the supersymmetric partners of the known particles didn’t show up — first at the Large Electron-Positron Collider in the 1990s, then at the Tevatron in the 1990s and early 2000s, and now at the LHC. As the colliders have searched ever-higher energies, the gap has widened between the known particles and their hypothetical superpartners, which must be much heavier in order to have avoided detection. Ultimately, supersymmetry becomes so “broken” that the effects of the particles and their superpartners on the Higgs mass no longer cancel out, and supersymmetry fails as a solution to the naturalness problem. Some experts argue that we’ve passed that point already. Others, allowing for more freedom in how certain factors are arranged, say it is happening right now, with ATLAS and CMS excluding the stop quark — the hypothetical superpartner of the 0.173-TeV top quark — up to a mass of 1 TeV. That’s already a nearly sixfold imbalance between the top and the stop in the Higgs tug-of-war. Even if a stop heavier than 1 TeV exists, it would be pulling too hard on the Higgs to solve the problem it was invented to address.

    “I think 1 TeV is a psychological limit,” said Albert de Roeck, a senior research scientist at CERN, the laboratory that houses the LHC, and a professor at the University of Antwerp in Belgium.

    Some will say that enough is enough, but for others there are still loopholes to cling to. Among the myriad supersymmetric extensions of the Standard Model, there are more complicated versions in which stop quarks heavier than 1 TeV conspire with additional supersymmetric particles to counterbalance the top quark, tuning the Higgs mass. The theory has so many variants, or individual “models,” that killing it outright is almost impossible. Joe Incandela, a physicist at the University of California, Santa Barbara, who announced the discovery of the Higgs boson on behalf of the CMS collaboration in 2012, and who now leads one of the stop-quark searches, said, “If you see something, you can make a model-independent statement that you see something. Seeing nothing is a little more complicated.”

    Particles can hide in nooks and crannies. If, for example, the stop quark and the lightest neutralino (supersymmetry’s candidate for dark matter) happen to have nearly the same mass, they might have stayed hidden so far. The reason for this is that, when a stop quark is created in a collision and decays, producing a neutralino, very little energy will be freed up to take the form of motion. “When the stop decays, there’s a dark-matter particle just kind of sitting there,” explained Kyle Cranmer of New York University, a member of ATLAS. “You don’t see it. So in those regions it’s very difficult to look for.” In that case, a stop quark with a mass as low as 0.6 TeV could still be hiding in the data.

    Experimentalists will strive to close these loopholes in the coming years, or to dig out the hidden particles. Meanwhile, theorists who are ready to move on face the fact that they have no signposts from nature about which way to go. “It’s a very muddled and uncertain situation,” Arkani-Hamed said.

    New Hope

    Many particle theorists now acknowledge a long-looming possibility: that the mass of the Higgs boson is simply unnatural — its small value resulting from an accidental, fine-tuned cancellation in a cosmic game of tug-of-war — and that we observe such a peculiar property because our lives depend on it. In this scenario, there are many, many universes, each shaped by different chance combinations of effects. Out of all these universes, only the ones with accidentally lightweight Higgs bosons will allow atoms to form and thus give rise to living beings. But this “anthropic” argument is widely disliked for being seemingly untestable.

    In the past two years, some theoretical physicists have started to devise totally new natural explanations for the Higgs mass that avoid the fatalism of anthropic reasoning and do not rely on new particles showing up at the LHC. Last week at CERN, while their experimental colleagues elsewhere in the building busily crunched data in search of such particles, theorists held a workshop to discuss nascent ideas such as the relaxion hypothesis — which supposes that the Higgs mass, rather than being shaped by symmetry, was sculpted dynamically by the birth of the cosmos — and possible ways to test these ideas. Nathaniel Craig of the University of California, Santa Barbara, who works on an idea called neutral naturalness, said in a phone call from the CERN workshop, “Now that everyone is past their diphoton hangover, we’re going back to these questions that are really aimed at coping with the lack of apparent new physics at the LHC.”

    Arkani-Hamed, who, along with several colleagues, recently proposed another new approach called Nnaturalness, said, “There are many theorists, myself included, who feel that we’re in a totally unique time, where the questions on the table are the really huge, structural ones, not the details of the next particle. We’re very lucky to get to live in a period like this — even if there may not be major, verified progress in our lifetimes.”

    As theorists return to their blackboards, the 6,000 experimentalists with CMS and ATLAS are reveling in their exploration of a previously uncharted realm. “Nightmare, what does it mean?” said Spiropulu, referring to theorists’ angst about the nightmare scenario. “We are exploring nature. Maybe we don’t have time to think about nightmares like that, because we are being flooded in data and we are extremely excited.”

    There’s still hope that new physics will show up. But discovering nothing, in Spiropulu’s view, is a discovery all the same — especially when it heralds the death of cherished ideas. “Experimentalists have no religion,” she said.

    Some theorists agree. Talk of disappointment is “crazy talk,” Arkani-Hamed said. “It’s actually nature! We’re learning the answer! These 6,000 people are busting their butts and you’re pouting like a little kid because you didn’t get the lollipop you wanted?”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 6:39 am on August 13, 2016
    Tags: CERN LHC

    From Nature: “Physicists need to make the case for high-energy experiments” 

    Nature

    10 August 2016
    No writer credit

    The disappearance of a tantalizing LHC signal is disappointing for those who want to build the next big accelerator.

    LHCb Experiment/LHCb Collaboration

    Science thrives on discovery, so it’s natural for physicists to mourn this week. As the high-energy-physics community gathered in Chicago on Friday, hopes were high (if cautious) that the Large Hadron Collider (LHC) at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland, had chalked up another finding to build on the discovery of the Higgs boson.

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    Not so — the bump in the data that had caused such excitement was washed away with a flood of data that revealed it to be a mere statistical fluctuation.

    Ordinarily, physicists would be satisfied if the LHC continued its bread-and-butter existence of confirming with ever-greater precision the standard model — a remarkably successful theory that is known to be incomplete.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But the excitement over the bump has left them hungry for more. As is evident from the 500 theory papers written about the bump, physics is ready for something new.

    That the LHC has not turned up anything beyond the standard model does not mean it never will. The machine has collected just one-tenth of the data that scientists hoped to amass by the end of 2022, and just 1% of those it could collect if a planned revamp to increase the intensity of collisions goes ahead.

    CERN HL-LHC bloc

    But the dry spell worries some. The idea of supersymmetry predicts that heavier counterparts to regular particles will become evident at higher collision energies.

    Standard model of Supersymmetry DESY

    Before the LHC was switched on, fans of the theory would have gambled on being able to see something by now. And if the dry spell extends to a drought, high-energy physics could descend into what some call the nightmare scenario — the collider finds nothing beyond the Higgs boson. Without ‘new’ physics, there is no thread to pull to unravel the countless mysteries that the standard model fails to account for, including dark matter and gravity.

    http://physicsworld.com/cws/article/news/2014/aug/20/china-pursues-52-km-collider-project

    There remain strong reasons to build a successor machine. But without another discovery, the public’s delight in high-energy physics could fade: there comes a time when exploration alone no longer satisfies.

    Convincing funding agencies to cough up several billion dollars to continue the same approach will therefore be tough, especially when neutrino and lab-based precision experiments cost a fraction of the price.

    FNAL LBNF/DUNE from FNAL to SURF

    Workers float on a raft in the Super-Kamiokande neutrino observatory which lies beneath Mount Kamioka in Hida, Japan. NPR, Wikipedia

    It will be physicists’ job to consider carefully the worth of pursuing that discovery strategy. And if high-energy colliders remain essential, they need to work on their sales pitch.

    See the full article here .

    See also here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 7:29 am on August 11, 2016
    Tags: CERN LHC, SixTrack

    From SixTrack 

    BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing, developed at UC Berkeley.

    26 Jul 2016

    LHC Sixtrack

    The members of the SixTrack project from LHC@Home would like to thank all the volunteers who made their CPUs available to us! Your contribution is precious, as in our studies we need to scan a rather large parameter space in order to find the best working points for our machines, and this would be hard to do without the computing power you all offer to us!

    Since 2012 we have been performing measurements with beam dedicated to probing what we call the “dynamic aperture” (DA). This is the region in phase space where particles can move without experiencing a large increase of the amplitude of their motion. For large machines like the LHC this is an essential parameter for ensuring beam stability and allowing long data taking at the giant LHC detectors. The measurements will be benchmarked against numerical simulations, and this is the point where you play an important role! Currently we are finalising a first simulation campaign and we are in the process of writing up the results in a final document. As a next step we are going to analyse the second half of the measured data, for which a new tracking campaign will be needed. …so, stay tuned!
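
    For the curious, here is a deliberately tiny illustration of what a dynamic-aperture scan involves: launch particles at increasing amplitude in a non-linear one-turn map (a simple sextupole-like Hénon map standing in for a real lattice model), track them for many turns, and record the smallest amplitude whose motion blows up. This is only a sketch of the idea, not SixTrack itself, and the map, tune and thresholds are arbitrary choices:

    ```python
    import math

    def one_turn(x, p, mu=2 * math.pi * 0.252):
        # Sextupole-like Hénon map: a standard toy stand-in for a real lattice.
        c, s = math.cos(mu), math.sin(mu)
        p_kicked = p + x * x
        return c * x + s * p_kicked, -s * x + c * p_kicked

    def survives(x0, turns=10_000, escape=10.0):
        x, p = x0, 0.0
        for _ in range(turns):
            x, p = one_turn(x, p)
            if x * x + p * p > escape * escape:
                return False        # amplitude blew up: outside the DA
        return True

    # Scan launch amplitudes; the first unstable one bounds the dynamic aperture.
    for amp in (i * 0.05 for i in range(1, 21)):
        if not survives(amp):
            print(f"estimated dynamic aperture ~ {amp:.2f} (model units)")
            break
    else:
        print("all test amplitudes stable; extend the scan")
    ```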

    Magnets are the main components of an accelerator, and non-linearities in their fields have a direct impact on the beam dynamics. The studies we are carrying out with your help are focussed not only on the current operation of the LHC but also on its upgrade, the High Luminosity LHC (HL-LHC). The design of the new components of the machine is in its final steps, and it is essential to make sure that the field quality of the newly built magnets allows the highly demanding goals of the project to be reached. Two aspects are particularly relevant:

    specifications for the field quality of the new magnets. The criterion for deciding whether a magnet’s field quality is acceptable is based on the computed DA, which should be larger than a pre-defined lower bound. The various magnet classes are included in the simulations one by one, their impact on the DA is evaluated, and the expected field quality is varied until the DA acceptance criterion is met.

    studies of the dynamic aperture under various optics conditions, analysis of the non-linear correction system, and optics optimisation. These are essential steps in determining the field-quality goals for the magnet designers, as well as in evaluating and optimising the beam performance.

    The studies involve accelerator physicists from both CERN and SLAC.

    Long story made short, the tracking simulations we perform require significant computer resources, and BOINC is very helpful in carrying out the studies. Thanks a lot for your help!
    The SixTrack team

    Latest papers:

    R. de Maria, M. Giovannozzi, E. McIntosh (CERN), Y. Cai, Y. Nosochkov, M-H. Wang (SLAC), DYNAMIC APERTURE STUDIES FOR THE LHC HIGH LUMINOSITY LATTICE, Presented at IPAC 2015.
    Y. Nosochkov, Y. Cai, M-H. Wang (SLAC), S. Fartoukh, M. Giovannozzi, R. de Maria, E. McIntosh (CERN), SPECIFICATION OF FIELD QUALITY IN THE INTERACTION REGION MAGNETS OF THE HIGH LUMINOSITY LHC BASED ON DYNAMIC APERTURE, Presented at IPAC 2014

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    BOINC WallPaper

    Visit the BOINC web page, click on Choose projects and check out some of the very worthwhile studies you will find. Then click on Download and run BOINC software/All Versions. Download and install the current software for your 32-bit or 64-bit system, for Windows, Mac or Linux. When you install BOINC, it will install its screen savers on your system as a default. You can choose to run the various project screen savers or you can turn them off. Once BOINC is installed, in BOINC Manager/Tools, click on “Add project or account manager” to attach to projects. Many BOINC projects are listed there, but not all, and maybe not the one(s) in which you are interested. You can get the proper URL for attaching to the project at the projects’ web page(s). BOINC will never interfere with any other work on your computer.


    MAJOR PROJECTS RUNNING ON BOINC SOFTWARE

    SETI@home The search for extraterrestrial intelligence. “SETI (Search for Extraterrestrial Intelligence) is a scientific area whose goal is to detect intelligent life outside Earth. One approach, known as radio SETI, uses radio telescopes to listen for narrow-bandwidth radio signals from space. Such signals are not known to occur naturally, so a detection would provide evidence of extraterrestrial technology.

    Radio telescope signals consist primarily of noise (from celestial sources and the receiver’s electronics) and man-made signals such as TV stations, radar, and satellites. Modern radio SETI projects analyze the data digitally. More computing power enables searches to cover greater frequency ranges with more sensitivity. Radio SETI, therefore, has an insatiable appetite for computing power.

    Previous radio SETI projects have used special-purpose supercomputers, located at the telescope, to do the bulk of the data analysis. In 1995, David Gedye proposed doing radio SETI using a virtual supercomputer composed of large numbers of Internet-connected computers, and he organized the SETI@home project to explore this idea. SETI@home was originally launched in May 1999.”


    SETI@home is the birthplace of BOINC software. Originally, it only ran as a screensaver when the computer on which it was installed was doing no other work. With the power and memory available today, BOINC can run 24/7 without in any way interfering with other ongoing work.

    The famous SETI@home screen saver, a beauteous thing to behold.

    einstein@home The search for pulsars. “Einstein@Home uses your computer’s idle time to search for weak astrophysical signals from spinning neutron stars (also called pulsars) using data from the LIGO gravitational-wave detectors, the Arecibo radio telescope, and the Fermi gamma-ray satellite. Einstein@Home volunteers have already discovered more than a dozen new neutron stars, and we hope to find many more in the future. Our long-term goal is to make the first direct detections of gravitational-wave emission from spinning neutron stars. Gravitational waves were predicted by Albert Einstein almost a century ago, but have never been directly detected. Such observations would open up a new window on the universe, and usher in a new era in astronomy.”

    MilkyWay@Home “Milkyway@Home uses the BOINC platform to harness volunteered computing resources, creating a highly accurate three dimensional model of the Milky Way galaxy using data gathered by the Sloan Digital Sky Survey. This project enables research in both astroinformatics and computer science.”

    Leiden Classical “Join in and help to build a Desktop Computer Grid dedicated to general Classical Dynamics for any scientist or science student!”

    World Community Grid (WCG) World Community Grid is a special case at BOINC. WCG is part of the social initiative of IBM Corporation and the Smarter Planet. WCG has under its umbrella currently eleven disparate projects at globally wide ranging institutions and universities. Most projects relate to biological and medical subject matter. There are also projects for Clean Water and Clean Renewable Energy. WCG projects are treated respectively and respectably on their own at this blog. Watch for news.

    Rosetta@home “Rosetta@home needs your help to determine the 3-dimensional shapes of proteins in research that may ultimately lead to finding cures for some major human diseases. By running the Rosetta program on your computer while you don’t need it you will help us speed up and extend our research in ways we couldn’t possibly attempt without your help. You will also be helping our efforts at designing new proteins to fight diseases such as HIV, Malaria, Cancer, and Alzheimer’s….”

    GPUGrid.net “GPUGRID.net is a distributed computing infrastructure devoted to biomedical research. Thanks to the contribution of volunteers, GPUGRID scientists can perform molecular simulations to understand the function of proteins in health and disease.” GPUGrid is a special case in that all processor work done by the volunteers is GPU processing. There is no CPU processing, which is the more common processing. Other projects (Einstein, SETI, Milky Way) also feature GPU processing, but they offer CPU processing for those not able to do work on GPUs.


    These projects are just the oldest and most prominent projects. There are many others from which you can choose.

    There are currently some 300,000 users with about 480,000 computers working on BOINC projects. That is in a world of over one billion computers. We sure could use your help.


     
  • richardmitnick 11:36 am on August 7, 2016
    Tags: CERN LHC

    From physicsworld.com: “And so to bed for the 750 GeV bump” 

    physicsworld.com

    Aug 5, 2016
    Tushna Commissariat

    No bumps: ATLAS diphoton data – the solid black line shows the 2015 and 2016 data combined. (Courtesy: ATLAS Experiment/CERN)

    Smooth dips: CMS diphoton data – blue lines show 2015 data, red are 2016 data and black are the combined result. (Courtesy: CMS collaboration/CERN)

    After months of rumours, speculation and some 500 papers posted to the arXiv in an attempt to explain it, the ATLAS and CMS collaborations have confirmed that the small excess of diphoton events, or “bump”, at 750 GeV detected in their preliminary data is a mere statistical fluctuation that has disappeared in the light of more data. Most folks in the particle-physics community will have been unsurprised if a bit disappointed by today’s announcement at the International Conference on High Energy Physics (ICHEP) 2016, currently taking place in Chicago.

    The story began around this time last year, soon after the LHC was rebooted and began its impressive 13 TeV run, when the ATLAS collaboration saw more events than expected around the 750 GeV mass window. This bump immediately caught the interest of physicists world over, simply because there was a sniff of “new physics” around it, meaning that the Standard Model of particle physics did not predict the existence of a particle at that energy. But also, it was the first interesting data to emerge from the LHC after its momentous discovery of the Higgs boson in 2012 and if it had held, would have been one of the most exciting discoveries in modern particle physics.

    According to ATLAS, “Last year’s result triggered lively discussions in the scientific communities about possible explanations in terms of new physics and the possible production of a new, beyond-Standard-Model particle decaying to two photons. However, with the modest statistical significance from 2015, only more data could give a conclusive answer.”

    And that is precisely what both ATLAS and CMS did, by analysing the 2016 dataset that is nearly four times larger than that of last year. Sadly, both years’ data taken together reveal that the excess is not large enough to be an actual particle. “The compatibility of the 2015 and 2016 datasets, assuming a signal with mass and width given by the largest 2015 excess, is on the level of 2.7 sigma. This suggests that the observation in the 2015 data was an upward statistical fluctuation.” The CMS statement is succinctly similar: “No significant excess is observed over the Standard Model predictions.”
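
    A rough feel for why a bump can simply fade comes from counting statistics alone. The numbers below are invented purely for illustration (they are not ATLAS or CMS yields): an upward fluctuation that looks like roughly 3 sigma in a small sample is expected to wash out once about four times more data arrive, if it really was only a fluctuation.

    ```python
    import math

    def local_significance(observed, expected):
        # Crude Gaussian approximation: significance ~ excess / sqrt(background).
        return (observed - expected) / math.sqrt(expected)

    # Invented 2015-sized sample: 100 background events expected, 130 observed.
    print(round(local_significance(130, 100), 1))   # 3.0 -- looks exciting

    # Quadruple the data: the background expectation scales to 400. If the excess
    # was only a fluctuation, the new counts sit near the expectation.
    print(round(local_significance(415, 400), 1))   # 0.8 -- the "bump" is gone
    ```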

    Tommaso Dorigo, blogger and CMS collaboration member, tells me that it is wisest to “never completely believe in a new physics signal until the data are confirmed over a long time” – preferably by multiple experiments. More interestingly, he tells me that the 750 GeV bump data seemed to be a “similar signal” to the early Higgs-to-gamma-gamma data the LHC physicists saw in 2011, when they were still chasing the particle. In much the same way, more data were obtained and the Higgs “bump” went on to be an official discovery. With the 750 GeV bump, the opposite is true. “Any new physics requires really really strong evidence to be believed because your belief in the Standard Model is so high and you have seen so many fluctuations go away,” says Dorigo.

    And this is precisely what Columbia University’s Peter Woit – who blogs at Not Even Wrong – told me in March this year when I asked him how he thought the bump would play out. Woit pointed out that particle physics has a long history of “bumps” that may look intriguing at first glance, but will most likely be nothing. “If I had to guess, this will disappear,” he said, adding that the real surprise for him was that “there aren’t more bumps” considering how good the LHC team is at analysing its data and teasing out any possibilities.

    It may be fair to wonder just why so many theorists decided to work with the unconfirmed data from last year and look for a possible explanation of what kind of particle it may have been and indeed, Dorigo says that “theorists should have known better”. But on the flip-side, the Standard Model predicted many a particle long before it was eventually discovered and so it is easy to see why many were keen to come up with the perfect new model.

    Despite the hype and the eventual letdown, Dorigo is glad that this bump has got folks talking about high-energy physics. “It doesn’t matter even if it fizzles out; it’s important to keep asking ourselves these questions,” he says. The main reason for this, Dorigo explains, is that “we are at a very special junction in particle physics as we decide what new machine to build” and some input from current colliders is necessary. “Right now there is no clear direction,” he says. In light of the fact that there has been no new physics (or any hint of supersymmetry) from the LHC to date, the most likely future devices would be an electron–positron collider or, in the long term, a muon collider. But a much clearer indication is necessary before these choices are made, and for now, much more data are needed.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics

     
  • richardmitnick 11:58 am on August 5, 2016
    Tags: blip washes out - no new particle found, CERN LHC

    From Symmetry: “LHC bump fades with more data” 

    Symmetry

    08/05/16
    Sarah Charley

    ATLAS detector. Claudia Marcelloni, CERN

    Possible signs of new particle seem to have washed out in an influx of new data.

    A curious anomaly seen by two Large Hadron Collider experiments is now looking like a statistical fluctuation.

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    The anomaly—an unanticipated excess of photon pairs with a combined mass of 750 billion electronvolts—was first reported by both the ATLAS and CMS experiments in December 2015.

    CERN/CMS Detector

    Such a bump in the data could indicate the existence of a new particle. The Higgs boson, for instance, materialized in the LHC data as an excess of photon pairs with a combined mass of 125 billion electronvolts.
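
    “Combined mass” here means the invariant mass of the photon pair, reconstructed from the two measured photon energies and their opening angle (for massless photons):

    ```latex
    m_{\gamma\gamma} \;=\; \sqrt{\,2\,E_{\gamma 1}\,E_{\gamma 2}\,\bigl(1-\cos\theta_{\gamma\gamma}\bigr)}
    ```

    A genuine particle shows up as a peak in the m_γγ spectrum at its mass, while ordinary Standard Model photon pairs form a smooth, falling background, which is why an excess appears as a “bump”.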

    CERN CMS Higgs Event

    However, with only a handful of data points, the two experiments could not discern whether that was the case or whether it was the result of normal statistical variance.

    After quintupling their 13-TeV dataset between April and July this year, both experiments report that the bump has greatly diminished and, in some analyses, completely disappeared.

    What made this particular bump interesting is that both experiments saw the same anomaly in completely independent data sets, says Wade Fisher, a physicist at Michigan State University.

    “It’s like finding your car parked next to an identical copy,” he says. “That’s a very rare experience, but it doesn’t mean that you’ve discovered something new about the world. You’d have to keep track of every time it happened and compare what you observe to what you’d expect to see if your observation means anything.”

    Theorists predicted that a particle of that size could have been a heavier cousin of the Higgs boson or a graviton, the theoretical particle responsible for gravity. While data from more than 1000 trillion collisions have smoothed out this bump, scientists on the ATLAS experiment still cannot completely rule out its existence.

    “There’s up fluxes and down fluxes in statistics,” Fisher says. “Up fluctuations can sometimes look like the early signs of a new particle, and down fluctuations can sometimes make the signatures of a particle disappear. We’ll need the full 2016 data set to be more confident about what we’re seeing.”
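    Fisher’s point about fluctuations appearing and then washing out can be shown with a toy Monte Carlo. The sketch below is my own illustration under invented assumptions (a smooth, signal-free exponential mass spectrum, a fixed 700–800 GeV window and made-up sample sizes); it is not how ATLAS or CMS model their backgrounds.

        # Toy illustration: throw pseudo-experiments from a smooth, signal-free
        # spectrum and count how often a >= 2-sigma excess in a small dataset is
        # still >= 2 sigma after adding four times more independent data (roughly
        # the "quintupled dataset" situation described above). All numbers here
        # (spectrum shape, window, sample sizes) are invented for the example.
        import numpy as np

        rng = np.random.default_rng(0)
        LO, HI, SCALE, SHIFT = 700.0, 800.0, 300.0, 200.0
        WINDOW_FRACTION = np.exp(-(LO - SHIFT) / SCALE) - np.exp(-(HI - SHIFT) / SCALE)

        def count_in_window(n_events):
            masses = rng.exponential(SCALE, size=n_events) + SHIFT  # fake falling spectrum
            return np.sum((masses >= LO) & (masses < HI))

        def significance(n_in_window, n_total):
            expected = n_total * WINDOW_FRACTION
            return (n_in_window - expected) / np.sqrt(expected)

        n_small, trials, bumps, survived = 2_000, 20_000, 0, 0
        for _ in range(trials):
            small = count_in_window(n_small)
            if significance(small, n_small) >= 2.0:           # a "bump" in the early data
                bumps += 1
                full = small + count_in_window(4 * n_small)   # add 4x more independent data
                if significance(full, 5 * n_small) >= 2.0:
                    survived += 1

        print(f"early-data bumps: {bumps}; still >= 2 sigma with 5x the data: {survived}")

    In this toy, only a small fraction of the early excesses remain at the 2-sigma level once the dataset grows, mirroring what happened to the 750 GeV excess.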

    Scientists on both experiments are currently scrutinizing the huge influx of data to both better understand predicted processes and look for new physics and phenomena.

    “New physics can manifest itself in many different ways—we learn more if it surprises us rather than coming in one of the many many theories we’re already probing,” says Steve Nahn, a CMS researcher working at Fermilab. “So far the canonical Standard Model is holding up quite well, and we haven’t seen any surprises, but there’s much more data coming from the LHC, so there’s much more territory to explore.”

    See the full article here .

    See also this article from New Scientist.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:14 am on July 26, 2016 Permalink | Reply
    Tags: , , CERN LHC, , , , , ,   

    From Physics Today: “High-energy lab has high-energy director” 

    Physics Today bloc

    Physics Today

    21 July 2016
    Toni Feder

    CERN director general Fabiola Gianotti looks at what lies ahead for particle physics.

    1
    Fabiola Gianotti in December 2015, just before she became CERN’s director general. Credit: CERN

    Fabiola Gianotti shot to prominence on 4 July 2012, with the announcement of the discovery of the Higgs boson. At the time, she was the spokesperson of ATLAS, which along with the Compact Muon Solenoid (CMS) experiment spotted the Higgs at the Large Hadron Collider (LHC) at CERN.

    CERN ATLAS Higgs Event
    CERN/ATLAS detector

    CERN CMS Higgs Event
    CERN/CMS Detector

    In the excitement over the Higgs discovery, Gianotti was on the cover of Time. She was hailed as among the most influential and the most inspirational women of our time. She was listed among the “leading global thinkers of 2013” by Foreign Policy magazine.

    “I am not very comfortable in the limelight,” says Gianotti. “Particle physics is a truly collaborative field. The discovery of the Higgs boson is the result of the work of thousands of physicists over more than 20 years.”

    Gianotti first went to CERN in 1987 as a graduate student at the University of Milan. She has been there ever since. And she seems comfortable at the helm, a job she has held since the beginning of this year.

    “The main challenge is to cope with so many different aspects, and switching my brain instantly from one problem to another one,” she says. “There are many challenges—human challenges, scientific challenges, technological challenges, budget challenges. But the challenges are interesting and engaging.”

    As of this summer, the LHC is in the middle of its second run, known as Run 2. In June the collider reached a record luminosity of 10³⁴ cm⁻²s⁻¹. It produces proton–proton collisions at an energy of 13 TeV. A further push to the design energy of 14 TeV may be made later in Run 2 or in Run 3, which is planned for 2021–23. An upgrade following the third run will increase the LHC’s luminosity by an order of magnitude.
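    For a sense of scale, luminosity converts directly into a collision rate once multiplied by a cross-section. The sketch below is a back-of-the-envelope estimate; the ~80 millibarn inelastic proton–proton cross-section at 13 TeV is an approximate, assumed value used only for illustration.

        # Back-of-the-envelope: event rate = luminosity x cross-section.
        # The ~80 mb inelastic pp cross-section at 13 TeV is an assumed,
        # approximate figure, not an official number.
        MB_TO_CM2 = 1e-27                      # 1 millibarn in cm^2

        luminosity = 1e34                      # cm^-2 s^-1, the record value quoted above
        sigma_inelastic = 80.0 * MB_TO_CM2     # cm^2 (assumed)

        rate = luminosity * sigma_inelastic    # collisions per second
        print(f"~{rate:.0e} inelastic pp collisions per second")   # ~8e+08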

    Physics Today’s Toni Feder caught up with Gianotti in June, about six months into her five-year appointment in CERN’s top job.

    PT: Last fall the ATLAS and CMS experiments both reported hints of a signal at 750 GeV. What would the implications be of finding a particle at that energy?

    GIANOTTI: At the moment, we don’t know if what the experiments observed last year is the first hint of a signal or just a fluctuation. But if the bump turns into a signal, then the implications are extraordinary. Its presumed features would not be something we can classify within the best-known scenarios for physics beyond the standard model. So it would be something unexpected, and for researchers there is nothing more exciting than a surprise.

    The experiments are analyzing the data from this year’s run and will release results in the coming weeks. We can expect them on the time scale of ICHEP in Chicago at the beginning of August. [ICHEP is the International Conference on High Energy Physics.]

    PT: The LHC is up to nearly the originally planned collision energy. The next step is to increase the luminosity. How will that be done?

    GIANOTTI: To increase the luminosity, we will have to replace components of the accelerator—for example, the magnets sitting on each side of the ATLAS and CMS collision regions. These are quadrupoles that squeeze the beams and therefore increase the interaction probability. We will replace them with higher-field, larger-aperture magnets. There are also other things we have to do to upgrade the accelerator. The present schedule for the installation of the hardware components is at the end of Run 3—that is, during the 2024–26 shutdown. The operation of the high-luminosity LHC will start after this installation, so on the time scale of 2027.

    The high-luminosity LHC will allow the experiments to collect 10 times as much data. Improving the precision will be extremely important, in particular for the interaction strength—so-called couplings—of the Higgs boson with other particles. New physics can alter these couplings from the standard-model expectation. Hence the Higgs boson is a door to new physics.

    The high-luminosity LHC will also increase the discovery potential for new physics: Experiments will be able to detect particles with masses 20% to 30% larger than before the upgrade.

    And third, if new physics is discovered at the LHC in Run 2 or Run 3, the high-luminosity LHC will allow the first precise measurements of the new physics to be performed with a very well-known accelerator and very well-known experiments. So it would provide powerful constraints on the underlying theory.
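    As a rough guide to why “ten times as much data” matters, purely statistical uncertainties shrink like one over the square root of the sample size. The sketch below illustrates that scaling only; it ignores systematic uncertainties, which often dominate real coupling measurements.

        # Minimal sketch: statistical uncertainty scales roughly as 1/sqrt(N),
        # so a tenfold dataset tightens a measurement by about a factor of 3.2.
        # Systematic uncertainties are ignored in this illustration.
        import math

        def stat_uncertainty_scale(data_factor):
            """Relative statistical uncertainty after enlarging the dataset by data_factor."""
            return 1.0 / math.sqrt(data_factor)

        for factor in (2, 5, 10):
            print(f"{factor:2d}x data -> statistical uncertainty ~ "
                  f"{stat_uncertainty_scale(factor):.2f} of today's")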

    PT: What are some of the activities at CERN aside from the LHC?

    GIANOTTI: I have spent my scientific career working on high-energy colliders, which are very close to my heart. However, the open questions today in particle physics are difficult and crucial, and there is no single way to attack them. We can’t say today that a high-energy collider is the way to go and let’s forget about other approaches. Or underground experiments are the way to go. Or neutrino experiments are the way to go. There is no exclusive way. I think we have to be very inclusive, and we have to address the outstanding questions with all the approaches that our discipline has developed over the decades.

    In this vein, at CERN we have a scientific diversity program. It includes the study of antimatter through a dedicated facility, the Antiproton Decelerator; precise measurements of rare decays; and many other projects. We also participate in accelerator-based neutrino programs, mainly in the US. And we are doing R&D and design studies for the future high-energy colliders: an electron–positron collider in the multi-TeV region [the Compact Linear Collider] and future circular colliders.

    PT: Japan is the most likely host for a future International Linear Collider, an electron–positron collider (see Physics Today, March 2013, page 23). What’s your sense about whether the ILC will go ahead and whether it’s the best next step for high-energy physics?

    GIANOTTI: Japan is consulting with international partners to see if a global collaboration can be built. It’s a difficult decision to be taken, and it has to be taken by the worldwide community.

    Europe will produce a new road map, the European Strategy for Particle Physics, on the time scale of 2019–20. That will be a good opportunity to think about the future of the discipline, based also on the results from the LHC Run 2 and other facilities in the world.

    PT: How is CERN affected by tight financial situations in member countries?

    GIANOTTI: CERN has been running for many years with a constant budget, with constant revenues from member states, at a level of CHF 1.2 billion [$1.2 billion] per year. We strive to squeeze the operation of the most powerful accelerator in the world, its upgrade, and other interesting projects within this budget.

    PT: Will Brexit affect CERN?

    GIANOTTI: We are not directly affected because CERN membership is not related to being members of the European Union.

    PT: You have said you have four areas that you want to maintain and expand at CERN: science, technology and innovation, education, and peaceful collaboration. Please elaborate.

    GIANOTTI: Science first. We do research in fundamental physics, with the aim of understanding the elementary particles and their interactions, which also gives us very important indications about the structure and evolution of the universe.

    In order to accomplish these scientific goals, we have to develop cutting-edge technologies in many domains, from superconducting magnets to vacuum technology, cryogenics, electronics, computing, et cetera.

    These technologies are transferred to society and find applications in many other sectors—for example, in the medical fields with imaging and cancer therapy, but also solar panels, not to mention the World Wide Web. Fundamental research requires very sophisticated instruments and is a driver of innovation.

    Another component of our mission is education and training. The CERN population is very young: The age distribution of the 12 000 citizens from all over the world working on our experiments peaks at 27 years, and almost 50% are below 35. About half of our PhD students remain in academia or research, and about half go to industry. It is our duty to prepare them to be tomorrow’s scientists or tomorrow’s employees of industry—and in any case, good citizens.

    How do we prepare them to be good citizens? CERN was created in the early 1950s to promote fundamental research and to foster peaceful collaboration among European countries after the war. Today we have scientists of more than 110 nationalities, some from countries that are in conflict with each other, some from countries that do not even recognize each other’s right to exist. And yet they work together in a peaceful way, animated by the same passion for knowledge.

    PT: You are the first woman to head CERN. What do you see as the significance of this?

    GIANOTTI: The CERN director general should be appointed on the basis of his or her capabilities to run the laboratory and not on the basis of gender arguments. This being said, I hope that my being a woman can be useful as an encouragement to girls and young women who would like to do fundamental research but might hesitate. It shows them they have similar opportunities as their male colleagues.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Our mission

    The mission of Physics Today is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

     
  • richardmitnick 10:48 am on July 9, 2016 Permalink | Reply
    Tags: , , CERN LHC,   

    From Ethan Siegel: “Did The LHC Discover A New Type Of Particle?” 

    From Ethan Siegel

    Jul 9, 2016

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    1
    The CMS detector at CERN, one of the two most powerful particle detectors ever assembled. Image credit: CERN.

    In the quest to advance our knowledge of the Universe, the biggest advances always seem to come when an experiment or measurement indicates something new: something our best theories to that date hadn’t predicted before. We all know that the LHC is looking for fundamental particles beyond the Standard Model, including hints of supersymmetry, technicolor, extra dimensions and more. Is it possible that the LHC just discovered a new type of particle, and the results just got buried in the news? That’s the question of Andrea Lelli, who wants to know why

    ” the news about tetraquark particles discovered in the LHC was published in some scientific feeds, but it seems the news did not catch mainstream attention. Isn’t this a valuable discovery, even though tetraquarks were already theorized? What does it mean for the standard model exactly?”

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Let’s find out.

    2
    The particles and antiparticles of the Standard Model. Image credit: E. Siegel.

    When it comes to the particles we know in the Universe, we have:

    the quarks, which make up protons and neutrons (among other things);
    the leptons, including the electron and the very light neutrinos;
    the antiquarks and antileptons, the antiparticle counterparts of the above two classes;

    the photon, the particle version of what we call light;
    the gluons, which bind the quarks together and are responsible for the strong nuclear force;
    the heavy gauge bosons — the W+, W- and the Z0 — which mediate the weak interactions and radioactive decays;
    and the Higgs boson.

    The main goal of the LHC was to find the Higgs, which it did, completing the gamut of expected particles in the Standard Model.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The stretch goal, however, was to find new particles beyond the ones we expected. We hoped to find clues to the greatest unsolved problems in theoretical physics at these high energies. To find something that could provide a hint to dark matter, the matter-antimatter asymmetry of the Universe, the reason particles have the masses they do, the reason strong decays don’t occur in certain fashions, etc. To find a new fundamental particle, and to give us either experimental support for a speculative theoretical idea or to surprise us and push us in a new direction entirely.

    The closest we’ve gotten to that is a “hint” of a new particle whose decay shows up in the two-photon channel at 750 GeV. The threshold for discovery, however, requires a significance indicating there’s less than a 0.00003% chance of a fluke; the CMS and ATLAS data are at a 3% and a 10% chance of a fluke, respectively. That’s a pretty tenuous hint.
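    To translate those “chance of a fluke” numbers into the sigmas physicists usually quote, one converts a one-sided p-value into an equivalent number of Gaussian standard deviations. A minimal sketch:

        # Converting a one-sided p-value ("chance of a fluke") into the Gaussian
        # "sigma" significance. The 5-sigma discovery threshold corresponds to
        # p ~ 2.9e-7, i.e. the "less than 0.00003%" figure quoted above.
        from scipy.stats import norm

        def p_to_sigma(p):
            """One-sided p-value -> equivalent number of Gaussian sigmas."""
            return norm.isf(p)

        print(f"p at 5 sigma : {norm.sf(5.0):.1e}")        # ~2.9e-07
        print(f"p = 3%  -> {p_to_sigma(0.03):.1f} sigma")  # roughly the CMS figure quoted above
        print(f"p = 10% -> {p_to_sigma(0.10):.1f} sigma")  # roughly the ATLAS figure quoted above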

    4
    The ATLAS and CMS diphoton bumps, displayed together, clearly correlating at ~750 GeV. Image credit: CERN, CMS/ATLAS collaborations, image generated by Matt Strassler at https://profmattstrassler.com/2015/12/16/is-this-the-beginning-of-the-end-of-the-standard-model/.

    But the LHC does have a few new discoveries under its belt, although they aren’t quite fundamental discoveries in the “new particle” sense. What we got instead, however, was an announcement about the discovery of tetraquarks.

    5
    New Tetraquark Particle Found At Fermilab

    These aren’t new particles that are additions or extensions to the standard model: they don’t represent new forces, new interactions, or potential solutions to any of the great, outstanding problems of theoretical physics today. Rather, they’re entirely combinations of the existing particles that have never been seen before.

    The way quarks work is they come with a color: red, green or blue. (Antiquarks are cyan, magenta and yellow, respectively: the anti-colors of the quarks.) Gluons are exchanged between quarks to mediate the strong nuclear force, and they change the quark (or antiquark) colors when they do. But here’s the kicker: in order to exist in nature, any combination of quarks or antiquarks must be completely colorless. So you can have:

    Three quarks, since red+green+blue = colorless.
    Three antiquarks, since cyan+magenta+yellow = colorless.
    Or a quark-antiquark combination, since red+cyan (i.e., anti-red) = colorless.

    5
    Image credit: Wikipedia / Wikimedia Commons user Qashqaiilove.

    (You can also think of colors as “arrow” vectors in particular directions, and you have to get back to the origin to make something colorless.)

    The three quark combinations are known as baryons, and protons and neutrons are two such examples, along with more exotic combinations involving heavier quarks. Combinations of three antiquarks are known as anti-baryons, and include anti-protons and anti-neutrons. And the quark-antiquark combinations are known as mesons, which mediate the forces between atomic nuclei and have interesting life-and-decay properties on their own. Meson examples include the pion, the kaon, charmonium and the upsilon.

    But why stop there? Why not envision other color-free combinations? Why not something like:

    Two quarks and two antiquarks, a tetraquark?
    Or four quarks and one antiquark, a pentaquark?
    Or even something like five quarks and two antiquarks, a septaquark?

    6
    A pentaquark mass state discovered at the LHCb collaboration in 2015. The “spike” corresponds to the pentaquark. Image credit: CERN on behalf of the LHCb collaboration.

    (Having six quarks isn’t interesting or new: we already know how to make deuterium, a heavy isotope of hydrogen.) According to the Standard Model, this is not only possible, this is predicted. It’s a natural consequence of quantum chromodynamics: the science behind the strong nuclear force and those interactions.
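    The “arrows that must return to the origin” picture above can be turned into a quick check of which quark–antiquark counts can form colorless states. This is only a toy consistency check in the spirit of that picture, not genuine SU(3) colour algebra; the combinations tested are the ones discussed in the text.

        # Toy version of the "arrow" picture: red, green and blue are arrows that
        # sum to zero; anticolors are the opposite arrows. A combination can be
        # colorless if some choice of colors brings the total back to the origin.
        # Illustration only, not real QCD group theory.
        import cmath
        from itertools import product

        COLORS = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]   # red, green, blue
        ANTICOLORS = [-c for c in COLORS]                               # cyan, magenta, yellow

        def can_be_colorless(n_quarks, n_antiquarks):
            """True if some colour assignment of the (anti)quarks sums to ~0."""
            for qs in product(COLORS, repeat=n_quarks):
                for aqs in product(ANTICOLORS, repeat=n_antiquarks):
                    if abs(sum(qs) + sum(aqs)) < 1e-9:
                        return True
            return False

        for name, nq, naq in [("single quark", 1, 0), ("meson", 1, 1), ("baryon", 3, 0),
                              ("tetraquark", 2, 2), ("pentaquark", 4, 1)]:
            print(f"{name:>12}: colorless combination possible? {can_be_colorless(nq, naq)}")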

    In the early 2000s, it was claimed that pentaquarks — these five-quark/antiquark combinations — had been discovered. Unfortunately, this was premature: the 2003 result from Japan’s Laser Electron Photon Experiment at SPring-8 (LEPS) could not be reproduced, and the other mid-2000s results were of poor significance. Tetraquark states were coming out right around the same time. In 2003, the Belle experiment (also in Japan) announced a very controversial result: the discovery of a particle with a mass of 3872 MeV/c² whose quantum numbers did not match any feasible baryon or meson-like states. For the first time, we had a tetraquark candidate.

    KEK Belle detector

    7
    Colour flux tubes produced by a configuration of four static quark-and-antiquark charges, representing calculations done in lattice QCD. Image credit: Wikimedia Commons user Pedro.bicudo, under a c.c.a.-s.a.-4.0 license.

    Belle went on, in 2007, to discover two other tetraquark candidates, including the first one with charm quarks inside of it, while Fermilab also uncovered a number of tetraquark candidates. But the biggest breakthrough in these “other” combination states came in 2013, when both Belle and the BES III experiment (in China) independently reported the discovery of the first confirmed tetraquark state.

    8
    BESIII detector

    It was the first tetraquark to be directly observed experimentally. Just like pions, it comes in positively charged, negatively charged and also neutral versions.

    Since then, the LHC has taken the lead, collecting more data on high-energy hadrons than any other experiment before it. The LHCb experiment, in particular, is the one designed to observe these particles. Some tetraquark candidates — like Fermilab’s bottom-quark-containing candidate from the DØ experiment — were disfavored by the LHC. But others were directly observed, like Belle’s 2007 charm-containing tetraquark, along with many new ones. And the latest tetraquark results that you allude to, reported here in Symmetry Magazine, detail four new tetraquark particles.

    7
    The LHCb detector room at CERN. Image credit: CERN.

    Symmetry Mag
    Symmetry, a journal of FNAL and SLAC, both USA

    The cool thing about these four new particles is that they’re made up of two charm and two strange quarks apiece (with two always being the “anti” version), making these the first tetraquarks to have no light (up and down) quarks in them. And just like you can have a single electron within an atom exist in many different unique states, the way these quarks are configured means that each of these particles has unique quantum numbers, including mass, spin, parity, and charge conjugation. Physicist Thomas Britton, who did much of this work for his Ph.D., detailed:

    We looked at every known particle and process to make sure these four structures couldn’t be explained by any pre-existing physics. It was like baking a six-dimensional cake with 98 ingredients and no recipe—just a picture of a cake.

    In other words, we’re 100% positive these aren’t any normal hadrons the Standard Model could have predicted, and pretty sure these really are tetraquarks!

    8
    B mesons can decay directly into a J/Ψ (psi) particle and a Φ (phi) particle. The CDF scientists found evidence that some B mesons unexpectedly decay into an intermediate quark structure identified as a Y particle. Image credit: Symmetry Magazine.

    The way they normally show up — as the picture above details — is as an intermediate stage (indicated by Y) in some decays. This is completely allowed by the Standard Model, but it’s a very rare process and so, in some sense, it’s amazing that we have the sheer amount of data and can measure it precisely enough to detect these classes of particles at all. Tetraquarks, pentaquarks and even higher combinations are expected to be real. Perhaps most oddly of all, the Standard Model predicts the existence of glueballs, which are bound states of gluons.

    It’s important to remember that in doing these tests, and in looking for these incredibly rare and difficult-to-find states of nature, we are doing the highest-precision tests of QCD — the theory underlying the strong force — of all time. If these predicted states of quarks, antiquarks and gluons fail to materialize, then something about QCD is wrong, and that would be a way of going beyond the Standard Model, too! Finding these states is the first step; understanding the details of how they fit together, what their hierarchies are and how our known physics applies to these more and more complex systems is what comes next. As with everything in nature, the payoff for human advancement is hard to see when the initial discovery is made, but the joy of finding things out is always its own reward.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     