Tagged: CERN LHC

  • richardmitnick 3:36 pm on September 22, 2017
    Tags: ATLAS hunts for new physics with dibosons, CERN LHC

    From CERN Courier: “ATLAS hunts for new physics with dibosons” 


    CERN Courier

    Sep 22, 2017

    WZ data

    Beyond the Standard Model of particle physics (SM), crucial open questions remain such as the nature of dark matter, the overabundance of matter compared to antimatter in the universe, and also the mass scale of the scalar sector (what makes the Higgs boson so light?). Theorists have extended the SM with new symmetries or forces that address these questions, and many such extensions predict new resonances that can decay into a pair of bosons (diboson), for example: VV, Vh, Vγ and γγ, where V stands for a weak boson (W and Z), h for the Higgs boson, and γ is a photon.

    The ATLAS collaboration has a broad search programme for diboson resonances, and the most recent results using 36 fb⁻¹ of proton–proton collision data at the LHC taken at a centre-of-mass energy of 13 TeV in 2015 and 2016 have now been released. Six different final states characterised by different boson decay modes were considered in searches for a VV resonance: 4ℓ, ℓℓνν, ℓℓqq, ℓνqq, ννqq and qqqq, where ℓ, ν and q stand for charged leptons (electrons and muons), neutrinos and quarks, respectively. For the Vh resonance search, the dominant Higgs boson decay into a pair of b-quarks (branching fraction of 58%) was exploited together with four different V decays leading to ℓℓbb, ℓνbb, ννbb and qqbb final states. A Zγ resonance was sought in final states with two leptons and a photon.

    A new resonance would appear as an excess (bump) over the smoothly distributed SM background in the invariant mass distribution reconstructed from the final-state particles. The left figure shows the observed WZ mass distribution in the qqqq channel together with simulations of some example signals. An important key to probe very high-mass signals is to identify high-momentum hadronically decaying V and h bosons. ATLAS developed a new technique to reconstruct the invariant mass of such bosons combining information from the calorimeters and the central tracking detectors. The resulting improved mass resolution for reconstructed V and h bosons increased the sensitivity to very heavy signals.
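    To make the phrase “invariant mass reconstructed from the final-state particles” concrete, here is a minimal illustrative sketch (not ATLAS code; the four-momenta below are made-up numbers): in natural units the candidate resonance mass follows from m² = E² − |p|², applied to the summed four-momenta of the two reconstructed bosons.

```python
import math

def invariant_mass(p4_list):
    """Invariant mass of a set of four-momenta (E, px, py, pz) in GeV, natural units."""
    E  = sum(p[0] for p in p4_list)
    px = sum(p[1] for p in p4_list)
    py = sum(p[2] for p in p4_list)
    pz = sum(p[3] for p in p4_list)
    m2 = E**2 - (px**2 + py**2 + pz**2)
    return math.sqrt(max(m2, 0.0))   # guard against tiny negative values from resolution effects

# Hypothetical boson candidates reconstructed as large-radius jets (illustrative numbers only)
w_candidate = (510.0, 120.0,  94.0,  480.0)   # mass ~ 80 GeV, W-like
z_candidate = (620.0, -110.0, -63.0, -600.0)  # mass ~ 91 GeV, Z-like

for name, cand in [("W", w_candidate), ("Z", z_candidate)]:
    print(f"{name} candidate mass: {invariant_mass([cand]):.1f} GeV")

# The diboson system: a heavy WZ resonance would show up as a bump in the
# distribution of this quantity over many events (here the toy pair gives ~1.1 TeV).
print(f"WZ candidate mass: {invariant_mass([w_candidate, z_candidate]):.0f} GeV")
```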

    No evidence for a new resonance was observed in these searches, allowing ATLAS to set stringent exclusion limits. For example, a graviton signal predicted in a model with extra spatial dimensions was excluded up to masses of 4 TeV, while heavy weak-boson-like resonances (as predicted in composite Higgs boson models) decaying to WZ bosons are excluded for masses up to 3.3 TeV. Heavier Higgs partners can be excluded up to masses of about 350 GeV, assuming specific model parameters.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS and LHCb at the CERN LHC

     
  • richardmitnick 2:46 pm on September 22, 2017
    Tags: CERN LHC, Other CERN Linacs

    From CERN Courier: “Injecting new life into the LHC” 

    CERN Courier

    Sep 22, 2017

    Malika Meddahi
    Giovanni Rumolo

    Beam transfer magnets. No image credit

    The Large Hadron Collider (LHC) is the most famous and powerful of all CERN’s machines, colliding intense beams of protons at an energy of 13 TeV.

    CERN LHC: map, tunnel and particle-collision images

    But its success relies on a series of smaller machines in CERN’s accelerator complex that serve it. The LHC’s proton injectors have already been providing beams with characteristics exceeding the LHC’s design specifications. This decisively contributed to the excellent performance of the 2010–2013 LHC physics operation and, since 2015, has allowed CERN to push the machine beyond its nominal beam performance.

    Built between 1959 and 1976, the CERN injector complex accelerates proton beams to a kinetic energy of 450 GeV. It does this via a succession of accelerators: a linear accelerator called Linac 2 followed by three synchrotrons – the Proton Synchrotron Booster (PSB), the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS).

    CERN Linac 2. No image credit

    CERN Proton Synchrotron Booster

    CERN Proton Synchrotron

    CERN Super Proton Synchrotron

    The complex also provides the LHC with ion beams, which are first accelerated through a linear accelerator called Linac 3 and the Low Energy Ion Ring (LEIR) synchrotron before being injected into the PS and the SPS.

    CERN Linac 3

    CERN Low Energy Ion Ring (LEIR) synchrotron

    The CERN injectors, besides providing beams to the LHC, also serve a large number of fixed-target experiments at CERN – including the ISOLDE radioactive-beam facility and many others.

    CERN ISOLDE

    Part of the LHC’s success lies in the flexibility of the injectors to produce various beam parameters, such as the intensity, the spacing between proton bunches and the total number of bunches in a bunch train. This was clearly illustrated in 2016 when the LHC reached peak luminosity values 40% higher than the design value of 10³⁴ cm⁻² s⁻¹, although the number of bunches in the LHC was still about 27% below the maximum achievable. This gain was due to the production of a brighter beam with roughly the same intensity per bunch but in a beam envelope of just half the size.
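    A rough cross-check of those numbers, using the standard round-beam luminosity scaling L ∝ n_b N² f_rev / (4π σ_x σ_y) and reading “a beam envelope of just half the size” as roughly half the transverse beam area: with the same bunch intensity and about 27% fewer bunches, a halved beam area puts the peak luminosity in the region of 40–50% above design. The snippet is an illustrative back-of-the-envelope estimate, not official machine parameters, and it ignores crossing-angle and hourglass corrections.

```python
# Illustrative scaling of LHC peak luminosity with bunch count, bunch intensity and beam size:
#   L ~ n_b * N^2 * f_rev / (4 * pi * sigma_x * sigma_y)   (round-beam approximation)

def lumi_relative_to_design(n_bunches_rel, intensity_rel, beam_area_rel):
    """Peak luminosity relative to design, for a relative bunch count, per-bunch intensity
    and transverse beam area (all expressed as fractions of the design values)."""
    return n_bunches_rel * intensity_rel**2 / beam_area_rel

DESIGN_LUMI = 1.0e34   # cm^-2 s^-1, LHC design peak luminosity

# 2016-like situation quoted above: ~27% fewer bunches, same intensity per bunch,
# beam envelope about half the nominal size (taken here as half the transverse area).
ratio = lumi_relative_to_design(n_bunches_rel=0.73, intensity_rel=1.0, beam_area_rel=0.5)
print(f"estimated peak luminosity: {ratio * DESIGN_LUMI:.2e} cm^-2 s^-1 "
      f"({(ratio - 1) * 100:.0f}% above design)")
```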

    Despite the excellent performance of today’s injectors, the beams produced are not sufficient to meet the very demanding proton beam parameters specified by the high-luminosity upgrade of the LHC (HL-LHC).

    Indeed, as of 2025, the HL-LHC aims to accumulate an integrated luminosity of around 250 fb⁻¹ per year, to be compared with the 40 fb⁻¹ achieved in 2016. For heavy-ion operations, the goals are just as challenging: with lead ions the objective is to obtain an integrated luminosity of 10 nb⁻¹ during four runs starting from 2021 (compared to the 2015 achievement of less than 1 nb⁻¹). This has demanded a significant upgrade programme that is now being implemented.

    Immense challenges

    To prepare the CERN accelerator complex for the immense challenges of the HL-LHC, the LHC Injectors Upgrade project (LIU) was launched in 2010. In addition to enabling the necessary proton and ion injector chains to deliver beams of ions and protons required for the HL-LHC, the LIU project must ensure the reliable operation and lifetime of the injectors throughout the HL-LHC era, which is expected to last until around 2035. Hence, the LIU project is also tasked with replacing ageing equipment (such as power supplies, magnets and radio-frequency cavities) and improving radioprotection measures such as shielding and ventilation. [See https://sciencesprings.wordpress.com/2017/09/21/from-cern-next-stop-the-superconducting-magnets-of-the-future/]

    One of the first challenges faced by the LIU team members was to define the beam-performance limitations of all the accelerators in the injector chain and identify the actions needed to overcome them by the required amount. Significant machine and simulation studies were carried out over a period of years, while functional and engineering specifications were prepared to provide clear guidelines to the equipment groups. This was followed by the production of the first hardware prototype devices and their installation in the machines for testing and, where possible, early exploitation.

    Significant progress has already been made concerning the production of ion beams. Thanks to the modifications in Linac 3 and LEIR implemented after 2015 and the intensive machine studies conducted within the LIU programme over the last three years, the excellent performance of the ion injector chain could be further improved in 2016 (figure 1). This enabled the recorded luminosity for the 2016 proton–lead run to exceed the target value by a factor of almost eight. The main remaining challenges for the ion beams will be to more than double the number of bunches in the LHC through complex RF manipulations in the SPS known as “momentum slip stacking”, as well as to guarantee continued and stable performance of the ion injector chain without constant expert monitoring.

    Along the proton injector chain, the higher-intensity beams within a comparatively small beam envelope required by the HL-LHC can only be demonstrated after the installation of all the LIU equipment during Long Shutdown 2 (LS2) in 2019–2020. The main installations feature: a new injection region, a new main power supply and RF system in the PSB; a new injection region and RF system to stabilise the future beams in the PS; an upgraded main RF system; and the shielding of vacuum flanges together with partial coating of the beam chambers in order to stabilise future beams against parasitic electromagnetic interaction and electron clouds in the SPS. Beam instrumentation, protection devices and beam dumps also need to be upgraded in all the machines to match the new beam parameters. The baseline goals of the LIU project to meet the challenging HL-LHC requirements are summarised in the panel (final page of feature).

    Execution phase

    Having defined, designed and endorsed all of the baseline items during the last seven years, the LIU project is presently in its execution phase. New hardware is being produced, installed and tested in the different machines. Civil-engineering work is proceeding for the buildings that will host the new PSB main power supply and the upgraded SPS RF equipment, and to prepare the area in which the new SPS internal beam dump will be located.

    The 86 m-long Linac 4, which will eventually replace Linac 2, is an essential component of the HL-LHC upgrade.

    CERN Linac 4

    The machine, based on newly developed technology, became operational at the end of 2016 following the successful completion of acceleration tests at its nominal energy of 160 MeV. It is presently undergoing an important reliability run that will be instrumental to reach beams with characteristics matching the requirements of the LIU project and to achieve an operational availability higher than 95%, which is an essential level for the first link in the proton injector chain. On 26 October 2016, the first 160 MeV negative hydrogen-ion beam was successfully sent to the injection test stand, which operated until the beginning of April 2017 and demonstrated the correct functioning of this new and critical CERN injection system as well as of the related diagnostics and controls.

    The PSB upgrade has mostly completed the equipment needed for the injection of negative hydrogen ions from Linac 4 into the PSB and is progressing with the 2 GeV energy upgrade of the PSB rings and extraction, with a planned installation date of 2019–2020 during LS2. On the beam-physics side, studies have mainly focused on the deployment of the new wideband RF system, commissioning of beam diagnostics and investigation of space-charge effects. During the 2016–2017 technical stop, the principal LIU-related activities were the removal of a large volume of obsolete cables and the installation of new beam instrumentation (e.g. a prototype transverse size measurement device and turn-by-turn orbit measurement systems). The unused cables, which had been individually identified and labelled beforehand, could be safely removed from the machine to allow cables for the new LIU equipment to be pulled.

    The procurement, construction, installation and testing of upgrade items for the PS is also progressing. Some hardware, such as new corrector magnets and power supplies, a newly developed beam gas-ionisation monitor and new injection vacuum chambers to remove aperture limitations, was already installed during past technical stops. Mitigating anticipated longitudinal beam instabilities in the PS is essential for achieving the LIU baseline beam parameters. This requires that the parasitic electromagnetic interaction of the beam with the multiple RF systems has to be reduced and a new feedback system has to be deployed to keep the beam stable. Beam-dynamics studies will determine the present intensity reach of the PS and identify any remaining needs to comfortably achieve the value required for the HL-LHC. Improved schemes of bunch rotation are also under investigation to better match the beam extracted from the PS to the SPS RF system and thus limit the beam losses at injection energy in the SPS.

    In the SPS, the LIU deployment in the tunnel has begun in earnest, with the re-arrangement and improvement of the extraction kicker system, the start of civil engineering for the new beam-dump system in LSS5 and the shielding of vacuum flanges in 10 half-cells together with the amorphous carbon coating of the adjacent beam chambers (to mitigate against electron-cloud effects). In a notable first, eight dipole and 10 focusing quadrupole magnet chambers were amorphous carbon coated in-situ during the 2016–2017 technical stop, proving the industrialisation of this process (figure 2). The new overground RF building needed to accommodate the power amplifiers of the upgraded main RF system has been completed, while procurement and testing of the solid-state amplifiers has also commenced. The prototyping and engineering for the LIU beam-dump is in progress with the construction and installation of a new SPS beam-dump block, which will be able to cope with the higher beam intensities of the HL-LHC and minimise radiation issues.

    Regarding diagnostics, the development of beam-size measurement devices based on flying wire, gas ionisation and synchrotron radiation, all of which are part of the LIU programme, is already providing meaningful results (figure 3) addressing the challenges of measuring the operating high-intensity and high-brightness beams with high precision. From the machine performance and beam dynamics side, measurements in 2015–2016 made with the very high intensities available from the PS meant that new regimes were probed in terms of electron-cloud instabilities, RF power and losses at injection. More studies are planned in 2017–2018 to clearly identify a path for the mitigation of the injection losses when operating with higher beam currents.

    Looking forward to LS2

    The success of LIU in delivering beams with the desired parameters is the key to achieving the HL-LHC luminosity target. Without the LIU beams, all of the other necessary HL-LHC developments – including high-field triplet magnets [see above], crab cavities and new collimators – would only allow a fraction of the desired luminosity to be delivered to experiments.

    Whenever possible, LIU installation work is taking place during CERN’s regular year-end technical stops. But the great majority of the upgrade requires an extended machine stop and therefore will have to wait until LS2 for implementation. The duration of access to the different accelerators during LS2 is being defined and careful preparation is ongoing to manage the work on site, ensure safety and level the available resources among the different machines in the CERN accelerator complex. After all of the LIU upgrades are in place, beams will be commissioned with the newly installed systems. The LIU goals in terms of beam characteristics are, by definition, uncharted territory. Reaching them will require not only a high level of expertise, but also careful optimisation and extensive beam-physics and machine-development studies in all of CERN’s accelerators.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS and LHCb at the CERN LHC

     
  • richardmitnick 8:03 pm on September 21, 2017
    Tags: CERN LHC, CERN openlab

    From CERN: “CERN openlab tackles ICT challenges of High-Luminosity LHC”

    CERN

    21 Sep 2017
    Harriet Kim Jarlett

    CERN computing centre in 2017 (Image: Robert Hradil, Monika Majer/ProStudio22.ch)

    CERN openlab has published a white paper identifying the major ICT challenges that face CERN and other ‘big science’ projects in the coming years.


    CERN is home to the Large Hadron Collider (LHC), the world’s most powerful particle accelerator. The complexity of the scientific instruments at the laboratory throws up extreme ICT challenges and makes it an ideal environment for carrying out joint R&D projects and testing with industry.

    A continuing programme of upgrades to the LHC and the experiments at CERN will result in hugely increased ICT demands in the coming years. The High-Luminosity LHC, the successor to the LHC, is planned to come online in around 2026.

    By this time, the total computing capacity required by the experiments is expected to be 50-100 times greater than today, with data storage needs expected to be in the order of exabytes.

    CERN openlab works to develop and test the new ICT solutions and techniques that help to make the ground-breaking physics discoveries at CERN possible. It is a unique public-private partnership that provides a framework through which CERN can collaborate with leading ICT companies to accelerate the development of these cutting-edge technologies.

    With a new three-year phase of CERN openlab set to begin at the start of 2018, work has been carried out throughout the first half of 2017 to identify key areas for future collaboration. A series of workshops and discussions was held to examine the ICT challenges that the LHC research community and other ‘big science’ projects will face over the coming years. This white paper is the culmination of these investigations, and sets out specific challenges that are ripe for tackling through collaborative R&D projects with leading ICT companies.

    The white paper identifies 16 ICT ‘challenge areas’, which have been grouped into four overarching ‘R&D topics’ (data-centre technologies and infrastructures, computing performance and software, machine learning and data analytics, applications in other disciplines). Challenges identified include ensuring that data centre architectures are flexible and cost effective; using cloud computing resources in a scalable, hybrid manner; fully modernising code, in order to exploit hardware to its maximum potential; making sure large-scale platforms are in place to enable global scientific collaboration; and successfully translating the huge potential of machine learning into concrete solutions.

    “Tackling these challenges — through a public-private partnership that brings together leading experts from each of these spheres — has the potential to positively impact on a range of scientific and technological fields, as well as wider society,” says Alberto Di Meglio, head of CERN openlab.

    “With the LHC and the experiments set to undergo major upgrade work in 2019 and 2020, CERN openlab’s sixth phase offers a clear opportunity to develop ICT solutions that will already make a tangible difference for researchers when the upgraded LHC and experiments come back online in 2021,” says Maria Girone, CERN openlab CTO.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Meet CERN in a variety of places: Quantum Diaries, CERN Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS and LHCb at the CERN LHC

     
  • richardmitnick 9:09 am on September 13, 2017
    Tags: CERN LHC, Charmonium surprise at LHCb

    From LHCb at CERN: “Charmonium surprise at LHCb” 

    CERN

    13 Sept 2017 [This just appeared in RSS]
    Stefania Pandolfi

    The LHCb cavern (Image: Maximilien Brice/CERN)

    Today, the LHCb experiment at CERN presented a measurement of the masses of two particular particles with a precision that is unprecedented at a hadron collider for this type of particle. Until now, the precise study of these “charmonium” particles, an invaluable source of insights into the subatomic world, required dedicated experiments to be built.

    “Thanks to this result, the LHCb collaboration opens a new avenue to precision measurements of charmonium particles at hadron colliders, that was unexpected by the physics community”, says Giovanni Passaleva, Spokesperson for the LHCb collaboration. Indeed, this kind of measurement seemed impossible until recently.

    The two particles, χc1 and χc2, are excited states of a better-known particle called J/ψ. An excited state is a particle that has a higher internal energy, and hence a higher mass, than the minimum configuration that is allowed. The J/ψ meson and its excited states, also referred to as charmonium, are formed by a charm quark and its antimatter counterpart, a charm antiquark, bound together by the strong nuclear force. The revolutionary observation of the J/ψ in November 1974 triggered rapid changes in high-energy physics at the time, and earned its discoverers the Nobel Prize in physics. Just like ordinary atoms, a meson can be observed in excited states where the two quarks move around each other in different configurations, and, because of Einstein’s famous equivalence of energy and mass, after a tiny amount of time these states can disappear and transform into other particles of lower mass. The LHCb experiment studied, for the first time, the particular transformation of χc1 and χc2 mesons decaying into a J/ψ particle and a pair of muons in order to determine some of their properties very precisely.

    Previous studies of χc1 and χc2 at particle colliders have exploited another type of decay of these particles, featuring a photon in the final state instead of a pair of muons. However, measuring the energy of a photon is experimentally very challenging in the harsh environment of a hadron collider. Owing to the specialised capabilities of the LHCb detector in measuring trajectories and properties of charged particles like muons, and exploiting the large dataset accumulated during the first and second runs of the LHC up to the end of 2016, it was possible to observe the two excited particles with an excellent mass resolution. Exploiting this novel decay with two muons in the final state, the new measurements of χc1 and χc2 masses and natural widths have a similar precision and are in good agreement with those obtained at previous dedicated experiments that were built with a specific experimental approach very different from that in use at colliders.
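    As a toy illustration of how two narrow states are extracted from such a mass spectrum (this is not the LHCb analysis, just a sketch using generated numbers and the PDG χc1 and χc2 masses of roughly 3511 and 3556 MeV), one can fit the candidate-mass distribution with two peaks on top of a smooth background and read off the fitted masses:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Toy "J/psi + mu mu" candidate-mass spectrum: two narrow peaks on a smooth background (MeV).
masses = np.concatenate([
    rng.normal(3511, 6, 4000),        # chi_c1-like peak (6 MeV toy resolution)
    rng.normal(3556, 6, 2500),        # chi_c2-like peak
    rng.uniform(3400, 3700, 20000),   # combinatorial background
])
counts, edges = np.histogram(masses, bins=120, range=(3400, 3700))
centres = 0.5 * (edges[:-1] + edges[1:])

def model(x, n1, mu1, n2, mu2, sigma, bkg):
    peak = lambda n, mu: n * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return peak(n1, mu1) + peak(n2, mu2) + bkg     # a flat background is enough for a toy

p0 = [600, 3510, 400, 3555, 5, 150]                # rough starting values for the fit
popt, _ = curve_fit(model, centres, counts, p0=p0)
print(f"fitted peak positions: {popt[1]:.1f} MeV and {popt[3]:.1f} MeV")
```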

    “Not only are we no longer obliged to resort to purpose-built experiments for such studies,” continues Passaleva, “but also, in the near future, we will be able to think about applying a similar approach for the study of a similar class of particles, known as bottomonium, where charm quarks are replaced with beauty quarks.” These new measurements, along with future updates with larger datasets of collisions accumulated at the LHC, will allow new, stringent tests of the predictions of quantum chromodynamics (QCD), which is the theory that describes the behaviour of the strong nuclear force, contributing to the challenge of fully understanding the elusive features of this fundamental interaction of nature.

    Find out more on the LHCb website.

    The image above shows the data points (black dots) of the reconstructed mass distribution resulting from the combination of the J/ψ and the two muons. The two particle states are the two narrow peaks standing out from the distribution of data. (Image: LHCb collaboration)

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Meet CERN in a variety of places: Quantum Diaries, CERN Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS and LHCb at the CERN LHC

     
  • richardmitnick 9:55 pm on September 5, 2017
    Tags: CERN LHC

    From Symmetry: “What can particles tell us about the cosmos?” 

    Symmetry

    09/05/17
    Amanda Solliday

    The minuscule and the immense can reveal quite a bit about each other.

    In particle physics, scientists study the properties of the smallest bits of matter and how they interact. Another branch of physics—astrophysics—creates and tests theories about what’s happening across our vast universe.

    The current theoretical framework that describes elementary particles and their forces, known as the Standard Model, is based on experiments that started in 1897 with the discovery of the electron. Today, we know that there are six leptons, six quarks, four force carriers and a Higgs boson. Scientists all over the world predicted the existence of these particles and then carried out the experiments that led to their discoveries. Learn all about the who, what, where and when of the discoveries that led to a better understanding of the foundations of our universe.
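    For quick reference, that particle content can be written down in a few lines (a plain summary of well-known Standard Model bookkeeping, not taken from the interactive graphic):

```python
# The Standard Model particle content in one small data structure.
standard_model = {
    "quarks":         ["up", "down", "charm", "strange", "top", "bottom"],
    "leptons":        ["electron", "muon", "tau",
                       "electron neutrino", "muon neutrino", "tau neutrino"],
    "force carriers": ["photon", "gluon", "W boson", "Z boson"],
    "scalar":         ["Higgs boson"],
}

for family, members in standard_model.items():
    print(f"{family:>14}: {len(members):>2}  ({', '.join(members)})")
```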

    While particle physics and astrophysics appear to focus on opposite ends of a spectrum, scientists in the two fields actually depend on one another. Several current lines of inquiry link the very large to the very small.

    The seeds of cosmic structure

    For one, particle physicists and astrophysicists both ask questions about the growth of the early universe.

    In her office at Stanford University, Eva Silverstein explains her work parsing the mathematical details of the fastest period of that growth, called cosmic inflation.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    “To me, the subject is particularly interesting because you can understand the origin of structure in the universe,” says Silverstein, a professor of physics at Stanford and the Kavli Institute for Particle Astrophysics and Cosmology. “This paradigm known as inflation accounts for the origin of structure in the most simple and beautiful way a physicist can imagine.”

    Scientists think that after the Big Bang, the universe cooled, and particles began to combine into hydrogen atoms. This process released previously trapped photons—elementary particles of light.

    The glow from that light, called the cosmic microwave background, lingers in the sky today.

    CMB per ESA/Planck

    Scientists measure different characteristics of the cosmic microwave background to learn more about what happened in those first moments after the Big Bang.

    According to scientists’ models, a pattern that first formed on the subatomic level eventually became the underpinning of the structure of the entire universe. Places that were dense with subatomic particles—or even just virtual fluctuations of subatomic particles—attracted more and more matter. As the universe grew, these areas of density became the locations where galaxies and galaxy clusters formed. The very small grew up to be the very large.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    Dark Matter Particle Explorer, China

    DEAP-3600 dark matter detector, suspended in SNOLAB, deep in Sudbury’s Creighton Mine

    LUX dark matter experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    Dark Energy Survey: Dark Energy Camera [DECam], built at FNAL and housed in the NOAO/CTIO Victor M. Blanco 4 m telescope at Cerro Tololo, Chile, at an altitude of 7200 feet

    FNAL LBNF/DUNE, from FNAL to SURF, Lead, South Dakota, USA: DUNE argon tank and LBNF caverns at Sanford, SURF building in Lead, SD

    “It’s amazing that we can probe what was going on almost 14 billion years ago,” Silverstein says. “We can’t learn everything that was going on, but we can still learn an incredible amount about the contents and interactions.”

    For many scientists, “the urge to trace the history of the universe back to its beginnings is irresistible,” wrote theoretical physicist Stephen Weinberg in his 1977 book The First Three Minutes. The Nobel laureate added, “From the start of modern science in the sixteenth and seventeenth centuries, physicists and astronomers have returned again and again to the problem of the origin of the universe.”

    Searching in the dark

    Particle physicists and astrophysicists both think about dark matter and dark energy. Astrophysicists want to know what made up the early universe and what makes up our universe today. Particle physicists want to know whether there are undiscovered particles and forces out there for the finding.

    “Dark matter makes up most of the matter in the universe, yet no known particles in the Standard Model [of particle physics] have the properties that it should possess,” says Michael Peskin, a professor of theoretical physics at SLAC.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    “Dark matter should be very weakly interacting, heavy or slow-moving, and stable over the lifetime of the universe.”

    There is strong evidence for dark matter through its gravitational effects on ordinary matter in galaxies and clusters. These observations indicate that the universe is made up of roughly 5 percent normal matter, 25 percent dark matter and 70 percent dark energy. But to date, scientists have not directly observed dark energy or dark matter.

    “This is really the biggest embarrassment for particle physics,” Peskin says. “However much atomic matter we see in the universe, there’s five times more dark matter, and we have no idea what it is.”

    But scientists have powerful tools to try to understand some of these unknowns. Over the past several years, the number of models of dark matter has been expanding, along with the number of ways to detect it, says Tom Rizzo, a senior scientist at SLAC and head of the theory group.

    Some experiments search for direct evidence of a dark matter particle colliding with a matter particle in a detector. Others look for indirect evidence of dark matter particles interfering in other processes or hiding in the cosmic microwave background. If dark matter has the right properties, scientists could potentially create it in a particle accelerator such as the Large Hadron Collider.

    CERN LHC: map, tunnel and particle-collision images

    Physicists are also actively hunting for signs of dark energy. It is possible to measure the properties of dark energy by observing the motion of clusters of galaxies at the largest distances that we can see in the universe.

    “Every time that we learn a new technique to observe the universe, we typically get lots of surprises,” says Marcelle Soares-Santos, a Brandeis University professor and a researcher on the Dark Energy Survey. “And we can capitalize on these new ways of observing the universe to learn more about cosmology and other sides of physics.”

    Forces at play

    Particle physicists and astrophysicists find their interests also align in the study of gravity. For particle physicists, gravity is the one basic force of nature that the Standard Model does not quite explain. Astrophysicists want to understand the important role gravity played and continues to play in the formation of the universe.

    In the Standard Model, each force has what’s called a force-carrier particle or a boson. Electromagnetism has photons. The strong force has gluons. The weak force has W and Z bosons. When particles interact through a force, they exchange these force-carriers, transferring small amounts of information called quanta, which scientists describe through quantum mechanics.

    General relativity explains how the gravitational force works on large scales: Earth pulls on our own bodies, and planetary objects pull on each other. But it is not understood how gravity is transmitted by quantum particles.

    Discovering a subatomic force-carrier particle for gravity would help explain how gravity works on small scales and inform a quantum theory of gravity that would connect general relativity and quantum mechanics.

    Compared to the other fundamental forces, gravity interacts with matter very weakly, but the strength of the interaction quickly becomes larger with higher energies. Theorists predict that at high enough energies, such as those seen in the early universe, quantum gravity effects are as strong as the other forces. Gravity played an essential role in transferring the small-scale pattern of the cosmic microwave background into the large-scale pattern of our universe today.
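    To attach a number to “high enough energies”: the conventional benchmark is the Planck energy, E_P = √(ħc⁵/G), the scale at which gravity’s effective strength is expected to become comparable to the other forces. The quick estimate below is a standard textbook calculation, not something taken from the article; the 13 TeV figure is the LHC collision energy quoted elsewhere on this page.

```python
from scipy.constants import hbar, c, G, e

# Planck energy: the scale where quantum-gravity effects are expected to become strong.
E_planck_J   = (hbar * c**5 / G) ** 0.5   # joules
E_planck_GeV = E_planck_J / (e * 1e9)     # convert joules -> GeV (1 eV = e joules)

lhc_GeV = 13_000                          # 13 TeV LHC collision energy, for comparison
print(f"Planck energy ~ {E_planck_GeV:.2e} GeV")
print(f"LHC / Planck  ~ {lhc_GeV / E_planck_GeV:.1e}")   # roughly 15 orders of magnitude short
```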

    “Another way that these effects can become important for gravity is if there’s some process that lasts a long time,” Silverstein says. “Even if the energies aren’t as high as they would need to be sensitive to effects like quantum gravity instantaneously.”

    Physicists are modeling gravity over lengthy time scales in an effort to reveal these effects.

    Our understanding of gravity is also key in the search for dark matter. Some scientists think that dark matter does not actually exist; they say the evidence we’ve found so far is actually just a sign that we don’t fully understand the force of gravity.

    Big ideas, tiny details

    Learning more about gravity could tell us about the dark universe, which could also reveal new insight into how structure in the universe first formed.

    Scientists are trying to “close the loop” between particle physics and the early universe, Peskin says. As scientists probe space and go back further in time, they can learn more about the rules that govern physics at high energies, which also tells us something about the smallest components of our world.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 4:03 pm on August 24, 2017
    Tags: CERN LHC, SURF LBNF/DUNE

    From Symmetry: “Mega-collaborations for scientific discovery” 

    Symmetry

    08/24/17
    Leah Poffenberger

    DUNE joins the elite club of physics collaborations with more than 1000 members. Photo by Reidar Hahn, Fermilab.

    Sometimes it takes a lot of people working together to make discovery possible. More than 7000 scientists, engineers and technicians worked on designing and constructing the Large Hadron Collider at CERN, and thousands of scientists now run each of the LHC’s four major experiments.

    CERN LHC: map, tunnel and particle-collision images

    Not many experiments garner such numbers. On August 15, the Deep Underground Neutrino Experiment (DUNE) became the latest member of the exclusive clique of particle physics experiments with more than a thousand collaborators.

    Meet them all:


    4,000+: Compact Muon Solenoid Detector (CMS) Experiment

    CMS is one of the two largest experiments at the LHC. It is best known for its role in the discovery of the Higgs boson.

    The “C” in CMS stands for compact, but there’s nothing compact about the CMS collaboration. It is one of the largest scientific collaborations in history. More than 4000 people from 200 institutions around the world work on the CMS detector and use its data for research.

    About 30 percent of the CMS collaboration hail from US institutions*. A remote operations center at the Department of Energy’s Fermi National Accelerator Laboratory in Batavia, Illinois, serves as a base for CMS research in the United States.


    3,000+: A Toroidal LHC ApparatuS (ATLAS) Experiment

    The ATLAS experiment, the other large experiment responsible for discovering the Higgs boson at the LHC, ranks number two in number of collaborators. The ATLAS collaboration has more than 3000 members from 182 institutions in 38 countries. ATLAS and CMS ask similar questions about the building blocks of the universe, but they look for the answers with different detector designs.

    About 30 percent of the ATLAS collaboration are from institutions in the United States*. Brookhaven National Laboratory in Upton, New York, serves as the US host.

    2,000+: Linear Collider Collaboration

    Proposed LC Linear Collider schematic. Location not yet decided.

    The Linear Collider Collaboration (LCC) is different from CMS and ATLAS in that the collaboration’s experiment is still a proposed project and has not yet been built. LCC has around 2000 members who are working to develop and build a particle collider that can produce different kinds of collisions than those seen at the LHC.

    LCC members are working on two potential linear collider projects: the compact linear collider study (CLIC) at CERN and the International Linear Collider (ILC) in Japan. CLIC and the ILC originally began as separate projects, but the scientists working on both joined forces in 2013.

    Either CLIC or the ILC would complement the LHC by colliding electrons and positrons to explore the Higgs particle interactions and the nature of subatomic forces in greater detail.

    1,500+: A Large Ion Collider Experiment (ALICE)


    ALICE is part of LHC’s family of particle detectors, and, like ATLAS and CMS, it too has a large, international collaboration, counting 1500 members from 154 physics institutes in 37 countries. Research using ALICE is focused on quarks, the sub-atomic particles that make up protons and neutrons, and the strong force responsible for holding quarks together.

    1,000+: Deep Underground Neutrino Experiment (DUNE)

    The Deep Underground Neutrino Experiment is the newest member of the club. This month, the DUNE collaboration surpassed 1000 collaborators from 30 countries.

    From its place a mile beneath the earth at the Sanford Underground Research Facility in South Dakota, DUNE will investigate the behavior of neutrinos, which are invisible, nearly massless particles that rarely interact with other matter. The neutrinos will come from Fermilab, 800 miles away.

    Neutrino research could help scientists answer the question of why there is an imbalance between matter and antimatter in the universe. Groundbreaking for DUNE occurred on July 21, and the experiment will start taking data in around 2025.

    FNAL LBNF/DUNE, from FNAL to SURF, Lead, South Dakota, USA: DUNE argon tank and LBNF caverns at Sanford, SURF building in Lead, SD

    Honorable mentions

    A few notable collaborations have made it close to 1000 but didn’t quite make the list. LHCb, the fourth major detector at LHC, boasts a collaboration 800 strong.

    CERN/LHCb

    Over 700 collaborators work on the Belle II experiment at KEK in Japan, which will begin taking data in 2018, studying the properties of B mesons, particles that contain a bottom quark.

    Belle II super-B factory experiment takes shape at KEK

    The 600-member SLAC/Babar collaboration at SLAC National Accelerator Laboratory also studies B mesons.

    SLAC/Babar

    STAR, a detector at Brookhaven National Laboratory that probes the conditions of the early universe, has more than 600 collaborators from 55 institutions.

    BNL/RHIC Star Detector

    The CDF and DZero collaborations at Fermilab, best known for their co-discovery of the top quark in 1995, had about 700 collaborators at their peak.

    FNAL/Tevatron DZero detector

    FNAL/Tevatron CDF detector

    *Among the reasons why I started this blog was that this level of U.S. involvement was invisible in our highly vaunted press. CERN had taken over HEP from FNAL. Our idiot Congress in 1993 had killed off the Superconducting Super Collider. So it looked like we had given up. But BNL had 600 people on ATLAS. FNAL had 1000 people on CMS. So we were far from dead in HEP, just invisible. So, I had a story to tell. Today I have 1000 readers. Not too shabby for Basic and Applied Science.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:00 pm on July 3, 2017
    Tags: CERN LHC, Joe Incandela, What comes next?

    From Symmetry: “When was the Higgs actually discovered?” 

    Symmetry

    07/03/17
    Sarah Charley

    The announcement on July 4 was just one part of the story. Take a peek behind the scenes of the discovery of the Higgs boson.

    Maximilien Brice, Laurent Egli, CERN

    Joe Incandela UCSB and Cern CMS

    Joe Incandela sat in a conference room at CERN and watched with his arms folded as his colleagues presented the latest results on the hunt for the Higgs boson. It was December 2011, and they had begun to see the very thing they were looking for—an unexplained bump emerging from the data.

    “I was far from convinced,” says Incandela, a professor at the University of California, Santa Barbara and the former spokesperson of the CMS experiment at the Large Hadron Collider.

    CERN CMS Higgs Event

    CERN/CMS Detector

    CERN LHC: map, tunnel and particle-collision images

    For decades, scientists had searched for the elusive Higgs boson: the holy grail of modern physics and the only piece of the robust and time-tested Standard Model that had yet to be found.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The construction of the LHC was motivated in large part by the absence of this fundamental component from our picture of the universe. Without it, physicists couldn’t explain the origin of mass or the divergent strengths of the fundamental forces.

    “Without the Higgs boson, the Standard Model falls apart,” says Matthew McCullough, a theorist at CERN. “The Standard Model was fitting the experimental data so well that most of the theory community was convinced that something playing the role of Higgs boson would be discovered by the LHC.”

    The Standard Model predicted the existence of the Higgs but did not predict what the particle’s mass would be. Over the years, scientists had searched for it across a wide range of possible masses. By 2011, there was only a tiny region left to search; everything else had been excluded by previous generations of experimentation.

    Research at FNAL’s Tevatron had already ruled out many of the possible mass values that could have been the home of the Higgs.

    FNAL Tevatron

    FNAL/Tevatron map


    FNAL/Tevatron DZero detector


    FNAL/Tevatron CDF detector

    If the predicted Higgs boson were anywhere, it had to be there, right where the LHC scientists were looking.

    But Incandela says he was skeptical about these preliminary results. He knew that the Higgs could manifest itself in many different forms, and this particular channel was extremely delicate.

    “A tiny mistake or an unfortunate distribution of the background events could make it look like a new particle is emerging from the data when in reality, it’s nothing,” Incandela says.

    A common mantra in science is that extraordinary claims require extraordinary evidence. The challenge isn’t just collecting the data and performing the analysis; it’s deciding if every part of the analysis is trustworthy. If the analysis is bulletproof, the next question is whether the evidence is substantial enough to claim a discovery. And if a discovery can be claimed, the final question is what, exactly, has been discovered? Scientists can have complete confidence in their results but remain uncertain about how to interpret them.

    In physics, it’s easy to say what something is not but nearly impossible to say what it is. A single piece of corroborated, contradictory evidence can discredit an entire theory and destroy an organization’s credibility.

    “We’ll never be able to definitively say if something is exactly what we think it is, because there’s always something we don’t know and cannot test or measure,” Incandela says. “There could always be a very subtle new property or characteristic found in a high-precision experiment that revolutionizes our understanding.”

    With all of that in mind, Incandela and his team made a decision: From that point on, everyone would refine their scientific analyses using special data samples and a patch of fake data generated by computer simulations covering the interesting areas of their analyses. Then, when they were sure about their methodology and had enough data to make a significant observation, they would remove the patch and use their algorithms on all the real data in a process called unblinding.
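    A minimal sketch of that blinding idea (purely illustrative, not the actual CMS procedure or code): the analysis is developed on data with the interesting mass window masked, optionally with simulated events spliced in as a stand-in, and the real events in the window are only looked at once the methodology is frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "candidate mass" dataset and a signal window we choose to keep blinded (GeV).
data = rng.uniform(100, 160, 50_000)        # stand-in for real collision data
blind_window = (120.0, 130.0)               # region where a Higgs-like bump might appear

def blind(events, window, replacement=None):
    """Drop events inside the blinded window; optionally splice in simulated events instead."""
    lo, hi = window
    kept = events[(events < lo) | (events > hi)]
    if replacement is not None:             # a "patch" of fake, simulated data
        kept = np.concatenate([kept, replacement])
    return kept

fake_patch = rng.uniform(*blind_window, 8_000)                 # simulated stand-in events
development_sample = blind(data, blind_window, fake_patch)     # used while tuning the analysis

# ... develop, validate and freeze the analysis on `development_sample` ...
unblinded_sample = data                      # the full real dataset, used only after unblinding
print(len(development_sample), len(unblinded_sample))
```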

    “This is a nice way of providing an unbiased view of the data and helps us build confidence in any unexpected signals that may be appearing, particularly if the same unexpected signal is seen in different types of analyses,” Incandela says.

    A few weeks before July 4, all the different analysis groups met with Incandela to present a first look at their unblinded results. This time the bump was very significant and showing up at the same mass in two independent channels.

    “At that point, I knew we had something,” Incandela says. “That afternoon we presented the results to the rest of the collaboration. The next few weeks were among the most intense I have ever experienced.”

    Meanwhile, the other general-purpose experiment at the LHC, ATLAS, was hot on the trail of the same mysterious bump.

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    Andrew Hard was a graduate student at The University of Wisconsin, Madison working on the ATLAS Higgs analysis with his PhD thesis advisor Sau Lan Wu.

    “Originally, my plan had been to return home to Tennessee and visit my parents over the winter holidays,” Hard says. “Instead, I came to CERN every day for five months—even on Christmas. There were a few days when I didn’t see anyone else at CERN. One time I thought some colleagues had come into the office, but it turned out to be two stray cats fighting in the corridor.”

    Hard was responsible for writing the code that selected and calibrated the particles of light the ATLAS detector recorded during the LHC’s high-energy collisions. According to predictions from the Standard Model, the Higgs can transform into two of these particles when it decays, so scientists on both experiments knew that this project would be key to the discovery process.

    “We all worked harder than we thought we could,” Hard says. “People collaborated well and everyone was excited about what would come next. All in all, it was the most exciting time in my career. I think the best qualities of the community came out during the discovery.”

    At the end of June, Hard and his colleagues synthesized all of their work into a single analysis to see what it revealed. And there it was again—that same bump, this time surpassing the statistical threshold the particle physics community generally requires to claim a discovery.
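    The threshold referred to here is the particle-physics convention of “five sigma”: the chance of the background alone fluctuating up to the observed excess must be vanishingly small. Converting a significance in standard deviations into that one-sided probability is a standard statistical step, sketched below for illustration:

```python
from scipy.stats import norm

# One-sided p-value corresponding to a significance of n standard deviations (sigma).
for n_sigma in (3, 5):
    p = norm.sf(n_sigma)      # survival function: P(Z > n_sigma) for a standard normal
    print(f"{n_sigma} sigma -> p = {p:.2e} (about 1 in {1/p:,.0f})")

# 3 sigma is conventionally called "evidence"; 5 sigma (p of roughly 3e-7)
# is the usual bar for claiming a "discovery".
```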

    “Soon everyone in the group started running into the office to see the number for the first time,” Hard says. “The Wisconsin group took a bunch of photos with the discovery plot.”

    Hard had no idea whether CMS scientists were looking at the same thing. At this point, the experiments were keeping their latest results secret—with the exception of Incandela, Fabiola Gianotti (then ATLAS spokesperson) and a handful of CERN’s senior management, who regularly met to discuss their progress and results.

    Fabiola Gianotti, then the ATLAS spokesperson, now the General Director of CERN

    “I told the collaboration that the most important thing was for each experiment to work independently and not worry about what the other experiment was seeing,” Incandela says. “I did not tell anyone what I knew about ATLAS. It was not relevant to the tasks at hand.”

    Still, rumors were circulating around theoretical physics groups both at CERN and abroad. McCullough, then a postdoc at the Massachusetts Institute of Technology, was avidly following the progress of the two experiments.

    “We had an update in December 2011 and then another one a few months later in March, so we knew that both experiments were seeing something,” he says. “When this big excess showed up in July 2012, we were all convinced that it was the guy responsible for curing the ails of the Standard Model, but not necessarily precisely that guy predicted by the Standard Model. It could have properties mostly consistent with the Higgs boson but still be not absolutely identical.”

    The week before announcing what they’d found, Hard’s analysis group had daily meetings to discuss their results. He says they were excited but also nervous and stressed: Extraordinary claims require extraordinary confidence.

    “One of our meetings lasted over 10 hours, not including the dinner break halfway through,” Hard says. “I remember getting in a heated exchange with a colleague who accused me of having a bug in my code.”

    After both groups had independently and intensely scrutinized their Higgs-like bump through a series of checks, cross-checks and internal reviews, Incandela and Gianotti decided it was time to tell the world.

    “Some people asked me if I was sure we should say something,” Incandela says. “I remember saying that this train has left the station. This is what we’ve been working for, and we need to stand behind our results.”

    On July 4, 2012, Incandela and Gianotti stood before an expectant crowd and, one at a time, announced that decades of searching and generations of experiments had finally culminated in the discovery of a particle “compatible with the Higgs boson.”

    Science journalists rejoiced and rushed to publish their stories. But was this new particle the long-awaited Higgs boson? Or not?

    Discoveries in science rarely happen all at once; rather, they build slowly over time. And even when the evidence overwhelmingly points in a clear direction, scientists will rarely speak with superlatives or make definitive claims.

    “There is always a risk of overlooking the details,” Incandela says, “and major revolutions in science are often born in the details.”

    Immediately after the July 4 announcement, theorists from around the world issued a flurry of theoretical papers presenting alternative explanations and possible tests to see if this excess really was the Higgs boson predicted by the Standard Model or just something similar.

    “A lot of theory papers explored exotic ideas,” McCullough says. “It’s all part of the exercise. These papers act as a straw man so that we can see just how well we understand the particle and what additional tests need to be run.”

    For the next several months, scientists continued to examine the particle and its properties. The more data they collected and the more tests they ran, the more the discovery looked like the long-awaited Higgs boson. By March, both experiments had twice as much data and twice as much evidence.

    “Amongst ourselves, we called it the Higgs,” Incandela says, “but to the public, we were more careful.”

    It was increasingly difficult to keep qualifying their statements about it, though. “It was just getting too complicated,” Incandela says. “We didn’t want to always be in this position where we had to talk about this particle like we didn’t know what it was.”

    On March 14, 2013—nine months and 10 days after the original announcement—CERN issued a press release quoting Incandela as saying, “to me, it is clear that we are dealing with a Higgs boson, though we still have a long way to go to know what kind of Higgs boson it is.”​

    To this day, scientists are open to the possibility that the Higgs they found is not exactly the Higgs they expected.

    “We are definitely, 100 percent sure that this is a Standard-Model-like Higgs boson,” Incandela says. “But we’re hoping that there’s a chink in that armor somewhere. The Higgs is a sign post, and we’re hoping for a slight discrepancy which will point us in the direction of new physics.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 4:04 pm on June 30, 2017
    Tags: CERN LHC, Gluons, What really happens?

    From Symmetry: “What’s really happening during an LHC collision?” 

    Symmetry

    06/30/17
    Sarah Charley

    It’s less of a collision and more of a symphony.

    Wow!! ATLAS collaboration.

    The Large Hadron Collider is definitely large.

    CERN LHC: map, tunnel and particle-collision images

    With a 17-mile circumference, it is the biggest collider on the planet. But the latter part of its name is a little misleading. That’s because what collides in the LHC are the tiny pieces inside the hadrons, not the hadrons themselves.

    Hadrons are composite particles made up of quarks and gluons.

    The quark structure of the proton 16 March 2006 Arpad Horvath

    The gluons carry the strong force, which enables the quarks to stick together and binds them into a single particle.


    The main fodder for the LHC are hadrons called protons. Protons are made up of three quarks and an indefinable number of gluons. (Protons in turn make up atoms, which are the building blocks of everything around us.)

    If a proton were enlarged to the size of a basketball, it would look empty. Just like atoms, protons are mostly empty space. The individual quarks and gluons inside are known to be extremely small, less than 1/10,000th the size of the entire proton.

    “The inside of a proton would look like the atmosphere around you,” says Richard Ruiz, a theorist at Durham University. “It’s a mixture of empty space and microscopic particles that, for all intents and purposes, have no physical volume.

    “But if you put those particles inside a balloon, you’ll see the balloon expand. Even though the internal particles are microscopic, they interact with each other and exert a force on their surroundings, inevitably producing something which does have an observable volume.”

    So how do you collide two objects that are effectively empty space? You can’t. But luckily, you don’t need a classical collision to unleash a particle’s full potential.

    In particle physics, the term “collide” can mean that two protons glide through each other, and their fundamental components pass so close together that they can talk to each other. If their voices are loud enough and resonate in just the right way, they can pluck deep hidden fields that will sing their own tune in response—by producing new particles.

    “It’s a lot like music,” Ruiz says. “The entire universe is a symphony of complex harmonies which call and respond to each other. We can easily produce the mid-range tones, which would be like photons and muons, but some of these notes are so high that they require a huge amount of energy and very precise conditions to resonate.”

    Space is permeated with dormant fields that can briefly pop a particle into existence when vibrated with the right amount of energy. These fields play important roles but almost always work behind the scenes. The Higgs field, for instance, is always interacting with other particles to help them gain mass. But a Higgs particle will only appear if the field is plucked with the right resonance.

    When protons meet during an LHC collision, they break apart and the quarks and gluons come spilling out. They interact and pull more quarks and gluons out of space, eventually forming a shower of fast-moving hadrons.

    This subatomic symbiosis is facilitated by the LHC and recorded by the experiment, but it’s not restricted to the laboratory environment; particles are also accelerated by cosmic sources such as supernova remnants. “This happens everywhere in the universe,” Ruiz says. “The LHC and its experiments are not special in that sense. They’re more like a big concert hall that provides the energy to pop open and record the symphony inside each proton.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:12 pm on June 29, 2017 Permalink | Reply
    Tags: , , , , CERN LHC, , , HPSS -High Performance Storage System, , , , RACF - Resource Access Control Facility, Scientific Data and Computing Center   

    From BNL: “Brookhaven Lab’s Scientific Data and Computing Center Reaches 100 Petabytes of Recorded Data” 

    Brookhaven Lab

    Ariana Tantillo
    atantillo@bnl.gov

    Total reflects 17 years of experimental physics data collected by scientists to understand the fundamental nature of matter and the basic forces that shape our universe.

    1
    (Back row) Ognian Novakov, Christopher Pinkenburg, Jérôme Lauret, Eric Lançon, (front row) Tim Chou, David Yu, Guangwei Che, and Shigeki Misawa at Brookhaven Lab’s Scientific Data and Computing Center, which houses the Oracle StorageTek tape storage system where experimental data are recorded.

    Imagine storing approximately 1300 years’ worth of HDTV video, nearly six million movies, or the entire written works of humankind in all languages since the start of recorded history—twice over. Each of these quantities is equivalent to 100 petabytes of data: the amount of data now recorded by the Relativistic Heavy Ion Collider (RHIC) and ATLAS Computing Facility (RACF) Mass Storage Service, part of the Scientific Data and Computing Center (SDCC) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. One petabyte is defined as 1024⁵ bytes, or 1,125,899,906,842,624 bytes, of data.
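
    As a rough sanity check of those comparisons, the arithmetic works out if one assumes something like 9 GiB per hour of HDTV video and 17 GiB per movie; these per-hour and per-movie sizes are illustrative assumptions, not figures from the article.

    ```python
    # Back-of-the-envelope check of the 100-petabyte comparisons above.
    PB = 1024**5                 # one petabyte in bytes
    GiB = 1024**3
    total = 100 * PB             # 100 PB of recorded data

    hd_hour = 9 * GiB            # assumed size of one hour of HDTV video
    movie = 17 * GiB             # assumed size of one HD movie

    years = total / hd_hour / (24 * 365)
    movies = total / movie

    print(f"1 PB = {PB:,} bytes")
    print(f"~{years:,.0f} years of HDTV video, ~{movies / 1e6:.1f} million movies")
    ```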

    “This is a major milestone for SDCC, as it reflects nearly two decades of scientific research for the RHIC nuclear physics and ATLAS particle physics experiments, including the contributions of thousands of scientists and engineers,” said Brookhaven Lab technology architect David Yu, who leads the SDCC’s Mass Storage Group.

    SDCC is at the core of a global computing network connecting more than 2,500 researchers around the world with data from the STAR and PHENIX experiments at RHIC—a DOE Office of Science User Facility at Brookhaven—and the ATLAS experiment at the Large Hadron Collider (LHC) in Europe.

    BNL/RHIC Star Detector

    BNL/RHIC PHENIX

    CERN/ATLAS detector

    In these particle collision experiments, scientists recreate conditions that existed just after the Big Bang, with the goal of understanding the fundamental forces of nature—gravitational, electromagnetic, strong nuclear, and weak nuclear—and the basic structure of matter, energy, space, and time.

    Big Data Revolution

    The RHIC and ATLAS experiments are part of the big data revolution.

    BNL RHIC Campus


    BNL/RHIC

    These experiments involve collecting extremely large datasets that reduce statistical uncertainty to make high-precision measurements and search for extremely rare processes and particles.

    For example, only one Higgs boson—an elementary particle associated with the field that is thought to give other elementary particles their mass—is produced for every billion proton-proton collisions at the LHC.

    CERN CMS Higgs Event


    CERN/CMS Detector

    CERN ATLAS Higgs Event

    Moreover, once produced, the Higgs boson almost immediately decays into other particles. So detecting the particle is a rare event, with around one trillion collisions required to detect a single instance. When scientists first discovered the Higgs boson at the LHC in 2012, they observed about 20 instances, recording and analyzing more than 300 trillion collisions to confirm the particle’s discovery.
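
    A hedged back-of-the-envelope calculation shows how those numbers fit together. The one-in-a-billion production rate and the roughly 20 observed instances come from the article; the branching fraction and selection efficiency below are outside assumptions chosen only to illustrate the scale.

    ```python
    # Why ~300 trillion collisions yield only a handful of clean Higgs events.
    collisions = 3e14            # more than 300 trillion analyzed collisions
    prod_rate = 1e-9             # ~1 Higgs per billion proton-proton collisions
    higgs_produced = collisions * prod_rate        # ~300,000 Higgs bosons

    br_four_lepton = 1.25e-4     # assumed H -> ZZ* -> 4 leptons branching fraction
    efficiency = 0.5             # assumed trigger and selection efficiency

    clean_candidates = higgs_produced * br_four_lepton * efficiency
    print(f"Higgs bosons produced: ~{higgs_produced:,.0f}")
    print(f"Cleanly reconstructed candidates: ~{clean_candidates:.0f}")
    ```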

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    At the end of 2016, the ATLAS collaboration released its first measurement of the mass of the W boson particle (another elementary particle that, together with the Z boson, is responsible for the weak nuclear force). This measurement, which is based on a sample of 15 million W boson candidates collected at LHC in 2011, has a relative precision of 240 parts per million (ppm)—a result that matches the best single-experiment measurement announced in 2007 by the Collider Detector at Fermilab collaboration, whose measurement is based on several years’ worth of collected data. A highly precise measurement is important because a deviation from the mass predicted by the Standard Model could point to new physics. More data samples are required to achieve the level of accuracy (80 ppm) that scientists need to significantly test this model.
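
    To put those relative precisions in absolute terms, take a W boson mass of roughly 80.4 GeV as an assumed reference value (the article itself does not quote the mass):

    ```python
    # Convert the quoted relative precisions into absolute uncertainties.
    m_w_gev = 80.4               # assumed W boson mass, for illustration only
    for precision_ppm in (240, 80):
        uncertainty_mev = m_w_gev * precision_ppm * 1e-6 * 1000  # GeV -> MeV
        print(f"{precision_ppm} ppm of ~{m_w_gev} GeV is about {uncertainty_mev:.0f} MeV")
    ```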

    The volume of data collected by these experiments will grow significantly in the near future as new accelerator programs deliver higher-intensity beams. The LHC will be upgraded to increase its luminosity (rate of collisions) by a factor of 10. This High-Luminosity LHC, which should be operational by 2025, will provide a unique opportunity for particle physicists to look for new and unexpected phenomena within the exabytes (one exabyte equals 1000 petabytes) of data that will be collected.

    Data archiving is the first step in making available the results from such experiments. Thousands of physicists then need to calibrate and analyze the archived data and compare the data to simulations. To this end, computational scientists, computer scientists, and mathematicians in Brookhaven Lab’s Computational Science Initiative, which encompasses SDCC, are developing programming tools, numerical models, and data-mining algorithms. Part of SDCC’s mission is to provide computing and networking resources in support of these activities.

    A Data Storage, Computing, and Networking Infrastructure

    Housed inside SDCC are more than 60,000 computing cores, 250 computer racks, and tape libraries capable of holding up to 90,000 magnetic storage tape cartridges that are used to store, process, analyze, and distribute the experimental data. The facility provides approximately 90 percent of the computing capacity for analyzing data from the STAR and PHENIX experiments, and serves as the largest of the 12 Tier 1 computing centers worldwide that support the ATLAS experiment. As a Tier 1 center, SDCC contributes nearly 23 percent of the total computing and storage capacity for the ATLAS experiment and delivers approximately 200 terabytes of data (picture 62 million photos) per day to more than 100 data centers globally.

    At SDCC, the High Performance Storage System (HPSS) has been providing mass storage services to the RHIC and LHC experiments since 1997 and 2006, respectively. This data archiving and retrieval software, developed by IBM and several DOE national laboratories, manages petabytes of data on disk and in robot-controlled tape libraries. Contained within the libraries are magnetic tape cartridges that encode the data and tape drives that read and write the data. Robotic arms load the cartridges into the drives and unload them upon request.

    3
    Inside one of the automated tape libraries at the Scientific Data and Computing Center (SDCC), Eric Lançon, director of SDCC, holds a magnetic tape cartridge. When scientists need data, a robotic arm (the piece of equipment in front of Lançon) retrieves the relevant cartridges from their slots and loads them into drives in the back of the library.

    When ranked by the volume of data stored in a single HPSS, Brookhaven’s system is the second largest in the nation and the fourth largest in the world. Currently, the RACF operates nine Oracle robotic tape libraries that constitute the largest Oracle tape storage system in the New York tri-state area. Contained within this system are nearly 70,000 active cartridges with capacities ranging from 800 gigabytes to 8.5 terabytes, and more than 100 tape drives. As the volume of scientific data to be stored increases, more libraries, tapes, and drives can be added accordingly. In 2006, this scalability was exercised when HPSS was expanded to accommodate data from the ATLAS experiment at LHC.
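
    Since the article gives only the range of per-cartridge capacities and not the mix of cartridge generations, the total capacity of the tape system can only be bounded roughly:

    ```python
    # Rough capacity bounds for ~70,000 active cartridges.
    # The actual mix of cartridge generations is not given in the article.
    cartridges = 70_000
    low_tb, high_tb = 0.8, 8.5   # per-cartridge capacity in terabytes

    lower_pb = cartridges * low_tb / 1000
    upper_pb = cartridges * high_tb / 1000
    print(f"Total tape capacity: between ~{lower_pb:.0f} PB and ~{upper_pb:.0f} PB")
    ```

    The 100 petabytes recorded so far sits inside that range, consistent with a mix of older and newer cartridges.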

    “The HPSS system was deployed in the late 1990s, when the RHIC accelerator was coming on line. It allowed data from RHIC experiments to be transmitted via network to the data center for storage—a relatively new idea at the time,” said Shigeki Misawa, manager of Mass Storage and General Services at Brookhaven Lab. Misawa played a key role in the initial evaluation and configuration of HPSS, and has guided the system through significant changes in hardware (network equipment, storage systems, and servers) and operational requirements (tape drive read/write rate, magnetic tape cartridge capacity, and data transfer speed). “Prior to this system, data was recorded on magnetic tape at the experiment and physically moved to the data center,” he continued.

    Over the years, SDCC’s HPSS has been augmented with a suite of optimization and monitoring tools developed at Brookhaven Lab. One of these tools is David Yu’s scheduling software that optimizes the retrieval of massive amounts of data from tape storage. Another, developed by Jérôme Lauret, software and computing project leader for the STAR experiment, is software for organizing multiple user requests to retrieve data more efficiently.
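
    The article does not describe how these tools work internally, but the general idea behind tape-aware request scheduling can be sketched simply: group pending file requests by the cartridge they live on, mount each cartridge once, and read its files in on-tape order to minimize seeks. The sketch below is illustrative only, with hypothetical names; it is not the actual Brookhaven or STAR software.

    ```python
    # Minimal, illustrative sketch of tape-aware request batching.
    from collections import defaultdict

    def schedule(requests):
        """requests: iterable of (tape_id, position_on_tape, file_name) tuples."""
        by_tape = defaultdict(list)
        for tape_id, position, file_name in requests:
            by_tape[tape_id].append((position, file_name))

        plan = []
        for tape_id, files in by_tape.items():   # one mount per cartridge
            files.sort()                         # read files in on-tape order
            plan.append((tape_id, [name for _, name in files]))
        return plan

    if __name__ == "__main__":
        reqs = [("T01", 42, "run7.root"), ("T02", 5, "run1.root"),
                ("T01", 3, "run2.root"), ("T02", 99, "run9.root")]
        for tape, files in schedule(reqs):
            print(tape, files)
    ```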

    Engineers in the Mass Storage Group—including Tim Chou, Guangwei Che, and Ognian Novakov—have created other software tools customized for Brookhaven Lab’s computing environment to enhance data management and operation abilities and to improve the effectiveness of equipment usage.

    STAR experiment scientists have demonstrated the capabilities of SDCC’s enhanced HPSS, retrieving more than 4,000 files per hour (a rate of 6,000 gigabytes per hour) while using a third of HPSS resources. On the data archiving side, HPSS can store data in excess of five gigabytes per second.

    As demand for mass data storage spreads across Brookhaven, access to HPSS is being extended to other research groups. In the future, SDCC is expected to provide centralized mass storage services to multi-experiment facilities, such as the Center for Functional Nanomaterials and the National Synchrotron Light Source II—two more DOE Office of Science User Facilities at Brookhaven.

    “The tape library system of SDCC is a clear asset for Brookhaven’s current and upcoming big data science programs,” said SDCC Director Eric Lançon. “Our expertise in the field of data archiving is acknowledged worldwide.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 6:04 pm on June 20, 2017 Permalink | Reply
    Tags: , CERN LHC, Cyber security,   

    From SA: “World’s Most Powerful Particle Collider Taps AI to Expose Hack Attacks” 

    Scientific American

    Scientific American

    June 19, 2017
    Jesse Emspak

    1
    A general view of the CERN Computer / Data Center and server farm. Credit: Dean Mouhtaropoulos Getty Images

    Thousands of scientists worldwide tap into CERN’s computer networks each day in their quest to better understand the fundamental structure of the universe. Unfortunately, they are not the only ones who want a piece of this vast pool of computing power, which serves the world’s largest particle physics laboratory. The hundreds of thousands of computers in CERN’s grid are also a prime target for hackers who want to hijack those resources to make money or attack other computer systems. But rather than engaging in a perpetual game of hide-and-seek with these cyber intruders via conventional security systems, CERN scientists are turning to artificial intelligence to help them outsmart their online opponents.

    Current detection systems typically spot attacks on networks by scanning incoming data for known viruses and other types of malicious code. But these systems are relatively useless against new and unfamiliar threats. Given how quickly malware changes these days, CERN is developing new systems that use machine learning to recognize and report abnormal network traffic to an administrator. For example, a system might learn to flag traffic that requires an uncharacteristically large amount of bandwidth, uses the incorrect procedure when it tries to enter the network (much like using the wrong secret knock on a door) or seeks network access via an unauthorized port (essentially trying to get in through a door that is off-limits).
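
    The article does not say which algorithms CERN has chosen. Purely as an illustration, an unsupervised anomaly detector trained on simple flow features (bytes transferred, destination port, connection duration) might be set up like this, using scikit-learn’s IsolationForest as one common off-the-shelf choice:

    ```python
    # Illustrative anomaly detection on synthetic network-flow features.
    # Not CERN's actual system; IsolationForest is just one example algorithm.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Synthetic "normal" traffic: modest transfers to a few expected ports.
    normal_flows = np.column_stack([
        rng.normal(5e6, 1e6, 1000),            # bytes transferred
        rng.choice([443, 1094, 8443], 1000),   # expected destination ports
        rng.normal(30, 10, 1000),              # connection duration in seconds
    ])

    model = IsolationForest(contamination=0.01, random_state=0).fit(normal_flows)

    # A flow with unusually high bandwidth on an unauthorized port.
    suspect = np.array([[5e9, 31337, 600]])
    print(model.predict(suspect))              # -1 means "flag as anomalous"
    ```

    A real deployment would use far richer features and would have to keep the false-alarm rate low enough for operators to trust it, which is exactly the concern raised later in the article.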

    CERN’s cybersecurity department is training its AI software to learn the difference between normal and dubious behavior on the network, and to then alert staff via phone text, e-mail or computer message of any potential threat. The system could even be automated to shut down suspicious activity on its own, says Andres Gomez, lead author of a paper [Intrusion Prevention and Detection in GridComputing – The ALICE Case] describing the new cybersecurity framework.

    CERN’s Jewel

    CERN—whose name comes from the laboratory’s original French title, Conseil Européen pour la Recherche Nucléaire, and which sits on the Franco-Swiss border—is opting for this new approach to protect a computer grid used by more than 8,000 physicists to quickly access and analyze large volumes of data produced by the Large Hadron Collider (LHC).

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    The LHC’s main job is to collide subatomic particles at high speed so that scientists can study how particles interact. Particle detectors and other scientific instruments within the LHC gather information about these collisions, and CERN makes it available to laboratories and universities worldwide for use in their own research projects.

    The LHC is expected to generate a total of about 50 petabytes of data (equal to 15 million high-definition movies) in 2017 alone, and demands more computing power and data storage than CERN itself can provide. In anticipation of that type of growth the laboratory in 2002 created its Worldwide LHC Computing Grid, which connects computers from more than 170 research facilities across more than 40 countries. CERN’s computer network functions somewhat like an electrical grid, which relies on a network of generating stations that create and deliver electricity as needed to a particular community of homes and businesses. In CERN’s case the community consists of research labs that require varying amounts of computing resources, based on the type of work they are doing at any given time.

    Grid Guardians

    One of the biggest challenges to defending a computer grid is the fact that security cannot interfere with the sharing of processing power and data storage. Scientists from labs in different parts of the world might end up accessing the same computers to do their research if demand on the grid is high or if their projects are similar. CERN also has to worry about whether the computers of the scientists connecting into the grid are free of viruses and other malicious software that could enter and spread quickly due to all the sharing. A virus might, for example, allow hackers to take over parts of the grid and use those computers either to generate digital currency known as bitcoins or to launch cyber attacks against other computers. “In normal situations, antivirus programs try to keep intrusions out of a single machine,” Gomez says. “In the grid we have to protect hundreds of thousands of machines that already allow” researchers outside CERN to use a variety of software programs they need for their different experiments. “The magnitude of the data you can collect and the very distributed environment make intrusion detection on [a] grid far more complex,” he says.

    Jarno Niemelä, a senior security researcher at F-Secure, a company that designs antivirus and computer security systems, says CERN’s use of machine learning to train its network defenses will give the lab much-needed flexibility in protecting its grid, especially when searching for new threats. Still, artificially intelligent intrusion detection is not without risks—and one of the biggest is whether Gomez and his team can develop machine-learning algorithms that can tell the difference between normal and harmful activity on the network without raising a lot of false alarms, Niemelä says.

    CERN’s AI cybersecurity upgrades are still in the early stages and will be rolled out over time. The first test will be protecting the portion of the grid used by ALICE (A Large Ion Collider Experiment)—a key LHC project to study the collisions of lead nuclei. If tests on ALICE are successful, CERN’s machine learning–based security could then be used to defend parts of the grid used by the institution’s six other detector experiments.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     