Tagged: CERN Courier

  • richardmitnick 3:43 pm on November 10, 2017
    Tags: CERN Courier, Extreme cosmic rays reveal clues to origin

    From CERN Courier: “Extreme cosmic rays reveal clues to origin” 


    CERN Courier

    Nov 10, 2017
    Merlin Kole

    Dipole structure

    The energy spectrum of cosmic rays continuously bombarding the Earth spans many orders of magnitude, with the highest-energy events topping 10⁸ TeV. Where these extreme particles come from, however, has remained a mystery since their discovery more than 50 years ago. Now the Pierre Auger collaboration has published results showing that the arrival direction of ultra-high-energy cosmic rays (UHECRs) is far from uniform, giving a clue to their origins.

    The discovery in 1963 at the Volcano Ranch experiment of cosmic rays with energies exceeding one million times the energy of the protons in the LHC raised many questions. Not only is the charge of these hadronic particles unknown, but the acceleration mechanisms required to produce UHECRs, and the environments that can host such mechanisms, are still being debated. Proposed origins include sources in the galactic centre, extreme supernova events, mergers of neutron stars, and extragalactic sources such as blazars. Unlike photons or neutrinos, charged cosmic rays do not point directly back towards their origin because, despite their extreme energies, their paths are deflected by magnetic fields both inside and outside our galaxy. Since the deflection decreases as the energy increases, however, the UHECRs with the highest energies might still carry information about their source direction.
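The energy dependence of the deflection can be made concrete with a back-of-the-envelope gyroradius estimate. The sketch below uses the standard ultrarelativistic Larmor-radius formula; the field strength and energies are illustrative assumptions of mine, not values quoted in the article:

```python
# Larmor (gyro)radius of an ultrarelativistic charged particle:
#   r_L ~ 1.08 kpc * E[EeV] / (Z * B[uG])
# Field strength and energies below are illustrative assumptions,
# not numbers from the article.
def larmor_radius_kpc(energy_eev, charge_z, b_field_ug):
    """Gyroradius in kpc for energy in EeV, charge Z and field in microgauss."""
    return 1.08 * energy_eev / (charge_z * b_field_ug)

# In a ~3 uG galactic field a 10 EeV proton bends on few-kpc scales,
# while at 100 EeV the radius approaches the size of the galactic disc,
# so the highest-energy arrival directions retain some directional memory.
for e in (1.0, 10.0, 100.0):
    print(f"E = {e:6.1f} EeV -> r_L = {larmor_radius_kpc(e, 1, 3.0):6.1f} kpc")
```

This is why the analysis described below concentrates on the highest-energy events.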

    At the Pierre Auger Observatory, cosmic rays are detected using a vast array of detectors spread over an area of 3000 km² near the town of Malargüe in western Argentina.

    Pierre Auger Observatory in the western Mendoza Province, Argentina, near the Andes, at an altitude of 1330 m–1620 m, average ~1400 m

    Like the first cosmic-ray detectors in the 1960s, the array measures the air showers induced as the cosmic rays interact with the atmosphere. The arrival times of the particles, measured with GPS receivers, are used to determine the direction of the primary particle to within approximately one degree.
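The timing-based reconstruction can be illustrated with a toy plane-front fit: a shower front travelling along a unit vector n reaches station i at time t_i = t0 + (n·x_i)/c, so the direction follows from a linear least-squares fit to the measured times. This is a simplified sketch with hypothetical station positions, not the Auger collaboration's actual reconstruction code:

```python
import numpy as np

C = 0.299792458  # speed of light in m/ns

# Hypothetical ground-station positions on a 1.5 km grid (metres).
stations = np.array([[0.0, 0.0], [1500.0, 0.0],
                     [0.0, 1500.0], [1500.0, 1500.0]])

def plane_times(theta, phi, t0=0.0):
    """Arrival times (ns) for a plane front with zenith theta, azimuth phi."""
    n_xy = np.array([np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi)])
    return t0 + stations @ n_xy / C

def fit_direction(times):
    """Least-squares fit for (n_x, n_y, t0); returns zenith and azimuth in rad."""
    a = np.column_stack([stations / C, np.ones(len(stations))])
    (nx, ny, _t0), *_ = np.linalg.lstsq(a, times, rcond=None)
    theta = np.arcsin(min(1.0, float(np.hypot(nx, ny))))
    return theta, float(np.arctan2(ny, nx))

times = plane_times(np.radians(30.0), np.radians(60.0))
theta, phi = fit_direction(times)
print(np.degrees(theta), np.degrees(phi))  # recovers ~30, ~60 degrees
```

In practice the shower front is curved and the timing has nanosecond-scale jitter, which is what limits the real resolution to about a degree.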

    The collaboration studied the arrival directions of particles with energies in the range 4–8 EeV and of particles with energies exceeding 8 EeV. In the former data set, no clear anisotropy was observed, whereas for particles with energies above 8 EeV a dipole structure was observed (see figure), indicating that more particles come from a particular part of the sky. Since the maximum of the dipole lies outside the galactic plane, the measured anisotropy is consistent with an extragalactic origin. The collaboration reports that the maximum, once deflection by magnetic fields is taken into account, is consistent with a region of the sky known to have a large density of galaxies, supporting the view that UHECRs are produced in other galaxies. The lack of anisotropy at lower energies could result from the stronger deflection of these particles in the galactic magnetic field.
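A dipole-like anisotropy of this kind is conventionally quantified with a first-harmonic (Rayleigh) analysis in right ascension. The toy below injects an artificial 6.5% first-harmonic amplitude into 30,000 simulated arrival directions and recovers it; the amplitude and phase are illustrative values of mine, and the real analysis also corrects for exposure and declination coverage:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: 30,000 right ascensions with an injected 6.5% first-harmonic
# amplitude (illustrative numbers, not the collaboration's results).
n, amp, phase = 30_000, 0.065, np.radians(100.0)

# Sample from p(a) ~ 1 + amp*cos(a - phase) by rejection sampling.
alpha = rng.uniform(0.0, 2.0 * np.pi, size=4 * n)
keep = rng.uniform(0.0, 1.0 + amp, size=alpha.size) < 1.0 + amp * np.cos(alpha - phase)
alpha = alpha[keep][:n]

# Rayleigh first-harmonic coefficients and amplitude.
a1 = 2.0 / n * np.sum(np.cos(alpha))
b1 = 2.0 / n * np.sum(np.sin(alpha))
r = float(np.hypot(a1, b1))
print(f"recovered amplitude = {r:.3f}")  # close to 0.065 (stat. error ~ sqrt(2/n))
```

The statistical error on the recovered amplitude scales as sqrt(2/n), which is why tens of thousands of events are needed to establish a per-cent-level dipole.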

    The presented dipole measurement is based on a total of 30,000 cosmic rays measured by the Pierre Auger Observatory, which is currently being upgraded. Although the results indicate an extragalactic origin, the particular source responsible for accelerating these particles remains unknown. The upgraded observatory will enable more data to be acquired and allow a more detailed investigation of the currently studied energy ranges. It will also open the possibility to explore even higher energies where the magnetic-field deflections become even smaller, making it possible to study the origin of UHECRs, their acceleration mechanism and the magnetic fields that deflect them.

    Further reading
    Pierre Auger Collaboration 2017 Science 357 1266.
    http://science.sciencemag.org/content/357/6357/1266

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 3:23 pm on November 10, 2017
    Tags: CALET on the ISS, CERN Courier, High-energy cosmic rays

    From CERN Courier: “First cosmic-ray results from CALET on the ISS” 


    CERN Courier

    Nov 10, 2017

    CALET on the ISS. No image credit

    Electron spectrum

    The CALorimetric Electron Telescope (CALET), a space mission led by the Japan Aerospace Exploration Agency with participation from the Italian Space Agency (ASI) and NASA, has released its first results concerning the nature of high-energy cosmic rays.

    Having docked with the International Space Station (ISS) on 25 August 2015, CALET is carrying out a full science programme with long-duration observations of high-energy charged particles and photons coming from space. It is the second high-energy experiment operating on the ISS, following the deployment of AMS-02 in 2011. During the summer of 2017 a third experiment, ISS-CREAM, joined these two. Unlike AMS-02, CALET and ISS-CREAM have no magnetic spectrometer and therefore measure the inclusive electron and positron spectrum. CALET’s homogeneous calorimeter is optimised to measure electrons, and one of its main science goals is to measure the detailed shape of the electron spectrum.

    Due to the large radiative losses during their travel in space, high-energy cosmic electrons are expected to originate from regions relatively close to Earth (of the order of a few thousand light-years). Yet their origin is still unknown. The shape of the spectrum and the anisotropy in the arrival direction might contain crucial information as to where and how electrons are accelerated. It could also provide a clue on possible signatures of dark matter – for example, the presence of a peak in the spectrum might tell us about a possible dark-matter decay or annihilation with an electron or positron in the final state – and shed light on the intriguing electron and positron spectra reported by AMS-02 (CERN Courier December 2016 p26).

    To pinpoint possible spectral features on top of the overall power-law energy dependence of the spectrum, CALET was designed to measure the energy of the incident particle with very high resolution and with a large proton rejection power, well into the TeV energy region. This is provided by a thick homogeneous calorimeter preceded by a high-granularity imaging pre-shower, with a total thickness of 30 radiation lengths at normal incidence. Calibration of the two instruments is the key to controlling the energy scale, which is why CALET – a CERN-recognised experiment – performed several calibration tests at CERN.

    The first data from CALET concern a measurement of the inclusive electron and positron spectrum in the energy range from 10 GeV to 3 TeV, based on about 0.7 million candidates (1.3 million in full acceptance). Above an energy of 30 GeV the spectrum can be fitted with a single power law with a spectral index of –3.152±0.016. A possible structure observed above 100 GeV requires further investigation with increased statistics and refined data analysis. Beyond 1 TeV, where a roll-off of the spectrum is expected and low statistics is an issue, electron data are now being carefully analysed to extend the measurement. CALET has been designed to measure electrons up to around 20 TeV and hadrons up to an energy of 1 PeV.
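Fitting a single power law to a spectrum, as described above, amounts to a straight-line fit in log–log space, since log Φ = log Φ0 + γ log E. A minimal sketch with synthetic flux values (not CALET data), injecting the quoted index to show the fit recovers it:

```python
import numpy as np

# Synthetic power-law flux  Phi(E) = Phi0 * E**gamma  (not CALET data).
energies = np.logspace(np.log10(30.0), 3.0, 12)   # 30 GeV to 1 TeV
gamma_true = -3.152                               # index quoted in the article
flux = 100.0 * energies ** gamma_true

# In log-log space the model is a straight line with slope gamma.
gamma_fit, log_phi0 = np.polyfit(np.log(energies), np.log(flux), 1)
print(f"fitted spectral index: {gamma_fit:.3f}")
```

With real data each bin carries a statistical weight, and departures from the straight line are exactly the "possible structures" the collaboration is hunting for.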

    CALET is a powerful space observatory with the ability to identify cosmic nuclei from hydrogen to elements heavier than iron. It also has a dedicated gamma-ray-burst instrument (CGBM) that so far has detected bursts at an average rate of one every 10 days in the energy range 7 keV–20 MeV. The search for electromagnetic counterparts of gravitational waves (GWs) detected by the LIGO and Virgo observatories proceeds around the clock thanks to a special collaboration agreement with LIGO and Virgo. Upper limits on X-ray and gamma-ray counterparts of the GW151226 event have been published, and further research on GW follow-ups is being carried out. Space-weather studies related to relativistic electron precipitation (REP) from the Van Allen belts have also been released.

    With more than 500 million triggers collected so far and an expected extension of the observation time on the ISS to five years, CALET is likely to produce a wealth of interesting results in the near future.

    Further reading

    CALET Collaboration 2017 Phys. Rev. Lett. 119 181101.


     
  • richardmitnick 4:16 pm on October 13, 2017
    Tags: CERN Courier, CLEAR

    From CERN Courier: “CLEAR prospects for accelerator research” 


    CERN Courier

    Oct 13, 2017

    CLEAR’s plasma-lens experiment will test ways to drive strong currents through a plasma for particle-beam transverse focusing.
    Image credit: M Volpi.

    A new user facility for accelerator R&D, the CERN Linear Electron Accelerator for Research (CLEAR), started operation in August and is ready to provide beam for experiments. CLEAR evolved from the former CTF3 test facility for the Compact Linear Collider (CLIC), which ended a successful programme in December 2016. Following approval of the CLEAR proposal, the necessary hardware modifications started in January and the facility is now able to host and test a broad range of ideas in the accelerator field.

    CLEAR’s primary goal is to enhance and complement the existing accelerator R&D programme at CERN, as well as offering a training infrastructure for future accelerator physicists and engineers. The focus is on general accelerator R&D and component studies for existing and possible future accelerator applications. This includes studies of high-gradient acceleration methods, such as CLIC X-band and plasma technologies, as well as prototyping and validation of accelerator components for the high-luminosity LHC upgrade.

    The scientific programme for 2017 includes: a combined test of critical CLIC technologies, continuing previous tests performed at CTF3; measurements of radiation effects on electronic components to be installed on space missions in a Jovian environment and for dosimetry tests aimed at medical applications; beam instrumentation R&D; and the use of plasma for beam focusing. Further experiments, such as those exploring THz radiation for accelerator applications and direct impedance measurements of equipment to be installed in CERN accelerators, are also planned.

    The experimental programme for 2018 and beyond is still open to new and challenging proposals. An international scientific committee is currently being formed to prioritise proposals, and a user request form is available at the CLEAR website: http://clear.web.cern.ch/.


     
  • richardmitnick 8:05 pm on September 25, 2017
    Tags: CERN Courier, DM axions, The origin of solar flares

    From CERN Courier: “Study links solar activity to exotic dark matter” 


    CERN Courier

    Solar-flare distributions

    The origin of solar flares, powerful bursts of radiation appearing as sudden flashes of light, has puzzled astrophysicists for more than a century. The temperature of the Sun’s corona, several hundred times hotter than the solar surface, is another long-standing enigma.

    A new study suggests that the solution to these solar mysteries is linked to a local action of dark matter (DM). If true, it would challenge the traditional picture of DM as being made of weakly interacting massive particles (WIMPs) or axions, and suggest that DM is not uniformly distributed in space, as is traditionally thought.

    The study is not based on new experimental data. Rather, lead author Sergio Bertolucci, a former CERN research director, and collaborators base their conclusions on freely available data recorded over a period of decades by geosynchronous satellites. The paper presents a statistical analysis of the occurrences of around 6500 solar flares in the period 1976–2015 and of the continuous solar emission in the extreme ultraviolet (EUV) in the period 1999–2015. The temporal distribution of these phenomena, finds the team, is correlated with the positions of the Earth and two of its neighbouring planets: Mercury and Venus. Statistically significant (above 5σ) excesses of the number of flares with respect to randomly distributed occurrences are observed when one or more of the three planets find themselves in a slice of the ecliptic plane with heliocentric longitudes of 230°–300°. Similar excesses are observed in the same range of longitudes when the solar irradiance in the EUV region is plotted as a function of the positions of the planets.
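The significance of a counting excess of this general kind can be roughly estimated with Gaussian counting statistics, z = (observed − expected)/sqrt(expected). The numbers below are invented for illustration and do not reproduce the paper's actual likelihood analysis:

```python
from math import sqrt

# Back-of-the-envelope Gaussian significance of a counting excess,
# valid for large counts. Illustrative numbers only -- not the
# flare counts or statistical method used in the paper.
def excess_significance(observed, expected):
    """Gaussian approximation to the Poisson significance of an excess."""
    return (observed - expected) / sqrt(expected)

z = excess_significance(1300, 1100)  # e.g. 1300 flares seen, 1100 expected
print(f"excess significance ~ {z:.1f} sigma")  # ~6.0 sigma
```

An excess has to clear roughly 5σ by this kind of measure before physicists treat it as more than a fluctuation, which is the threshold the authors report exceeding.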

    These results suggest that active-Sun phenomena are not randomly distributed, but instead are modulated by the positions of the Earth, Venus and Mercury. One possible explanation, says the team, is the existence of a stream of massive DM particles with a preferred direction, coplanar with the ecliptic plane, that is gravitationally focused by the planets towards the Sun when one or more of the planets enter the stream. Such particles would need to have a wide velocity spectrum centred around 300 km s⁻¹ and interact with ordinary matter much more strongly than typical DM candidates such as WIMPs. The non-relativistic velocities of such DM candidates make planetary gravitational lensing more efficient and can enhance the flux of the particles by up to a factor of 10⁶, according to the team.

    Co-author Konstantin Zioutas, spokesperson for the CAST experiment at CERN, accepts that this interpretation of the solar and planetary data is speculative – particularly regarding the mechanism by which a temporarily increased influx of DM actually triggers solar activity.

    CERN CAST Axion Solar Telescope

    However, he says, the long persisting failure to detect the ubiquitous DM might be due to the widely assumed small cross-section of its constituents with ordinary matter, or to erroneous DM modelling. “Hence, the so-far-adopted direct-detection concepts can lead us towards a dead end, and we might find that we have overlooked a continuous communication between the dark and the visible sector.”

    Models of massive DM streaming particles that interact strongly with normal matter are few and far between, although the authors suggest that “antiquark nuggets” are best suited to explain their results. “In a few words, there is a large ‘hidden’ energy in the form of the nuggets,” says Ariel Zhitnitsky, who first proposed the quark-nugget dark-matter model in 2003. “In my model, this energy can be precisely released in the form of the EUV radiation when the anti-nuggets enter the solar corona and get easily annihilated by the light elements present in such a highly ionised environment.”

    The study calls for further investigation, say researchers. “It seems that the statistical analysis of the paper is accurate and the obtained results are rather intriguing,” says Rita Bernabei, spokesperson of the DAMA experiment, which in 1998 became the first to claim a dark-matter detection – in the form of WIMPs – on the basis of an observed seasonal modulation of the signal in its scintillation detector.

    DAMA-LIBRA at Gran Sasso

    “However, the paper appears to be mostly hypothetical in terms of this new type of dark matter.”

    The team now plans to produce a full simulation of planetary lensing taking into account the simultaneous effect of all the planets in the solar system, and to extend the analysis to include sunspots, nano-flares and other solar observables. CAST, the axion solar telescope at CERN, will also dedicate a special data-taking period to the search for streaming DM axions.

    “If true, our findings will provide a totally different view about dark matter, with far-reaching implications in particle and astroparticle physics,” says Zioutas. “Perhaps the demystification of the Sun could lead to a dark-matter solution also.”

    Further reading

    S Bertolucci et al. 2017 Phys. Dark Universe 17 13.

    http://www.elsevier.com/locate/dark


     
  • richardmitnick 3:36 pm on September 22, 2017
    Tags: ATLAS hunts for new physics with dibosons, CERN Courier

    From CERN Courier: “ATLAS hunts for new physics with dibosons” 


    CERN Courier

    Sep 22, 2017

    WZ data

    Beyond the Standard Model of particle physics (SM), crucial open questions remain such as the nature of dark matter, the overabundance of matter compared to antimatter in the universe, and also the mass scale of the scalar sector (what makes the Higgs boson so light?). Theorists have extended the SM with new symmetries or forces that address these questions, and many such extensions predict new resonances that can decay into a pair of bosons (diboson), for example: VV, Vh, Vγ and γγ, where V stands for a weak boson (W and Z), h for the Higgs boson, and γ is a photon.

    The ATLAS collaboration has a broad search programme for diboson resonances, and the most recent results, using 36 fb⁻¹ of LHC proton–proton collision data taken at a centre-of-mass energy of 13 TeV in 2015 and 2016, have now been released. Six different final states, characterised by different boson decay modes, were considered in searches for a VV resonance: 4ℓ, ℓℓνν, ℓℓqq, ℓνqq, ννqq and qqqq, where ℓ, ν and q stand for charged leptons (electrons and muons), neutrinos and quarks, respectively. For the Vh resonance search, the dominant Higgs-boson decay into a pair of b-quarks (branching fraction of 58%) was exploited together with four different V decays, leading to ℓℓbb, ℓνbb, ννbb and qqbb final states. A Zγ resonance was sought in final states with two leptons and a photon.

    A new resonance would appear as an excess (bump) over the smoothly distributed SM background in the invariant mass distribution reconstructed from the final-state particles. The left figure shows the observed WZ mass distribution in the qqqq channel together with simulations of some example signals. An important key to probe very high-mass signals is to identify high-momentum hadronically decaying V and h bosons. ATLAS developed a new technique to reconstruct the invariant mass of such bosons combining information from the calorimeters and the central tracking detectors. The resulting improved mass resolution for reconstructed V and h bosons increased the sensitivity to very heavy signals.
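The bump-hunt observable is the invariant mass of the reconstructed decay products, m² = (ΣE)² − |Σp|² in natural units. A minimal sketch with toy four-vectors (not ATLAS data or ATLAS software):

```python
import numpy as np

# Invariant mass of a candidate from its decay products (natural units):
#   m^2 = (sum E)^2 - |sum p|^2
# Toy four-vectors for illustration, not ATLAS data.
def invariant_mass(four_vectors):
    """Rows are (E, px, py, pz) in GeV; returns the invariant mass in GeV."""
    total = np.sum(four_vectors, axis=0)
    return float(np.sqrt(total[0] ** 2 - np.sum(total[1:] ** 2)))

# Two back-to-back 1 TeV massless jets reconstruct to a 2 TeV candidate:
jets = np.array([[1000.0, 0.0, 0.0, 1000.0],
                 [1000.0, 0.0, 0.0, -1000.0]])
print(invariant_mass(jets))  # 2000.0
```

A genuine resonance shows up as a pile-up of events at one value of this quantity; the better the mass resolution, the sharper the bump stands out above the smooth background.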

    No evidence for a new resonance was observed in these searches, allowing ATLAS to set stringent exclusion limits. For example, a graviton signal predicted in a model with extra spatial dimensions was excluded up to masses of 4 TeV, while heavy weak-boson-like resonances (as predicted in composite Higgs boson models) decaying to WZ bosons are excluded for masses up to 3.3 TeV. Heavier Higgs partners can be excluded up to masses of about 350 GeV, assuming specific model parameters.


     
  • richardmitnick 3:19 pm on September 22, 2017
    Tags: CERN Courier

    From ORNL via CERN Courier: “Miniature detector first to spot coherent neutrino-nucleus scattering” 


    Oak Ridge National Laboratory

    CERN Courier

    Detector placement. No image credit

    The COHERENT collaboration at Oak Ridge National Laboratory (ORNL) in the US has detected coherent elastic scattering of neutrinos off nuclei for the first time. The ability to harness this process, predicted 43 years ago, offers new ways to study neutrino properties and could drastically reduce the scale of neutrino detectors.

    Neutrinos famously interact very weakly, requiring very large volumes of active material to detect their presence. Typically, neutrinos interact with individual protons or neutrons inside a nucleus, but coherent elastic neutrino–nucleus scattering (CEνNS) occurs when a neutrino interacts with an entire nucleus. For this to happen, the momentum exchanged must remain small compared with the inverse of the nuclear radius, so that the neutrino effectively sees the whole nucleus at once. This restricts the process to neutrino energies below a few tens of MeV, in contrast to the charged-current interactions by which neutrinos are usually detected. The signature of CEνNS is a low-energy nuclear recoil with all nucleon wavefunctions remaining in phase, but until now the difficulty of detecting these low-energy recoils has prevented observations of CEνNS – despite the predicted cross-section for this process being the largest of all low-energy neutrino couplings.

    The COHERENT team, comprising 80 researchers from 19 institutions, used ORNL’s Spallation Neutron Source (SNS), which generates the most intense pulsed neutron beams in the world while simultaneously creating a significant yield of low-energy neutrinos.

    ORNL Spallation Neutron Source

    Approximately 5 × 10²⁰ protons are delivered per day, each yielding roughly 0.08 isotropically emitted neutrinos per flavour. The researchers placed a detector – a caesium-iodide scintillator crystal doped with sodium – 20 m from the neutrino source, with shielding to reduce background events associated with the neutron-induced nuclear recoils produced by the SNS. The results favour the presence of CEνNS over its absence at the 6.7σ level, with 134±22 events observed versus 173±48 predicted.

    Crucially, the result was achieved using the world’s smallest neutrino detector, with a mass of 14.5 kg. This is a consequence of the large nuclear mass of caesium and iodine, which results in a large CEνNS cross-section.
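The nuclear-mass advantage follows from the approximate N² scaling of the coherent cross-section with the neutron number N of the target nucleus. A rough comparison, ignoring nuclear form factors and detector effects; the argon comparison is my own illustration, not drawn from the article:

```python
# Naive N**2 scaling of the coherent cross-section with neutron number.
# Form factors and detector thresholds are ignored -- rough scaling only.
def relative_cevns_rate(neutron_number):
    """Relative CEvNS rate per nucleus under the naive N**2 scaling."""
    return neutron_number ** 2

# Caesium-133 has N = 78 neutrons; argon-40 (a lighter common target) has N = 22.
ratio = relative_cevns_rate(78) / relative_cevns_rate(22)
print(f"Cs/Ar rate ratio per nucleus ~ {ratio:.0f}")
```

This order-of-magnitude gain per nucleus is what lets a 14.5 kg crystal do the work of a much larger conventional detector.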

    The intense scintillation of this material for low-energy nuclear recoils, combined with the large neutrino flux of the SNS, also contributed to the success of the measurement. In effect, CEνNS allows the same detection rates as conventional neutrino detectors that are 100 times more massive.

    “It is a nearly ideal detector choice for coherent neutrino scattering,” says lead designer Juan Collar of the University of Chicago. “However, other new coherent neutrino-detector designs are appearing over the horizon that look extraordinarily promising in order to further reduce detector mass, truly realising technological applications such as reactor monitoring.”

    Yoshi Uchida of Imperial College London, who was not involved in the study, says that detecting neutrinos via the neutral-current process as opposed to the usual charged-current process is a great advantage because it is “blind” to the type of neutrino being produced and is sensitive at low energies. “So in combination with other types of detection, it could tell us a lot about a particular neutrino source of interest.” However, he adds that the SNS set-up is very specific and that, outside such ideal conditions, it might be difficult to scale a similar detector in a way that would be of practical use. “The fact that the COHERENT collaboration already has several other target nuclei (and detection methods) being used in their set-up means there will be more to come on this subject in the near future.”


    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 2:46 pm on September 22, 2017
    Tags: CERN Courier, Other CERN Linacs

    From CERN Courier: “Injecting new life into the LHC” 

    CERN Courier

    Sep 22, 2017

    Malika Meddahi
    Giovanni Rumolo

    Beam transfer magnets. No image credit

    The Large Hadron Collider (LHC) is the most famous and powerful of all CERN’s machines, colliding intense beams of protons at an energy of 13 TeV.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    But its success relies on a series of smaller machines in CERN’s accelerator complex that serve it. The LHC’s proton injectors have already been providing beams with characteristics exceeding the LHC’s design specifications. This decisively contributed to the excellent performance of the 2010–2013 LHC physics operation and, since 2015, has allowed CERN to push the machine beyond its nominal beam performance.

    Built between 1959 and 1976, the CERN injector complex accelerates proton beams to a kinetic energy of 450 GeV. It does this via a succession of accelerators: a linear accelerator called Linac 2 followed by three synchrotrons – the Proton Synchrotron Booster (PSB), the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS).

    CERN Linac 2. No image credit

    CERN The Proton Synchrotron Booster

    CERN Proton Synchrotron

    CERN Super Proton Synchrotron

    The complex also provides the LHC with ion beams, which are first accelerated through a linear accelerator called Linac 3 and then the Low Energy Ion Ring (LEIR) synchrotron before being injected into the PS and the SPS.

    CERN Linac 3

    CERN Low Energy Ion Ring (LEIR) synchrotron

    The CERN injectors, besides providing beams to the LHC, also serve a large number of fixed-target experiments at CERN – including the ISOLDE radioactive-beam facility and many others.

    CERN ISOLDE

    Part of the LHC’s success lies in the flexibility of the injectors to produce various beam parameters, such as the intensity, the spacing between proton bunches and the total number of bunches in a bunch train. This was clearly illustrated in 2016, when the LHC reached peak luminosity values 40% higher than the design value of 10³⁴ cm⁻² s⁻¹, even though the number of bunches in the LHC was still about 27% below the maximum achievable. The gain was due to the production of a brighter beam, with roughly the same intensity per bunch but in a beam envelope of just half the size.
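The role of beam size can be seen from the simplified luminosity formula for head-on Gaussian beams, L = f·n_b·N₁N₂/(4π σx σy): shrinking the transverse sizes at fixed bunch charge raises the luminosity directly. The parameter values below are illustrative LHC-like numbers of mine, not official machine settings:

```python
import math

# Simplified head-on luminosity for round Gaussian beams:
#   L = f_rev * n_b * N1 * N2 / (4 * pi * sigma_x * sigma_y)
# Parameter values are illustrative, not official LHC settings.
def luminosity(f_rev, n_bunches, n_per_bunch, sigma_x_cm, sigma_y_cm):
    """Instantaneous luminosity in cm^-2 s^-1."""
    return (f_rev * n_bunches * n_per_bunch ** 2
            / (4.0 * math.pi * sigma_x_cm * sigma_y_cm))

f_rev = 11_245.0  # LHC revolution frequency, Hz
base = luminosity(f_rev, 2556, 1.15e11, 1.7e-3, 1.7e-3)    # ~17 um beam sizes
small = luminosity(f_rev, 2556, 1.15e11, 0.85e-3, 0.85e-3)  # halved sizes
print(f"L ~ {base:.2e} cm^-2 s^-1; halving both sizes gives x{small / base:.1f}")
```

Halving both transverse sizes quadruples the luminosity, which is why a brighter (smaller-emittance) beam can beat the design value even with fewer bunches.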

    Despite the excellent performance of today’s injectors, the beams produced are not sufficient to meet the very demanding proton beam parameters specified by the high-luminosity upgrade of the LHC (HL-LHC).

    Indeed, as of 2025, the HL-LHC aims to accumulate an integrated luminosity of around 250 fb⁻¹ per year, to be compared with the 40 fb⁻¹ achieved in 2016. For heavy-ion operations, the goals are just as challenging: with lead ions the objective is to obtain an integrated luminosity of 10 nb⁻¹ over four runs starting from 2021 (compared with less than 1 nb⁻¹ achieved in 2015). This has demanded a significant upgrade programme that is now being implemented.

    Immense challenges

    To prepare the CERN accelerator complex for the immense challenges of the HL-LHC, the LHC Injectors Upgrade (LIU) project was launched in 2010. In addition to enabling the proton and ion injector chains to deliver the beams required for the HL-LHC, the LIU project must ensure the reliable operation and lifetime of the injectors throughout the HL-LHC era, which is expected to last until around 2035. Hence, the LIU project is also tasked with replacing ageing equipment (such as power supplies, magnets and radio-frequency cavities) and improving radioprotection measures such as shielding and ventilation. (See https://sciencesprings.wordpress.com/2017/09/21/from-cern-next-stop-the-superconducting-magnets-of-the-future/)

    One of the first challenges faced by the LIU team members was to define the beam-performance limitations of all the accelerators in the injector chain and identify the actions needed to overcome them by the required amount. Significant machine and simulation studies were carried out over a period of years, while functional and engineering specifications were prepared to provide clear guidelines to the equipment groups. This was followed by the production of the first hardware prototype devices and their installation in the machines for testing and, where possible, early exploitation.

    Significant progress has already been made concerning the production of ion beams. Thanks to the modifications in Linac 3 and LEIR implemented after 2015 and the intensive machine studies conducted within the LIU programme over the last three years, the excellent performance of the ion injector chain could be further improved in 2016 (figure 1). This enabled the recorded luminosity for the 2016 proton–lead run to exceed the target value by a factor of almost eight. The main remaining challenges for the ion beams will be to more than double the number of bunches in the LHC through complex RF manipulations in the SPS known as “momentum slip stacking”, as well as to guarantee continued and stable performance of the ion injector chain without constant expert monitoring.

    Along the proton injector chain, the higher-intensity beams within a comparatively small beam envelope required by the HL-LHC can only be demonstrated after the installation of all the LIU equipment during Long Shutdown 2 (LS2) in 2019–2020. The main installations feature: a new injection region, a new main power supply and RF system in the PSB; a new injection region and RF system to stabilise the future beams in the PS; an upgraded main RF system; and the shielding of vacuum flanges together with partial coating of the beam chambers in order to stabilise future beams against parasitic electromagnetic interaction and electron clouds in the SPS. Beam instrumentation, protection devices and beam dumps also need to be upgraded in all the machines to match the new beam parameters. The baseline goals of the LIU project to meet the challenging HL-LHC requirements are summarised in the panel (final page of feature).

    Execution phase

    Having defined, designed and endorsed all of the baseline items during the last seven years, the LIU project is presently in its execution phase. New hardware is being produced, installed and tested in the different machines. Civil-engineering work is proceeding for the buildings that will host the new PSB main power supply and the upgraded SPS RF equipment, and to prepare the area in which the new SPS internal beam dump will be located.

    The 86 m-long Linac 4, which will eventually replace Linac 2, is an essential component of the HL-LHC upgrade.

    CERN Linac 4

    The machine, based on newly developed technology, became operational at the end of 2016 following the successful completion of acceleration tests at its nominal energy of 160 MeV. It is presently undergoing an important reliability run, which will be instrumental in reaching beams whose characteristics match the LIU requirements and in achieving an operational availability above 95%, an essential level for the first link in the proton injector chain. On 26 October 2016, the first 160 MeV negative hydrogen-ion beam was successfully sent to the injection test stand, which operated until the beginning of April 2017 and demonstrated the correct functioning of this new and critical CERN injection system, as well as of the related diagnostics and controls.

    Most of the equipment needed for the injection of negative hydrogen ions from Linac 4 into the PSB has now been completed, and work is progressing on the 2 GeV energy upgrade of the PSB rings and extraction, with installation planned for 2019–2020 during LS2. On the beam-physics side, studies have mainly focused on the deployment of the new wideband RF system, commissioning of beam diagnostics and investigation of space-charge effects. During the 2016–2017 technical stop, the principal LIU-related activities were the removal of a large volume of obsolete cables and the installation of new beam instrumentation (e.g. a prototype transverse beam-size monitor and turn-by-turn orbit measurement systems). The unused cables, which had been individually identified and labelled beforehand, could be safely removed from the machine to make room for the cables of the new LIU equipment.

    The procurement, construction, installation and testing of upgrade items for the PS is also progressing. Some hardware, such as new corrector magnets and power supplies, a newly developed beam gas-ionisation monitor and new injection vacuum chambers to remove aperture limitations, was already installed during past technical stops. Mitigating anticipated longitudinal beam instabilities in the PS is essential for achieving the LIU baseline beam parameters: the parasitic electromagnetic interaction of the beam with the multiple RF systems must be reduced, and a new feedback system deployed, to keep the beam stable. Beam-dynamics studies will determine the present intensity reach of the PS and identify any remaining needs to comfortably achieve the value required for the HL-LHC. Improved bunch-rotation schemes are also under investigation to better match the beam extracted from the PS to the SPS RF system and thus limit beam losses at SPS injection energy.

    In the SPS, the LIU deployment in the tunnel has begun in earnest, with the re-arrangement and improvement of the extraction kicker system, the start of civil engineering for the new beam-dump system in LSS5 and the shielding of vacuum flanges in 10 half-cells, together with amorphous carbon coating of the adjacent beam chambers to mitigate electron-cloud effects. In a notable first, eight dipole and 10 focusing-quadrupole magnet chambers were coated with amorphous carbon in situ during the 2016–2017 technical stop, demonstrating that this process can be industrialised (figure 2). The new overground RF building needed to accommodate the power amplifiers of the upgraded main RF system has been completed, while procurement and testing of the solid-state amplifiers has also commenced. The prototyping and engineering for the LIU beam dump is in progress with the construction and installation of a new SPS beam-dump block, which will be able to cope with the higher beam intensities of the HL-LHC and minimise radiation issues.

    Regarding diagnostics, the development of beam-size measurement devices based on flying wires, gas ionisation and synchrotron radiation, all of which are part of the LIU programme, is already providing meaningful results (figure 3), addressing the challenge of measuring operational high-intensity, high-brightness beams with high precision. On the machine-performance and beam-dynamics side, measurements in 2015–2016 made with the very high intensities available from the PS meant that new regimes were probed in terms of electron-cloud instabilities, RF power and losses at injection. More studies are planned in 2017–2018 to clearly identify a path for the mitigation of the injection losses when operating with higher beam currents.

    Looking forward to LS2

    The success of LIU in delivering beams with the desired parameters is the key to achieving the HL-LHC luminosity target. Without the LIU beams, all of the other necessary HL-LHC developments – including high-field triplet magnets [see above], crab cavities and new collimators – would only allow a fraction of the desired luminosity to be delivered to experiments.

    Whenever possible, LIU installation work is taking place during CERN’s regular year-end technical stops. But the great majority of the upgrade requires an extended machine stop and therefore will have to wait until LS2 for implementation. The duration of access to the different accelerators during LS2 is being defined and careful preparation is ongoing to manage the work on site, ensure safety and level the available resources among the different machines in the CERN accelerator complex. After all of the LIU upgrades are in place, beams will be commissioned with the newly installed systems. The LIU goals in terms of beam characteristics are, by definition, uncharted territory. Reaching them will require not only a high level of expertise, but also careful optimisation and extensive beam-physics and machine-development studies in all of CERN’s accelerators.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 4:55 pm on May 19, 2017 Permalink | Reply
    Tags: Belle II rolls in, CERN Courier, KEK laboratory in Japan

    From CERN Courier: “Belle II rolls in” 

    CERN Courier

    May 19, 2017

    The Belle II detector in place

    On 11 April, the Belle II detector at the KEK laboratory in Japan was successfully “rolled in” to the collision point of the upgraded SuperKEKB accelerator, marking an important milestone for the international B-physics community. The Belle II experiment is an international collaboration hosted by KEK in Tsukuba, Japan, with physics goals related to those of the LHCb experiment at CERN, but in the pristine environment of electron–positron collisions. It will analyse copious quantities of B mesons to study CP violation and signs of physics beyond the Standard Model (CERN Courier September 2016 p32).

    “Roll-in” involves moving the entire 8 m-tall, 1400-tonne Belle II detector system from its assembly area to the beam-collision point 13 m away. The detector is now integrated with SuperKEKB, and all of its seven subdetectors, except for the innermost vertex detector, are in place. The next step is to install the complex focusing magnets around the Belle II interaction point. SuperKEKB achieved its first turns in February, with operation of the main rings scheduled for early spring and phase-II “physics” operation by the end of 2018.

    Compared to the previous Belle experiment, and thanks to major upgrades made to the former KEKB collider, Belle II will allow much larger data samples to be collected with much improved precision. “After six years of gruelling work with many unexpected twists and turns, it was a moving and gratifying experience for everyone on the team to watch the Belle II detector move to the interaction point,” says Belle II spokesperson Tom Browder. “Flavour physics is now the focus of much attention and interest in the community and Belle II will play a critical role in the years to come.”

    See the full article here.

  • richardmitnick 8:28 pm on May 1, 2017 Permalink | Reply
    Tags: CERN Courier

    From CERN Courier: “How dark matter became a particle”

    CERN Courier

    Apr 13, 2017
    Gianfranco Bertone, University of Amsterdam
    Dan Hooper, Fermi National Accelerator Laboratory and the University of Chicago.

    It took decades for dark matter to enter the lexicon of particle physics. Today, explaining the nature and abundance of dark matter is one of the most pressing problems in the field.

    EAGLE-project simulation

    Astronomers have long contemplated the possibility that there may be forms of matter in the universe that are imperceptible, either because they are too far away, too dim or intrinsically invisible. Lord Kelvin was perhaps the first, in 1904, to attempt a dynamical estimate of the amount of dark matter in the universe. His argument was simple yet powerful: if stars in the Milky Way can be described as a gas of particles acting under the influence of gravity, one can establish a relationship between the size of the system and the velocity dispersion of the stars. Henri Poincaré was impressed by Kelvin’s results, and in 1906 he argued that since the velocity dispersion predicted in Kelvin’s estimate is of the same order of magnitude as that observed, “there is no dark matter, or at least not so much as there is of shining matter”.

    The Swiss–US astronomer Fritz Zwicky is arguably the most famous and widely cited pioneer in the field of dark matter. In 1933, he studied the redshifts of various galaxy clusters and noticed a large scatter in the apparent velocities of eight galaxies within the Coma Cluster. Zwicky applied the so-called virial theorem – which establishes a relationship between the kinetic and potential energies of a system of particles – to estimate the cluster’s mass. In contrast to what would be expected from a structure of this scale – a velocity dispersion of around 80 km/s – the observed average velocity dispersion along the line of sight was approximately 1000 km/s. From this comparison, Zwicky concluded: “If this would be confirmed, we would get the surprising result that dark matter is present in much greater amount than luminous matter.”
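    Zwicky’s reasoning can be sketched with a few lines of arithmetic. The Python snippet below is an illustrative back-of-the-envelope version only: the 1 Mpc cluster radius and the prefactor in the rough virial relation M ≈ 5σ²R/G are assumptions of this sketch, not values from Zwicky’s paper.

```python
# Back-of-the-envelope virial mass, in the spirit of Zwicky's argument.
# M ~ 5 * sigma_los^2 * R / G is a common rough form of the virial
# relation; the prefactor and the 1 Mpc radius are assumptions here.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
MPC = 3.086e22     # megaparsec, m

def virial_mass(sigma_los_km_s, radius_mpc, prefactor=5.0):
    """Rough virial mass (kg) from a line-of-sight velocity dispersion."""
    sigma = sigma_los_km_s * 1e3          # km/s -> m/s
    return prefactor * sigma**2 * (radius_mpc * MPC) / G

# Observed ~1000 km/s dispersion versus the ~80 km/s expected from the
# luminous matter alone: since mass scales as sigma^2, the implied
# mass exceeds the luminous estimate by two orders of magnitude.
m_obs = virial_mass(1000.0, 1.0)
m_lum = virial_mass(80.0, 1.0)
print(f"implied cluster mass: {m_obs / M_SUN:.1e} solar masses")
print(f"observed/luminous mass ratio: {m_obs / m_lum:.0f}")
```

    Because mass enters as the square of the dispersion, the factor ~12 between the two velocities alone forces a mass discrepancy of order 100, which is the heart of Zwicky’s surprise.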

    In the 1950s and 1960s, most astronomers did not ask whether the universe had a significant abundance of invisible or missing mass. Although observations from this era would later be seen as evidence for dark matter, back then there was no consensus that the observations required much, or even any, such hidden material, and certainly there was not yet any sense of crisis in the field. It was in 1970 that the first explicit statements began to appear arguing that additional mass was needed in the outer parts of some galaxies, based on comparisons between predicted and measured rotation curves. The appendix of a seminal paper published by Ken Freeman in 1970, prompted by discussions with radio-astronomer Mort Roberts, concluded that: “If [the data] are correct, then there must be in these galaxies additional matter which is undetected, either optically or at 21 cm. Its mass must be at least as large as the mass of the detected galaxy, and its distribution must be quite different from the exponential distribution which holds for the optical galaxy.” (Figure 1 below.)
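    The rotation-curve argument can be made concrete with the circular-orbit relation M(r) = v²r/G: a flat curve (constant v) implies an enclosed mass that keeps growing linearly with radius, long after the visible light has faded. A minimal Python sketch, using illustrative round numbers rather than any real galaxy’s data:

```python
# Enclosed mass implied by a circular orbit of speed v at radius r:
# G M(r) / r^2 = v^2 / r  =>  M(r) = v^2 * r / G.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # kiloparsec, m

def enclosed_mass(v_km_s, r_kpc):
    """Mass (kg) enclosed within radius r for circular speed v."""
    v = v_km_s * 1e3                      # km/s -> m/s
    return v**2 * (r_kpc * KPC) / G

# A rotation curve that stays flat at ~200 km/s: doubling the radius
# doubles the implied enclosed mass, with no sign of convergence.
for r_kpc in (10, 20, 40):
    m = enclosed_mass(200.0, r_kpc)
    print(f"r = {r_kpc:2d} kpc -> M(<r) ~ {m / M_SUN:.1e} solar masses")
```

    A Keplerian fall-off (v ∝ r^(−1/2)) would instead correspond to a converged enclosed mass, which is what was expected before the flat curves were measured.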

    Several other lines of evidence began to appear that supported the same conclusion. In 1974, two influential papers (by Jaan Einasto, Ants Kaasik and Enn Saar, and by Jerry Ostriker, Jim Peebles and Amos Yahil) argued that a common solution existed for the mass discrepancies observed in clusters and in galaxies, and made the strong claim that the mass of galaxies had been until then underestimated by a factor of about 10.

    Kelvin, Rubin, Bosma

    By the end of the decade, opinion among many cosmologists and astronomers had crystallised: dark matter was indeed abundant in the universe. Although the same conclusion was reached by many groups of scientists from different subcultures and disciplines, individuals found different lines of evidence compelling during this period. Some astronomers were largely persuaded by new and more reliable measurements of rotation curves, such as those by Albert Bosma, Vera Rubin and others. Others were swayed by observations of galaxy clusters, arguments pertaining to the stability of disc galaxies, or even cosmological considerations. Despite disagreements regarding the strengths and weaknesses of these various observations and arguments, a consensus nonetheless began to emerge by the end of the 1970s in favour of dark matter’s existence.

    Enter the particle physicists

    From our contemporary perspective, it can be easy to imagine that scientists in the 1970s had in mind halos of weakly interacting particles when they thought about dark matter. In reality, they did not: most astronomers had much less exotic candidates in mind, in the form of comparatively low-luminosity versions of otherwise ordinary stars and gas. Over time, however, an increasing number of particle physicists became aware of and interested in the problem of dark matter. This transformation was driven not just by new scientific results, but also by sociological changes in science that had been taking place for some time.

    Half a century ago, cosmology was widely viewed as something of a fringe science, with little predictive power or testability. Particle physicists and astrophysicists did not often study or pursue research in each other’s fields, and it was not obvious what their respective communities might have to offer one another. More than any other problem in science, it was dark matter that brought particle physicists and astronomers together.

    As astrophysical alternatives were gradually ruled out one by one, the view that dark matter likely consists of one or more as-yet-undiscovered species of subatomic particle came to be held almost universally by particle physicists and astrophysicists alike.

    Perhaps unsurprisingly, the first widely studied particle dark-matter candidates were neutrinos. Unlike all other known particle species, neutrinos are stable and do not experience electromagnetic or strong interactions – essential characteristics for almost any viable dark-matter candidate. The earliest discussion of the role of neutrinos in cosmology appeared in a 1966 paper by the Soviet physicists Gershtein and Zeldovich, and several years later the topic began to appear in the West, beginning in 1972 with a paper by Ram Cowsik and J McClelland. Despite the very interesting and important results of these and other papers, it is notable that most of them did not address or even acknowledge the possibility that neutrinos could account for the missing mass that had been observed by astronomers on galactic and cluster scales. One exception was the 1977 paper by Lee and [Steven] Weinberg, whose final sentence reads: “Of course, if a stable heavy neutral lepton were discovered with a mass of order 1–15 GeV, the gravitational field of these heavy neutrinos would provide a plausible mechanism for closing the universe.”

    While this is still a long way from acknowledging the dynamical evidence for dark matter, it was an indication that physicists were beginning to realise that weakly interacting particles could be very abundant in our universe, and may have had an observable impact on its evolution. In 1980, the possibility that neutrinos might make up the dark matter received a considerable boost when a group studying tritium beta decay reported that they had measured the mass of the electron antineutrino to be approximately 30 eV – similar to the value needed for neutrinos to account for the majority of dark matter. Although this “discovery” was eventually refuted, it motivated many particle physicists to consider the cosmological implications of their research.
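    The arithmetic behind that excitement follows from the standard relic-abundance relation for light neutrinos, Ω_ν h² ≈ Σm_ν / 93 eV. A quick sketch (the value h = 0.7 and the 93.1 eV constant are the usual textbook figures, assumed here):

```python
# Cosmological density of relic Standard Model neutrinos, using the
# standard textbook relation Omega_nu * h^2 ~ sum(m_nu) / 93.1 eV.
# The Hubble parameter h = 0.7 is an assumed round value.

H = 0.7

def omega_nu(sum_masses_ev, h=H):
    """Fraction of the critical density carried by relic neutrinos."""
    return sum_masses_ev / 93.1 / h**2

# A single ~30 eV species, as claimed in 1980, would by itself supply
# an order-one fraction of the critical density of the universe.
print(f"30 eV neutrino: Omega ~ {omega_nu(30.0):.2f}")

# By contrast, a much smaller summed mass leaves neutrinos as only a
# small fraction of the matter budget.
print(f"0.1 eV total:   Omega ~ {omega_nu(0.1):.4f}")
```

    This is why a 30 eV measurement, had it held up, would have made neutrinos a cosmologically dominant component essentially for free.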


    Although we know today that dark matter in the form of Standard Model neutrinos would be unable to account for the observed large-scale structure of the universe, neutrinos provided an important template for the class of hypothetical species that would later be known as weakly interacting massive particles (WIMPs). Astrophysicists and particle physicists alike began to experiment with a variety of other, more viable, dark-matter candidates.

    Cold dark-matter paradigm

    The idea of neutrino dark matter was killed off in the mid-1980s with the arrival of numerical simulations. These could predict how large numbers of dark-matter particles would evolve under the force of gravity in an expanding universe, and therefore allow astronomers to assess the impact of dark matter on the formation of large-scale structure. In fact, by comparing the results of these simulations with those of galaxy surveys, it was soon realised that no relativistic particle could account for dark matter. Instead, the paradigm of cold dark matter – i.e. made of particles that were non-relativistic at the epoch of structure formation – was well on its way to becoming firmly established.
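    The simulations referred to here follow the gravitational clustering of many collisionless particles. A toy direct-summation leapfrog integrator conveys the idea; this is a pedagogical sketch only, with arbitrary units, a softening length and no cosmological expansion, unlike the tree and particle-mesh codes used in real structure-formation work:

```python
import numpy as np

# Toy direct-summation N-body integrator (kick-drift-kick leapfrog).
# Arbitrary units, a fixed softening and no cosmological expansion:
# a sketch of what structure-formation simulations do, not a
# substitute for production tree or particle-mesh solvers.

rng = np.random.default_rng(0)
N, G, SOFT, DT, STEPS = 64, 1.0, 0.05, 0.01, 100

pos = rng.standard_normal((N, 3))   # random initial positions
vel = np.zeros((N, 3))              # "cold" start: zero velocities
mass = np.full(N, 1.0 / N)          # equal masses, unit total mass

def accelerations(pos):
    """Softened pairwise gravitational accelerations, shape (N, 3)."""
    d = pos[None, :, :] - pos[:, None, :]     # d[i, j] = r_j - r_i
    r2 = (d ** 2).sum(axis=-1) + SOFT ** 2    # softened squared distances
    np.fill_diagonal(r2, np.inf)              # exclude self-interaction
    return G * (d * (mass[None, :, None] / r2[..., None] ** 1.5)).sum(axis=1)

acc = accelerations(pos)
for _ in range(STEPS):
    vel += 0.5 * DT * acc       # half kick
    pos += DT * vel             # drift
    acc = accelerations(pos)
    vel += 0.5 * DT * acc       # half kick

# A cold, self-gravitating cloud begins to collapse and clump.
print(f"mean distance from origin after {STEPS} steps: "
      f"{np.sqrt((pos ** 2).sum(axis=1)).mean():.2f}")
```

    Replacing the cold start with thermal (relativistic) initial velocities is, in essence, how such simulations showed that hot dark matter washes out small-scale structure.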

    Meanwhile, in 1982, Jim Peebles pointed out that the observed characteristics of the cosmic microwave background (CMB) also seemed to require the existence of dark matter.

    CMB map (ESA/Planck)

    If just baryons existed, then one could only explain the observed degree of large-scale structure if the universe had started in a fairly anisotropic or “clumpy” state. But by this time, the available data already set an upper limit on CMB anisotropies at a level of 10⁻⁴ – too meagre to account for the universe’s structure. Peebles argued that this problem would be relieved if the universe were instead dominated by massive weakly interacting particles whose density fluctuations began to grow prior to the decoupling of matter and radiation during which the CMB was born. This paper, among others, received enormous attention within the scientific community and helped establish cold dark matter as the leading paradigm to describe the structure and evolution of the universe at all scales.

    Solutions beyond the Standard Model

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Neutrinos might be the only known particles that are stable, electrically neutral and not strongly interacting, but the imagination of particle physicists did not remain confined to the Standard Model for long. Instead, papers started to appear that openly contemplated many speculative and yet undiscovered particles that might account for dark matter. In particular, particle physicists began to find new candidates for dark matter within the framework of a newly proposed space–time symmetry called supersymmetry.

    Standard model of Supersymmetry DESY

    The cosmological implications of supersymmetry were discussed as early as the late 1970s. In Piet Hut’s 1977 paper on the cosmological constraints on the masses of neutrinos, he wrote that the dark-matter argument was not limited to neutrinos or even to weakly interacting particles. The abstract of his paper mentions another possibility made within the context of the supersymmetric partner of the graviton, the spin-3/2 gravitino: “Similar, but much more severe, restrictions follow for particles that interact only gravitationally. This seems of importance with respect to supersymmetric theories,” wrote Hut.

    In their 1982 paper, Heinz Pagels and Joel Primack also considered the cosmological implications of gravitinos. But unlike Hut’s paper, or the other preceding papers that had discussed neutrinos as a cosmological relic, Pagels and Primack were keenly aware of the dark-matter problem and explicitly proposed that gravitinos could provide the solution by making up the missing mass. In many ways, their paper reads like a modern manuscript on supersymmetric dark matter, motivating supersymmetry by its various attractive features and then discussing both the missing mass in galaxies and the role that dark matter could play in the formation of large-scale structure. Around the same time, supersymmetry was being further developed into its more modern form, leading to the introduction of R-parity and constructions such as the minimal supersymmetric standard model (MSSM). Such supersymmetric models included not only the gravitino as a dark-matter candidate, but also neutralinos – electrically neutral mixtures of the superpartners of the photon, Z and Higgs bosons.

    Over the past 35 years, neutralinos have remained the single most studied candidate for dark matter and have been the subject of many thousands of scientific publications. Papers discussing the cosmological implications of stable neutralinos began to appear in 1983. In the first two of these, Weinberg and Haim Goldberg independently discussed the case of a photino (a neutralino whose composition is dominated by the superpartner of the photon) and derived a lower bound of 1.8 GeV on its mass by requiring that the density of such particles not overclose the universe. A few months later, a longer paper by John Ellis and colleagues considered a wider range of neutralinos as cosmological relics. In Goldberg’s paper there is no mention of the phrase “dark matter” or of any missing-mass problem, and Ellis et al. took a largely similar approach, requiring only that the cosmological abundance of neutralinos not be so large as to overly slow or reverse the universe’s expansion. Although most of the papers on stable cosmological relics written around this time did not yet fully embrace the need to solve the dark-matter problem, occasional sentences could be found that reflected the gradual emergence of a new perspective.

    The Bullet Cluster

    During the years that followed, an increasing number of particle physicists would further motivate proposals for physics beyond the Standard Model by showing that their theories could account for the universe’s dark matter. In 1983, for instance, John Preskill, Mark Wise and Frank Wilczek showed that the axion, originally proposed to solve the strong CP problem in quantum chromodynamics, could account for all of the dark matter in the universe. In 1993, Scott Dodelson and Lawrence Widrow proposed a scenario in which an additional, sterile neutrino species that did not experience electroweak interactions could be produced in the early universe and realistically make up the dark matter. Both the axion and the sterile neutrino are still considered as well-motivated dark-matter candidates, and are actively searched for with a variety of particle and astroparticle experiments.

    The triumph of particle dark matter

    In the early 1980s there was still nothing resembling a consensus about whether dark matter was made of particles at all, with other possibilities including planets, brown dwarfs, red dwarfs, white dwarfs, neutron stars and black holes. Kim Griest would later coin the term “MACHOs” – short for massive astrophysical compact halo objects – to denote this class of dark-matter candidates, in response to the alternative of WIMPs. There is a consensus today, based on searches using gravitational microlensing surveys and determinations of the cosmic baryon density based on measurements of the primordial light-element abundances and the CMB, that MACHOs do not constitute a large fraction of the dark matter.

    Gravitational microlensing, S. Liebes, Physical Review B, 133 (1964): 835

    An alternative to particle dark matter is to assume that there is no dark matter in the first place, and that instead our theory of gravity needs to be modified. This simple idea, put forward in 1982 by Mordehai Milgrom, is known as modified Newtonian dynamics (MOND) and has far-reaching consequences. At the heart of MOND is the suggestion that the force due to gravity does not obey Newton’s second law, F = ma. If instead gravity scaled as F = ma²/a₀ in the limit of very low accelerations (a ≪ a₀ ≈ 1.2 × 10⁻¹⁰ m/s²), then it would be possible to account for the observed motions of stars and gas within galaxies without postulating the presence of any dark matter.
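    One immediate consequence can be checked numerically: in the deep-MOND regime, equating a²/a₀ to the Newtonian field GM/r² with centripetal acceleration a = v²/r makes the radius cancel, giving a flat asymptotic rotation speed v⁴ = GMa₀. A short Python sketch (the Milky Way-like baryonic mass used below is an assumed, illustrative figure):

```python
# Deep-MOND flat rotation speed: with centripetal acceleration
# a = v^2 / r, the modified law a^2 / a0 = G M / r^2 makes the radius
# cancel, giving v^4 = G * M * a0, independent of r.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10       # MOND acceleration scale, m/s^2
M_SUN = 1.989e30   # solar mass, kg

def mond_flat_velocity(m_baryonic_msun):
    """Asymptotic (radius-independent) circular speed, in km/s."""
    return (G * m_baryonic_msun * M_SUN * A0) ** 0.25 / 1e3

# A Milky Way-like baryonic mass of ~6e10 solar masses (an assumed,
# illustrative figure) yields a galaxy-scale flat rotation speed.
print(f"v_flat ~ {mond_flat_velocity(6e10):.0f} km/s")
```

    The fact that a realistic rotation speed emerges from the baryonic mass alone, with no halo, is precisely the empirical success that kept MOND in the debate.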

    In 2006, a group of astronomers including Douglas Clowe transformed the debate between dark matter and MOND with the publication of an article entitled A direct empirical proof of the existence of dark matter. In this paper, the authors described observations of a pair of merging clusters collectively known as the Bullet Cluster (image above). As a result of the clusters’ recent collision, the distribution of stars and galaxies is spatially separated from the hot X-ray-emitting gas (which constitutes the majority of the baryonic mass in this system). A comparison of the weak-lensing and X-ray maps of the Bullet Cluster clearly reveals that the mass in this system does not trace the distribution of baryons. Another source of gravitational potential, such as that provided by dark matter, must instead dominate the mass of this system.

    Following these observations of the Bullet Cluster and similar systems, many researchers expected that this would effectively bring the MOND hypothesis to an end. This did not happen, although the Bullet Cluster and other increasingly precise cosmological measurements on the scale of galaxy clusters, as well as the observed properties of the CMB, have been difficult to reconcile with all proposed versions of MOND. It is currently unclear whether other theories of modified gravity, in some yet-unknown form, might be compatible with these observations. Until we have a conclusive detection of dark-matter particles, however, the possibility that dark matter is a manifestation of a new theory of gravity remains open.

    Today, the idea that most of the mass in the universe is made up of cold, non-baryonic particles is not only the leading paradigm but is largely accepted among astrophysicists and particle physicists alike. Although dark matter’s particle nature continues to elude us, a rich and active experimental programme is striving to detect and characterise its non-gravitational interactions, ultimately allowing us to learn the identity of this mysterious substance. It has been more than a century since the first pioneering attempts to measure the amount of dark matter in the universe. Perhaps it will not be too many more years before we come to understand what that matter is.

    See the full article here.

  • richardmitnick 4:55 pm on February 16, 2017 Permalink | Reply
    Tags: CERN Courier, Dark Matter Physics

    From CERN Courier: “Funding injection for SNOLAB” 

    CERN Courier

    DEAP-3600 detector at SNOLAB

    The SNOLAB laboratory in Ontario, Canada, has received a grant of $28.6m to help secure its next three years of operations. The facility is one of 17 research facilities to receive support through Canada’s Major Science Initiative (MSI) fund, which exists to secure state-of-the-art national research facilities.

    SNOLAB, which is located in a mine 2 km beneath the surface, specialises in neutrino and dark-matter physics and claims to be the deepest cleanroom facility in the world. Current experiments located there include: PICO and DEAP-3600, which search for dark matter using bubble-chamber and liquid-argon technology, respectively; EXO, which aims to measure the mass and nature of the neutrino; HALO, designed to detect supernovae; and a new neutrino experiment SNO+ based on the existing SNO detector.

    EXO (U. MD)

    The new funds will be used to employ the 96-strong SNOLAB staff and support the operations and maintenance of the lab’s facilities.

    See the full article here.
