Tagged: CERN

  • richardmitnick 9:51 am on December 19, 2014 Permalink | Reply
Tags: CERN

    From CERN: “Long Shutdown 1: Exciting times ahead” 

    Posted by Cian O’Luanaigh on 8 Feb 2013.
    Updated 11 Dec 2014
    Caroline Duc

The Large Hadron Collider (LHC) has provided physicists with a huge quantity of data to analyse since the first physics run in 2009. Now it’s time for the machine, along with CERN’s other accelerators, to get a facelift. “Long Shutdown 1” (LS1) will begin on 14 February 2013, but this doesn’t mean that life at CERN will be any less rich and exciting. Although there will be no collisions for a period of almost two years, the whole CERN site will be a hive of activity, with large-scale work under way to modernize the infrastructure and prepare the LHC for operation at higher energy.

Over 10,000 high-current splices between LHC magnets will be opened and consolidated during the first Long Shutdown of the LHC. This image shows their installation in 2007 (Image: CERN)

“A whole series of renovation work will be carried out around the LHC during LS1,” says Simon Baird, deputy head of the Engineering department. “The key driver is of course the consolidation of the 10,170 high-current splices between the superconducting magnets. The teams will start by opening up the 1,695 interconnections between the cryostats of the main magnets. They will repair and consolidate around 500 interconnections simultaneously. The maintenance work will gradually cover the entire 27-kilometre circumference of the LHC.” The LHC will be upgraded as well as renovated during the period concerned. In the framework of the Radiation to Electronics project (R2E), the protection of sensitive electronic equipment will be optimized by relocating the equipment or by adding shielding.

The work will by no means be confined to the LHC. Major renovation work is scheduled, for example, for the Proton Synchrotron (PS) and the Super Proton Synchrotron (SPS). During LS1 the upgrade of the PS access control system, which includes the installation of 25 new biometrically controlled access points, will continue. The whole tunnel ventilation system will also be dismantled and replaced, with 25 air-handling units representing a combined flow rate of 576,000 cubic metres per hour to be installed around the accelerator’s 628-metre circumference. Meanwhile, at the SPS, about 100 kilometres of radiation-damaged cables used in the instrumentation and control systems will be removed or replaced.

    CERN will take advantage of LS1 to improve the installations connected with the experiments, accelerators, electronics, and so on, with a view to a spectacular resumption of its main activities after the shutdown. While the shutdown work is in progress, life at the laboratory will be anything but boring. Stay tuned to keep abreast of all the developments.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

ATLAS
ALICE
CMS
LHCb

LHC

Quantum Diaries

     
  • richardmitnick 10:28 pm on December 3, 2014 Permalink | Reply
Tags: CERN

    From isgtw: “Volunteer computing: 10 years of supporting CERN through LHC@home” 


    international science grid this week

    December 3, 2014
    Andrew Purcell

    LHC@home recently celebrated a decade since its launch in 2004. Through its SixTrack project, the LHC@home platform harnesses the power of volunteer computing to model the progress of sub-atomic particles traveling at nearly the speed of light around the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland. It typically simulates about 60 particles whizzing around the collider’s 27km-long ring for ten seconds, or up to one million loops. Results from SixTrack were used to help the engineers and physicists at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t cause damage by flying off course into the walls of the vacuum tube. It’s now also being used to carry out simulations relevant to the design of the next phase of the LHC, known as the High-Luminosity LHC.
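
To get a feel for what this kind of tracking involves, here is a deliberately minimal sketch (my own toy with an assumed betatron tune, nothing like the real SixTrack physics): it pushes 60 particles through a linear one-turn map of a ring, turn after turn, and checks whether any end up outside a fixed aperture.

```python
# Toy single-particle tracking: a linear one-turn map for one transverse plane.
# Real SixTrack models the magnets element by element; this is only an illustration.
import math

Q_X = 0.31               # fractional betatron tune (assumed, illustrative value)
MU = 2 * math.pi * Q_X   # betatron phase advance per turn

def one_turn(x: float, px: float) -> tuple:
    """Rotate (x, px) in normalised phase space by one turn's phase advance."""
    c, s = math.cos(MU), math.sin(MU)
    return c * x + s * px, -s * x + c * px

# 60 particles spread across initial amplitudes up to 1 mm.
particles = [(1e-3 * (i + 1) / 60, 0.0) for i in range(60)]
APERTURE = 2e-3  # metres; a particle beyond this would be "lost" on the aperture

for turn in range(10_000):
    particles = [one_turn(x, px) for x, px in particles]

lost = sum(1 for x, _ in particles if abs(x) > APERTURE)
print(f"particles outside the aperture after 10,000 turns: {lost}")  # 0: a linear map is stable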

“The results of SixTrack played an essential role in the design of the LHC, and the high-luminosity upgrades will naturally require additional development work on SixTrack,” explains Frank Schmidt, who works in CERN’s Accelerators and Beam Physics Group of the Beams Department and is the main author of the SixTrack code. “In addition to its use in the design stage, SixTrack is also a key tool for the interpretation of data taken during the first run of the LHC,” adds Massimo Giovannozzi, who also works in CERN’s Accelerators and Beam Physics Group. “We use it to improve our understanding of particle dynamics, which will help us to push the LHC performance even further over the coming years of operation.” He continues: “Managing a project like SixTrack within LHC@home requires resources and competencies that are not easy to find: Igor Zacharov, a senior scientist at the Particle Accelerator Physics Laboratory (LPAP) of the Swiss Federal Institute of Technology in Lausanne (EPFL), provides valuable support for SixTrack by helping with BOINC integration.”

Volunteer computing is a type of distributed computing through which members of the public donate computing resources (usually processing power) to aid research projects. Image courtesy Eduardo Diez Viñuela, Flickr (CC BY-SA 2.0).

Before LHC@home was created, SixTrack was run only on desktop computers at CERN, using a platform called the Compact Physics Screen Saver (CPSS). This proved a useful proof of concept, but it was only with the launch of the LHC@home platform in 2004 that things really took off. “I am surprised and delighted by the support from our volunteers,” says Eric McIntosh, who formerly worked in CERN’s IT Department and is now an honorary member of the Beams Department. “We now have over 100,000 users all over the world and many more hosts. Every contribution is welcome, however small, as our strength lies in numbers.”

    Virtualization to the rescue

    Building on the success of SixTrack, the Virtual LHC@home project (formerly known as Test4Theory) was launched in 2011. It enables users to run simulations of high-energy particle physics using their home computers, with the results submitted to a database used as a common resource by both experimental and theoretical scientists working on the LHC.

Whereas the code for SixTrack was ported for running on Windows, OS X, and Linux, the high-energy-physics code used by each of the LHC experiments is far too large to port in a similar way. It is also constantly being updated. “The experiments at CERN have their own libraries and they all run on Linux, while the majority of people out there have common-or-garden variety Windows machines,” explains Ben Segal, honorary staff member of CERN’s IT department and chief technology officer of the Citizen Cyberscience Centre. “Virtualization is the way to solve this problem.”

    The birth of the LHC@home platform

In 2004, Ben Segal and François Grey, who were both members of CERN’s IT department at the time, were asked to plan an outreach event for CERN’s 50th anniversary that would help people around the world to get an impression of the computational challenges facing the LHC. “I had been an early volunteer for SETI@home after it was launched in 1999,” explains Grey. “Volunteer computing was often used as an illustration of what distributed computing means when discussing grid technology. It seemed to me that it ought to be feasible to do something similar for LHC computing and perhaps even combine volunteer computing and grid computing this way.”

“I contacted David Anderson, the person behind SETI@home, and it turned out the timing was good, as he was working on an open-source platform called BOINC to enable many projects to use the SETI@home approach,” Grey continues. BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley in the US to manage the SETI@home project, and uses otherwise idle CPU and GPU cycles on a computer to support scientific research.

    “I vividly remember the day we phoned up David Anderson in Berkeley to see if we could make a SETI-like computing challenge for CERN,” adds Segal. “We needed a CERN application that ran on Windows, as over 90% of BOINC volunteers used that. The SixTrack people had ported their code to Windows and had already built a small CERN-only desktop grid to run it on, as they needed lots of CPU power. So we went with that.”

    A runaway success

    “I was worried that no one would find the LHC as interesting as SETI. Bear in mind that this was well before the whole LHC craziness started with the Angels and Demons movie, and news about possible mini black holes destroying the planet making headlines,” says Grey. “We made a soft launch, without any official announcements, in 2004. To our astonishment, the SETI@home community immediately jumped in, having heard about LHC@home by word of mouth. We had over 1,000 participants in 24 hours, and over 7,000 by the end of the week — our server’s maximum capacity.” He adds: “We’d planned to run the volunteer computing challenge for just three months, at the time of the 50th anniversary. But the accelerator physicists were hooked and insisted the project should go on.”

    Predrag Buncic, who is now coordinator of the offline group within the ALICE experiment, led work to create the CERN Virtual Machine in 2008. He, Artem Harutyunyan (former architect and lead developer of CernVM Co-Pilot), and Segal subsequently adopted this virtualization technology for use within Virtual LHC@home. This has made it significantly easier for the experiments at CERN to create their own volunteer computing applications, since it is no longer necessary for them to port their code. The long-term vision for Virtual LHC@home is to support volunteer-computing applications for each of the large LHC experiments.

Growth of the platform

    The ATLAS experiment recently launched a project that simulates the creation and decay of supersymmetric bosons and fermions. “ATLAS@Home offers the chance for the wider public to participate in the massive computation required by the ATLAS experiment and to contribute to the greater understanding of our universe,” says David Cameron, a researcher at the University of Oslo in Norway. “ATLAS also gains a significant computing resource at a time when even more resources will be required for the analysis of data from the second run of the LHC.”

    Meanwhile, the LHCb experiment has been running a limited test prototype for over a year now, with an application running Beauty physics simulations set to be launched for the Virtual LHC@home project in the near future. The CMS and ALICE experiments also have plans to launch similar applications.

    An army of volunteers

    “LHC@home allows CERN to get additional computing resources for simulations that cannot easily be accommodated on regular batch or grid resources,” explains Nils Høimyr, the member of the CERN IT department responsible for running the platform. “Thanks to LHC@home, thousands of CPU years of accelerator beam dynamics simulations for LHC upgrade studies have been done with SixTrack, and billions of events have been simulated with Virtual LHC@home.” He continues: “Furthermore, the LHC@home platform has been an outreach channel, giving publicity to LHC and high-energy physics among the general public.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 9:56 pm on November 27, 2014 Permalink | Reply
Tags: CERN

    From CERN: “How bright is the LHC?” 

    Nov 27, 2014

    The LHCb Collaboration has published the results of a luminosity calibration with a precision of 1.12%. This is the most precise luminosity measurement achieved so far at a bunched-beam hadron collider.

LHC beam results

The absolute luminosity at a particle collider is not only an important figure of merit for the machine, it is also a necessity for determining the absolute cross-sections for reaction processes. Specifically, the number of interactions, N, measured in an experiment depends on the value of the cross-section σ and the luminosity L, N = σL, so the precision obtained in measuring a given cross-section depends critically on the precision with which the luminosity is known. The luminosity itself depends on the number of particles in each collider beam and on the size of the overlap of the beams at the collision point. At the LHC, dedicated instruments measure the beam currents, and hence the number of particles in each colliding beam, while the experiments measure the size of the overlap of the beams at the collision point.
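
To make that relation concrete, here is a small worked example (my own numbers; L here stands for the integrated luminosity, so the units cancel to a pure count):

```python
# N = sigma * L, with sigma in nanobarns and integrated luminosity L in inverse femtobarns.
# Unit bookkeeping: 1 nb = 1e6 fb, so 1 nb x 1 fb^-1 = 1e6 interactions.
NB_TO_FB = 1.0e6

def expected_events(sigma_nb: float, lumi_fb_inv: float) -> float:
    """Expected number of interactions N = sigma * L (integrated)."""
    return sigma_nb * NB_TO_FB * lumi_fb_inv

print(expected_events(1.0, 1.0))  # 1e6 events from a 1 nb process in 1 fb^-1 of data
# Since sigma = N / L, a 1.12% uncertainty on L feeds a 1.12% uncertainty into sigma.
```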

A standard method to determine the overlap of the beams is the van der Meer scan, invented in 1968 by Simon van der Meer to measure luminosity in CERN’s Intersecting Storage Rings, the world’s first hadron collider. This technique, which involves scanning the beams across each other and monitoring the interaction rate, has been used by all four of the large LHC experiments. However, LHCb physicists proposed an alternative method in 2005 – the beam-gas imaging (BGI) method – which they successfully applied for the first time in 2009. This takes advantage of the excellent precision of LHCb’s Vertex Locator, a detector that is placed around the proton–proton collision point. The BGI method is based on reconstructing the vertices of “beam-gas” interactions, i.e. interactions between beam particles and residual gas nuclei in the beam pipe, to measure the angles, positions and shapes of the individual beams without displacing them.
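
As a rough sketch of the idea behind the van der Meer method (my own toy with assumed Gaussian beams and textbook machine parameters, not LHCb's analysis): scanning one beam across the other maps the interaction rate versus separation, and the fitted overlap widths give the luminosity.

```python
# Toy van der Meer scan. For Gaussian beams the rate vs separation d follows
# rate(d) ~ exp(-d^2 / (2 Sigma^2)), and L = f * n_b * N1 * N2 / (2*pi*Sx*Sy).
import numpy as np

f_rev = 11245.0        # LHC revolution frequency [Hz]
n_b = 2808             # colliding bunch pairs (design value)
N1 = N2 = 1.15e11      # protons per bunch (design value)

def overlap_width(separation, rate):
    """Toy Sigma estimate: the RMS of the rate curve, exact for a Gaussian."""
    return float(np.sqrt(np.sum(rate * separation**2) / np.sum(rate)))

sep = np.linspace(-1.2e-4, 1.2e-4, 61)        # beam separation [m]
SIGMA_TRUE = 2.4e-5                            # convolved beam width [m], assumed
rate = np.exp(-sep**2 / (2 * SIGMA_TRUE**2))   # simulated scan profile

Sx = Sy = overlap_width(sep, rate)
L = f_rev * n_b * N1 * N2 / (2 * np.pi * Sx * Sy)
print(f"Sigma = {Sx:.2e} m, L = {L:.2e} m^-2 s^-1")  # ~1e38 m^-2 s^-1, i.e. ~1e34 cm^-2 s^-1
```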

    To date, LHCb is the only experiment capable of using the BGI method. The technique involves calibrating the luminosity during special measurement periods at the LHC, and then tracking relative changes through changes in the counting rate in different sub-detectors. However, the vacuum pressure in the LHC is so low that for the technique to work with high precision, the beam–gas collision rate was increased by injecting neon gas into the LHC beam pipe during the luminosity calibration periods. This allowed the LHCb physicists to obtain precise images of the shapes of the individual beams, as illustrated in the left and middle graphs of the figure, which unraveled subtle but important features of the distributions of beam particles. By combining the beam–gas data with the measured distribution of beam–beam interactions, which provides the shape of the luminous region (the right graph in the figure), an accurate calibration of the luminosity was achieved.

    The beam–gas data also revealed that a small fraction of the beam’s charge is spread outside of the expected (i.e. “nominal”) bunch locations. Because only collisions of protons located in the nominal bunches are included in physics measurements, it was important to measure which fraction of the total beam current measured with the LHC’s current monitors participated in the collisions, i.e. contributed to the luminosity. Only LHCb could measure this fraction with sufficient precision, so the results of LHCb’s measurements of the fraction of charge outside the nominal bunch locations – the so-called “ghost” charge – were also used by the ALICE, ATLAS and CMS experiments.

For proton–proton interactions at 8 TeV, a relative precision of the luminosity calibration of 1.47% was obtained using van der Meer scans and 1.43% using beam–gas imaging, resulting in a combined precision of 1.12%. The BGI method has proved to be so successful that it will now be used to measure beam sizes as part of monitoring and studying the LHC beams. Dedicated equipment will be installed in a modified region of the LHC ring near Point 4. This system, dubbed the Beam-Gas Vertexing system (BGV), is being developed by a collaboration from CERN, EPFL and RWTH Aachen. It includes a gas-injection system and a scintillating-fibre tracker telescope, which are expected to be commissioned with beam in 2015.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:27 pm on November 20, 2014 Permalink | Reply
Tags: CERN

    From CERN: “CERN makes public first data of LHC experiments” 

    20 Nov 2014
    Cian O’Luanaigh

CERN today launched its Open Data Portal, where data from real collision events produced by experiments at the Large Hadron Collider (LHC) will for the first time be made openly available to all. It is expected that these data will be of high value for the research community, and also be used for education purposes.

    “Launching the CERN Open Data Portal is an important step for our Organization. Data from the LHC programme are among the most precious assets of the LHC experiments, that today we start sharing openly with the world. We hope these open data will support and inspire the global research community, including students and citizen scientists,” says CERN Director-General Rolf Heuer.

    The principle of openness is enshrined in CERN’s founding Convention, and all LHC publications have been published Open Access, free for all to read and re-use. Widening the scope, the LHC collaborations recently approved Open Data policies and will release collision data over the coming years.

    The first high-level and analysable collision data openly released come from the CMS experiment and were originally collected in 2010 during the first LHC run. This data set is now publicly available on the CERN Open Data Portal. Open source software to read and analyse the data is also available, together with the corresponding documentation. The CMS collaboration is committed to releasing its data three years after collection, after they have been thoroughly studied by the collaboration.

    “This is all new and we are curious to see how the data will be re-used,” says CMS data preservation coordinator Kati Lassila-Perini. “We’ve prepared tools and examples of different levels of complexity from simplified analysis to ready-to-use online applications. We hope these examples will stimulate the creativity of external users.”

    In parallel, the CERN Open Data Portal gives access to additional event data sets from the ALICE, ATLAS, CMS and LHCb collaborations, which have been specifically prepared for educational purposes, such as the international masterclasses in particle physics benefiting over ten thousand high-school students every year. These resources are accompanied by visualisation tools.

“Our own data policy foresees data preservation and its sharing. We have seen in the past that students are fascinated by being able to analyse LHC data, and so we are very happy to take the first steps and make available some selected data for education,” says Silvia Amerio, data preservation coordinator of the LHCb experiment.

    “The development of this Open Data Portal represents a first milestone in our mission to serve our users in preserving and sharing their research materials. It will ensure that the data and tools can be accessed and used, now and in the future,” says Tim Smith of the CERN IT Department.

    All data on OpenData.cern.ch are shared under a Creative Commons CC0 public domain dedication; data and software are assigned unique DOI identifiers to make them citable in scientific articles; and software is released under open source licenses. The CERN Open Data Portal is built on the open-source Invenio Digital Library software, which powers other CERN Open Science tools and initiatives.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:40 pm on November 4, 2014 Permalink | Reply
Tags: CERN

    From Symmetry: “Fabiola Gianotti chosen as next head of CERN” This is great news!! 

    Symmetry

    November 04, 2014
    Kathryn Jepsen

    The former head of the ATLAS experiment at the LHC will be the first female leader of Europe’s largest particle physics laboratory.

    Today the CERN Council announced the selection of Italian physicist Fabiola Gianotti as the organization’s next director-general.

    Gianotti was leader of the ATLAS experiment at the Large Hadron Collider from March 2009 to February 2013, covering the period in which the ATLAS and CMS experiments announced the long-awaited discovery of the Higgs boson, recognised by the award of the Nobel Prize to François Englert and Peter Higgs in 2013. She will be the first woman to hold the position of CERN director-general.

    “We were extremely impressed with all three candidates put forward by the search committee,” says CERN Council President Agnieszka Zalewska. “It was Dr Gianotti’s vision for CERN’s future as a world-leading accelerator laboratory, coupled with her in-depth knowledge of both CERN and the field of experimental particle physics that led us to this outcome.”

    The appointment will be formalised at the December session of Council. Gianotti’s mandate will begin on January 1, 2016, and will run for a period of five years.

    “It is a great honor and responsibility for me to be selected as the next CERN director-general following 15 outstanding predecessors,” Gianotti says. “CERN is a center of scientific excellence and a source of pride and inspiration for physicists from all over the world. CERN is also a cradle for technology and innovation, a fount of knowledge and education and a shining, concrete example of worldwide scientific cooperation and peace.

    “It is the combination of these four assets that renders CERN so unique, a place that makes better scientists and better people. I will fully engage myself to maintain CERN’s excellence in all its attributes, with the help of everybody, including CERN Council, staff and users from all over the world.”

Gianotti received her PhD in experimental particle physics from the University of Milan in 1989. Since 1994 she has been a research physicist in the Physics Department of CERN. She has worked on several CERN experiments, being involved in detector R&D and construction, software development and data analysis. She is the author or co-author of more than 500 publications in peer-reviewed scientific journals.

    Since August 2013 she has been an honorary professor at the University of Edinburgh. She received honorary doctoral degrees from the University of Uppsala, the Ecole Polytechnique Federale de Lausanne, McGill University and Oslo University.

    She was included among the “Top 100 most inspirational women” by The Guardian newspaper in the UK in 2011, chosen as a runner-up for Time magazine’s 2012 “Person of the Year,” included among the “Top 100 most powerful women” by Forbes magazine in 2013 and considered among the “Leading global thinkers of 2013” by Foreign Policy magazine.

    She is a member of the Italian Academy of Sciences and has served on several other international committees. She was recently selected to be a member of the Scientific Advisory Board of the UN Secretary-General, Ban Ki-moon.

    “Fabiola Gianotti is an excellent choice to be my successor,” says current CERN Director General Rolf Heuer. “It has been a pleasure to work with her for many years. I look forward to continuing to work with her through the transition year of 2015 and am confident that CERN will be in very good hands.”

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:58 am on October 30, 2014 Permalink | Reply
Tags: CERN

    From LC Newsline: “The future of Higgs physics” 

Linear Collider Collaboration

    30 October 2014
    Joykrit Mitra

    In 2012, the ATLAS and CMS experiments at CERN’s Large Hadron Collider announced the discovery of the Higgs boson. The Higgs was expected to be the final piece of the particular jigsaw that is the Standard Model of particle physics, and its discovery was a monumental event.

Event recorded with the CMS detector in 2012 at a proton-proton centre-of-mass energy of 8 TeV. The event shows characteristics expected from the decay of the SM Higgs boson to a pair of photons (dashed yellow lines and green towers). Image: L. Taylor, CMS collaboration/CERN

The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But more precise studies of it are needed than the LHC is able to provide. That is why, years earlier, a machine like the International Linear Collider had been envisioned as a Higgs factory, and the Higgs discovery set the stage for its possible construction.

ILC schematic

    Over the years, instruments for probing the universe have become more sophisticated. More refined data has hinted that aspects of the Standard Model are incomplete. If built, a machine such as the ILC will help reveal how wide a gulf there is between the universe and our understanding of it by probing the Higgs to unprecedented levels. And perhaps, as some physicists think, it will uproot the Standard Model and make way for an entirely new physics.

    In the textbook version, the Higgs boson is a single particle, and its alleged progenitor, the mysterious Higgs field that pervades every point in the universe, is a single field. But this theory is still to be tested.

    “We don’t know whether the Higgs field is one field or many fields,” said Michael Peskin of SLAC’s Theoretical Physics Group. “We’re just now scratching the surface at the LHC.”

    The LHC collides proton beams together, and the collision environment is not a clean one. Protons are made up of quarks and gluons, and in an LHC collision it’s really these many component parts – not the larger proton – that interact. During a collision, there are simply too many components in the mix to determine the initial energies of each one. Without knowing them, it’s not possible to precisely calculate properties of the particles generated from the collision. Furthermore, Higgs events at the LHC are exceptionally rare, and there is so much background that the amount of data that scientists have to sift through to glean information on the Higgs is astronomical.

    “There are many ways to produce an event that looks like the Higgs at the LHC,” Peskin said. “Lots of other things happen that look exactly like what you’re trying to find.”

The ILC, on the other hand, would collide electrons and positrons, which are themselves fundamental particles with no component parts. Scientists would know their precise initial energy states, and there would be significantly fewer distractions from the measurement standpoint. The ILC is designed to accelerate particle beams up to energies of 250 billion electronvolts, extendable eventually to 500 billion electronvolts. The higher the particles’ energies, the greater the number of Higgs events. It’s the best possible scenario for probing the Higgs.

    If the ILC is built, physicists will first want to test whether the Higgs particle discovered at the LHC indeed has the properties predicted by the Standard Model. To do this, they plan to study Higgs couplings with known subatomic particles. The higher a particle’s mass, the proportionally stronger its coupling ought to be with the Higgs boson. The ILC will be sensitive enough to detect and accurately measure Higgs couplings with light particles, for instance with charm quarks. Such a coupling can be detected at the LHC in principle but is very difficult to measure accurately.
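
As a rough sketch of that proportionality (my restatement of the textbook Standard Model relation, not something from the article): a fermion's Higgs coupling grows linearly with its mass,

```latex
y_f = \sqrt{2}\,\frac{m_f}{v}, \qquad v \approx 246~\mathrm{GeV}
\qquad\Longrightarrow\qquad
\frac{y_c}{y_b} \approx \frac{m_c}{m_b} \approx \frac{1.27~\mathrm{GeV}}{4.18~\mathrm{GeV}} \approx 0.3
```

Decay rates scale with the square of the coupling, so charm decays of the Higgs are roughly an order of magnitude rarer than bottom decays, which is part of why they are so hard to pick out at the LHC.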

    The ILC can also help measure the exact lifetime of the Higgs boson. The more particles the Higgs couples to, the faster it decays and disappears. A difference between the measured lifetime and the projected lifetime—calculated from the Standard Model—could reveal what fraction of possible particles—or the Higgs’ interactions with them— we’ve actually discovered.
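
A back-of-envelope version of that connection (my arithmetic, using the Standard Model's predicted total width of roughly 4 MeV for a 125 GeV Higgs):

```latex
\tau_H = \frac{\hbar}{\Gamma_H}
\approx \frac{6.58\times 10^{-25}~\mathrm{GeV\,s}}{4.1\times 10^{-3}~\mathrm{GeV}}
\approx 1.6\times 10^{-22}~\mathrm{s}
```

Any decay channel that has escaped detection would add to the total width and shorten this lifetime, which is what makes the measured-versus-predicted comparison sensitive to new physics.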

    “Maybe the Higgs interacts with something new that is very hard to detect at a hadron collider, for example if it cannot be observed directly, like neutrinos,” speculated John Campbell of Fermilab’s Theoretical Physics Department.

These investigations could yield some surprises. Unexpected deviations in the measurements could point to yet-undiscovered particles, which in turn would indicate that the Standard Model is incomplete. The Standard Model also makes predictions for the coupling between two Higgs bosons, and physicists hope to study this as well to check whether there are indeed multiple kinds of Higgs particles.

    “It could be that the Higgs boson is only a part of the story, and it has explained what’s happened at colliders so far,” Campbell said. “The self-coupling of the Higgs is there in the Standard Model to make it self-consistent. If not the Higgs, then some other thing has to play that role that self-couplings play in the model. Other explanations could also provide dark matter candidates, but it’s all speculation at this point.”

3D plot showing how dark matter distribution in our universe has grown clumpier over time. (Image: NASA, ESA, R. Massey from California Institute of Technology)

The Standard Model has been very self-consistent so far, but some physicists think it isn’t entirely valid. It ignores the universe’s accelerating expansion caused by dark energy, as well as the mysterious dark matter that still allows matter to clump together and galaxies to form. There is speculation about the existence of undiscovered mediator particles that might be exchanged between dark matter and the Higgs field. The Higgs particle could be a likely gateway to this unknown physics.

    With the LHC set to be operational again next year, an optimistic possibility is that a new particle or two might be dredged out from trillions of collision events in the near future. If built, the ILC would be able to build on such discoveries, just as in case of the Higgs boson, and provide a platform for more precise investigation.

    The collaboration between a hadron collider like the LHC and an electron-positron collider of the scale of the ILC could uncover new territories to be explored and help map them with precision, making particle physics that much richer.

    See the full article here.

The Linear Collider Collaboration is an organisation that brings the two most likely candidates, the Compact Linear Collider Study (CLIC) and the International Linear Collider (ILC), together under one roof. Headed by former LHC Project Manager Lyn Evans, it strives to coordinate the research and development work that is being done for accelerators and detectors around the world and to take the linear collider project to the next step: a decision that it will be built, and where.

    Some 2000 scientists – particle physicists, accelerator physicists, engineers – are involved in the ILC or in CLIC, and often in both projects. They work on state-of-the-art detector technologies, new acceleration techniques, the civil engineering aspect of building a straight tunnel of at least 30 kilometres in length, a reliable cost estimate and many more aspects that projects of this scale require. The Linear Collider Collaboration ensures that synergies between the two friendly competitors are used to the maximum.

     
  • richardmitnick 2:33 pm on October 13, 2014 Permalink | Reply
Tags: CERN

    From CERN via FNAL: “CERN and the rise of the Standard Model” 

    Curiosity is as old as humankind, and it is CERN’s raison d’être. When the Laboratory was founded, the structure of matter was a mystery. Today, we know that all visible matter in the Universe is composed of a remarkably small number of particles, whose behaviour is governed by four distinct forces. CERN has played a vital role in reaching this understanding.

    Watch, enjoy, learn.

     
  • richardmitnick 2:42 pm on August 23, 2014 Permalink | Reply
Tags: CERN, U-70 Synchrotron

    From ExtremeTech via Fermilab: “What happens if you get hit by the main beam of a particle accelerator like the LHC?” 


    Fermilab is an enduring source of strength for the US contribution to scientific research world wide.

    July 28, 2014
    Sebastian Anthony

    I don’t know about you, but ever since I started covering the Large Hadron Collider and other large-scale particle accelerators for ExtremeTech, I’ve always morbidly wondered: What would happen if a scientist was accidentally hit by the main particle beam? Would the scientist explode in the style of beam weapons in Star Trek? Would the beam bore a hole clean through the scientist’s chest? Or maybe the beam would do nothing at all and pass through the scientist harmlessly? Well, fortunately (unfortunately?) we don’t have to guess, as this exact scenario actually happened to Anatoli Bugorski, a Russian scientist, way back in 1978.

    Back in the 1970s, Anatoli Bugorski was a researcher at the Soviet Union’s Institute for High Energy Physics. The Institute housed the U-70, a synchrotron that when it was built was the most powerful particle accelerator in the world (it’s still the most powerful accelerator in Russia today). The U-70 smashes two beams of protons together at a combined energy of around 76 GeV, at a speed that gets very close to the speed of light.

U-70

Anatoli Bugorski

    On July 13, 1978, Bugorski was checking a malfunction on the U-70… and then somehow his head ended up in the path of the main proton beam. The beam entered his skull on the back left, and came out near the left side of his nose. Sources seem to disagree on how much ionizing radiation Bugorski actually took to the head, but some say it was as high as 2,000-3,000 grays (200,000-300,000 rads). In any case, the beam would’ve been more than strong enough to burn a hole through the bone, skin, and brain tissue.

    At the time, Bugorski reported seeing a flash that was “brighter than a thousand suns,” but otherwise didn’t feel any pain. Over the next few days, the left side of his head swelled up “beyond recognition,” and then his skin started peeling off. Bugorski was moved to Moscow, where doctors avidly observed his expected demise — but, curiously enough, he survived. The left side of his face is paralyzed (due to nerve damage), his left ear is shot (all he can hear is an “unpleasant internal noise”), and he occasionally suffers from seizures, but otherwise Bugorski was relatively unscathed by the accident. He went on to complete his PhD — and he’s still alive today.

Inside the Russian U-70 synchrotron building, in 2006

U-70 synchrotron diagram

Anatoli Bugorski today.

    You can see that the left side of his face droops a bit from the paralysis, and that it’s wrinkle-free because he hasn’t been able to move it for 26 years — similar to how Botox works, in actual fact.

    Slightly anticlimactic, eh? Well, if it’s any consolation, Bugorski probably got incredibly lucky that the proton beam (apparently) missed any vital parts of his brain. If it had hit the hippocampus, motor cortex, or the frontal lobe, this story wouldn’t have had a very happy ending. Likewise, it’s probably lucky that the beam hit his brain — which has the remarkable ability to rewire itself when such disasters occur — rather than some other vital organ. If the beam had sliced through his heart, or an artery in his neck, he probably would’ve died instantly.

    It’s also important to note that the beam from a particle accelerator is very narrow (the more focused the beam is, the higher the chance of collisions with protons in the other beam). As you can see in the black and white photo above, only a small patch of hair is missing from Bugorski’s scalp, suggesting the beam only fried quite a narrow channel of brain tissue. In much the same way that you could pass a very thin hypodermic needle through someone without causing too much damage, a particle beam probably isn’t going to carve a comically large cylinder through the victim’s chest.

XKCD’s radiation dose chart. A gray (Gy) is the physical quantity of radiation delivered (one joule absorbed per kilogram); a sievert (Sv) measures the biologically effective dose. Bugorski was hit by a large number of grays, but seemingly didn’t absorb much of it.

A dose of between 2,000 and 3,000 grays, if it were effectively absorbed by the human body (i.e. a comparable number of sieverts), would usually be more than enough to cause acute radiation sickness and death. In this case, though, the beam was so focused that it just passed straight through his body; if it had been more scattered, and fried a wider smattering of cells, Bugorski would certainly have died.
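
For a sense of scale (my own back-of-envelope figures, not from the article): a gray is one joule of absorbed radiation energy per kilogram, so even a huge dose confined to a narrow track corresponds to a modest amount of total energy. Assuming the beam traversed roughly 10 grams of tissue:

```latex
E = D \times m \approx 2000~\mathrm{J/kg} \times 0.01~\mathrm{kg} = 20~\mathrm{J}
```

Twenty joules is devastating to the few grams of cells in the beam's path, but nothing like the whole-body insult that the same figure in sieverts would imply.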

The LHC’s CMS detector. If the main beam was turned on, would the hard-hatted engineer be blown to smithereens?

Finally, though, it’s worth noting that the Russian U-70 is a very weak particle accelerator by today’s standards. When the Large Hadron Collider comes back online in 2015, it’ll have a proton-proton collision energy of around 14 TeV — about 200 times the U-70’s 76 GeV. Despite its high energy, though, we’re still only talking about a beam of protons that’s a few millimeters wide — and of course there are all sorts of safety measures that would prevent a CERN scientist from ever being hit by the LHC’s main beam. If those safety mechanisms failed, and the superconducting magnets that keep the beam focused and on target were on the fritz, then maybe you’d end up with a proton beam that moved around enough to slice a scientist into pieces. It’s a long shot, though.

    See the full article here.

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

     
  • richardmitnick 12:13 pm on August 19, 2014 Permalink | Reply
Tags: CERN

    From CERN Courier: “First direct high-precision measurement of the proton’s magnetic moment sets the stage for BASE” 

    CERN Courier

    Jul 23, 2014
    No Writer Credit

    A German/Japanese collaboration working at the University of Mainz has performed the first direct high-precision measurement of the magnetic moment of the proton – which is by far the most accurate to date.

The quark structure of the proton. (The color assignment of individual quarks is not important, only that all three colors are present.)

    The result is consistent with the currently accepted value of the Committee on Data for Science and Technology (CODATA), but is 2.5 times more precise and 760 times more accurate than any previous direct measurement. The techniques used will feature in the Baryon-Antibaryon Symmetry Experiment (BASE) – recently approved to run at CERN’s Antiproton Decelerator (AD) – which aims at the direct high-precision measurement of the magnetic moments of the proton and the antiproton with fractional precisions at the parts-per-billion (ppb) level, or better.

The quark structure of the antiproton.

CERN Antiproton Decelerator

    Prior to this work, the record for the most precise measurement of the proton’s magnetic moment had stood for more than 40 years. In 1972, a group at Massachusetts Institute of Technology measured its value indirectly by performing ground-state hyperfine spectroscopy with a hydrogen maser in a magnetic field. This experiment measured the ratio of the magnetic moments of the proton and the electron. The results, combined with theoretical corrections and two additional independent measurements, enabled the calculation of the proton magnetic moment with a precision of about 10 parts in a billion.

Hydrogen maser. (Courtesy NASA/JPL-Caltech)

    In an attempt to surpass the record, the collaboration of scientists from Mainz University, the Max Planck Institute for Nuclear Physics in Heidelberg, GSI Darmstadt and the Japanese RIKEN institute applied the so-called double Penning trap technique to a single proton for the first time (see figure 1).

Fig. 1.

One Penning trap – called the analysis trap – is used for the non-destructive detection of the spin state, through the continuous Stern-Gerlach effect. In this elegant approach, a strong magnetic inhomogeneity is superimposed on the trap, thereby coupling the particle’s spin magnetic moment to its axial oscillation frequency in the trap. By measuring the axial frequency, the spin quantum state of the trapped particle can be determined. And by recording the quantum-jump rate as a function of a spin-flip drive frequency, the spin precession frequency νL is obtained. Together with a measurement of the cyclotron frequency νc of the trapped particle, the magnetic moment of the proton μp is finally obtained in units of the nuclear magneton, μp/μN = νL/νc.
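
Written out explicitly (my restatement of the relations in the paragraph above), for a proton of charge q and mass m_p in a magnetic field B:

```latex
\nu_c = \frac{qB}{2\pi m_p}, \qquad
\nu_L = \frac{g_p}{2}\,\frac{qB}{2\pi m_p}
\qquad\Longrightarrow\qquad
\frac{\mu_p}{\mu_N} = \frac{g_p}{2} = \frac{\nu_L}{\nu_c}
```

The magnetic field cancels in the ratio, which is why the measurement can reach parts-per-billion precision without knowing B to anything like that accuracy.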

Fig. 2.

This approach has already been applied with great success in measurements of the magnetic moments of the electron and the positron. However, the magnetic moment of the proton is about 660 times smaller than that of the electron, so the proton measurement requires an apparatus that is orders of magnitude more sensitive. To detect the proton’s spin state, the collaboration used an extremely strong magnetic inhomogeneity of 300,000 T/m². However, this limits the experimental precision in the frequency measurements to the parts-per-million (ppm) level. Therefore a second trap – the precision trap – was added about 45 mm away from the strong magnetic-field inhomogeneity. In this trap the magnetic field is about 75,000 times more homogeneous than in the analysis trap.

    To determine the magnetic moment of the proton, the first step was to identify the spin state of the single particle in the analysis trap. Afterwards the particle was transported to the precision trap, where the cyclotron frequency was measured and a spin flip induced. Subsequently the particle was transported back to the analysis trap and the spin state was analysed again. By repeating this procedure several hundred times, the magnetic moment was measured in the homogeneous magnetic field of the precision trap. The result, extracted from the normalized resonance curve (figure 2), is the value μp = 2.792847350(9)μN, with a relative precision of 3.3 ppb.

In the BASE experiment at the AD, the technique will be applied directly to a single trapped antiproton, potentially improving the precision of the currently accepted value of the antiproton’s magnetic moment by at least a factor of 1,000. This will constitute a stringent test with baryons of CPT symmetry – the most fundamental symmetry underlying the quantum field theories of the Standard Model of particle physics. CPT invariance implies the exact equality of the properties of matter–antimatter conjugates, and any measured difference could contribute to understanding the striking imbalance of matter and antimatter observed on cosmological scales.

    See the full article here.

     
  • richardmitnick 12:14 pm on August 15, 2014 Permalink | Reply
Tags: CERN

    From BBC: “Higgs boson spills secrets as LHC prepared for return” 

    BBC

    30 June 2014
    Paul Rincon

    It’s nearly time. After shutting down last year for vital repairs and upgrades, the Large Hadron Collider is being prepared for its comeback.

Map of the LHC at CERN

LHC tube in its tunnel

Engineers at Cern in Geneva have begun cooling the huge machine to its operating temperature of -271.3°C, which is colder than deep space.

    And the accelerator system that supplies the LHC with its proton particle beams – which are smashed together to recreate the conditions just after the Big Bang – is up and running for the first time since 2012.

    Teams are working to get the LHC – located in a circular tunnel beneath the French-Swiss border – back online by January 2015 and this time it will operate at its full energy of 14 trillion electron volts.

    After the $10bn machine was switched on for the first time in 2008, problems were found with many of the electrical splices between the 1,200 superconducting magnets that bend particle beams around the 27km-long underground ring.

    To prevent serious damage, officials decided to run the collider at an energy of seven to eight trillion electron volts – about half what it was designed for.

    “Much work has been carried out on the LHC over the last 18 months or so, and it’s effectively a new machine, poised to set us on the path to new discoveries,” said Cern’s director-general Rolf Heuer at the EuroScience Open Forum in Copenhagen this month.

    The Higgs is a sub-atomic particle that was detected at the Large Hadron Collider in 2012
    It was proposed as a mechanism to explain mass by six physicists, including Peter Higgs, in 1964
    It imparts mass to other fundamental particles via the associated Higgs field
    It is the cornerstone of the Standard Model, which explains how particles interact

The low-energy run from 2010 to 2012 was nevertheless sufficient to achieve a key scientific goal: detecting the elusive Higgs boson particle.

    The Higgs is the cornerstone of our current best theory of particle physics – the Standard Model. This is the “instruction booklet” that describes how elementary particles (the smallest building blocks of the Universe) and forces interact.

The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    On 4 July 2012, Cern announced that a five-decade-long search for the particle, first proposed by Edinburgh-based physicist Peter Higgs and others in the 1960s, had reached its conclusion.

    Scientists working on Atlas and CMS, the two huge multi-purpose detectors placed at strategic points around the LHC tunnel, saw the Higgs at a 5-sigma level of significance – the statistical threshold for announcing a discovery.
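
For readers curious what “5 sigma” buys you (my sketch of the standard one-sided Gaussian convention, not anything specific to Atlas or CMS), the threshold corresponds to a false-alarm probability of roughly 3 in 10 million:

```python
# One-sided tail probability of a standard normal: the chance that a background
# fluctuation alone produces an excess of at least n_sigma.
from math import erf, sqrt

def p_value(n_sigma: float) -> float:
    return 0.5 * (1.0 - erf(n_sigma / sqrt(2.0)))

print(f"{p_value(5.0):.2e}")  # ~2.9e-07, about a one-in-3.5-million chance
```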

    Particle physicists have learnt more about the Higgs boson’s behaviour and how well it conforms to predictions. In a paper published in the journal Nature Physics, researchers outlined how they have watched the Higgs decay into the particles that make up matter (known as fermions), in addition to those that convey force (bosons), which had already been observed.

    This is exactly as the Standard Model predicts. Physicists know that this framework, devised in the 1970s, must be a stepping stone to a deeper understanding of the cosmos. But so far, it’s standing up exceptionally well. Searches at the LHC for deviations from this elegant scheme – such as evidence for new, exotic particles – have come to nothing.

Physicists packed out the auditorium at Cern to hear the Higgs boson discovery announcement in 2012

François Englert (L), Peter Higgs (R) and other originators of the Higgs boson theory were at Cern to hear the announcement. Englert and Higgs would later win a Nobel Prize for their work

The announcement was a huge media event too

At ICHEP, the International Conference on High Energy Physics, other scientists are expected to outline details of a refined mass for the fundamental particle, which has been measured at approximately 125 gigaelectronvolts (GeV). For those outside the particle physics community, this might seem like a minor detail. But the mass of the Higgs is more than a mere number.

    There’s something very curious about its value that could have profound implications for the Universe. Mathematical models allow for the possibility that our cosmos is long-lived yet not entirely stable, and may – at some indeterminate point – be destroyed.

    “The overall stability of the Universe depends on the Higgs mass – which is a bit funny,” said Prof Jordan Nash, a particle physicist from Imperial College London, who works on the CMS experiment at Cern.

    “There’s a long theoretical argument which I won’t go into, but that value is intriguing in that it sits on the edge between what we think is the long-term stability of the Universe and a Universe that has a finite lifetime.”

    To use an analogy, imagine the Higgs boson is an object resting at the bottom of a curved slope. If that resting place really is the lowest point on the slope, then the vacuum of space is completely stable – in other words, it is in the lowest energy state and can go no further.

The mass of the Higgs (inside rectangle) may hint at the stability of the Universe

    However, if at some point further along this slope, there’s another dip, the potential exists for the Universe to “topple” into this lower energy state, or minimum. If that happens, the vacuum of space collapses, dooming the cosmos.

“The Higgs mass is in that place where it gets interesting, where it’s no longer guaranteed that there are no other minima,” Prof Nash told the BBC. But there’s no need to worry: the models suggest such a rare event would not occur for a very, very long time – many times further into the future, in fact, than the current age of the Universe.

    This idea of a finite lifetime for the cosmos is dependent on the Standard Model being the ultimate scheme in physics. But there is much in the Universe – gravitation and dark matter, for example – that the Standard Model can’t fully explain, so there are reasons to think that’s not the case.

    The existence of exotic particles, such as those predicted by the theory known as supersymmetry, would shore up the stability of the Universe in those mathematical models.

    But as previously mentioned, searches for these particles, called superpartners, have so far drawn a blank, as have attempts to detect dark matter, extra dimensions, and other phenomena beyond the Standard Model. Hopes that the LHC would allow scientists to lift the veil on a whole new realm of physics have proved optimistic, at least during its initial run.

Electrical connections between the superconducting magnets have been re-soldered

Engineers have been working to prepare the machine for a planned re-start at the beginning of 2015

    Some versions of supersymmetry have already been all but ruled out by the LHC. But the theory has many forms, depending on how you tweak the mathematical parameters.

    “From the theory community’s point of view, this is all very interesting because it fleshes out much better what the first run of the LHC has excluded,” said Prof Dave Charlton, who leads the Atlas experiment at Cern.

    “Therefore, it better establishes where we should be looking for new signals next year.”

    Assuming the theorists are indeed correct, supersymmetry will have to wait some time longer for its big reveal.

Other hypothesised particles, such as the W prime and Z prime bosons, could possibly be detected soon after the LHC returns to particle smashing.

    For now, all eyes are on the engineers at Cern. The LHC’s initial switch on was marked by mishaps, including a magnet that buckled in the tunnel during a test in 2007. The following year, another magnet failure caused a tonne of helium to leak out, forcing controllers to shut the machine down just nine days after its big switch-on.

    But after the re-start in 2009, the LHC performed flawlessly, and the rest, as they say, is history.

    If all goes well, by the end of March 2015 scientists could begin colliding high-energy beams of particles at the LHC.

    And that’s when the real fun will begin.

    See the full article here.

     