Tagged: LHC@home

  • richardmitnick 10:28 pm on December 3, 2014 Permalink | Reply
    Tags: LHC@home

    From isgtw: “Volunteer computing: 10 years of supporting CERN through LHC@home” 

    international science grid this week

    December 3, 2014
    Andrew Purcell

    LHC@home recently celebrated a decade since its launch in 2004. Through its SixTrack project, the LHC@home platform harnesses the power of volunteer computing to model the progress of sub-atomic particles traveling at nearly the speed of light around the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland. It typically simulates about 60 particles whizzing around the collider’s 27km-long ring for ten seconds, or up to one million loops. Results from SixTrack were used to help the engineers and physicists at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t cause damage by flying off course into the walls of the vacuum tube. It’s now also being used to carry out simulations relevant to the design of the next phase of the LHC, known as the High-Luminosity LHC.
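    SixTrack itself is a large, carefully validated tracking code, but the basic idea of turn-by-turn tracking with an aperture check can be sketched in a few lines. The toy model below uses a Hénon-style map (a phase-space rotation plus a nonlinear kick) rather than the real LHC lattice; the tune, kick strength, and aperture values here are invented purely for illustration.

    ```python
    import math

    def track(particles, turns, tune=0.31, kick=1.0, aperture=1.0):
        """Toy turn-by-turn tracker: each turn applies a nonlinear kick and a
        phase-space rotation, then checks whether the particle hit the wall."""
        angle = 2 * math.pi * tune
        c, s = math.cos(angle), math.sin(angle)
        survivors = 0
        for x, xp in particles:
            alive = True
            for _ in range(turns):
                xp += kick * x * x                       # sextupole-like kick
                x, xp = c * x + s * xp, -s * x + c * xp  # one-turn rotation
                if abs(x) > aperture:                    # lost on the vacuum tube
                    alive = False
                    break
            if alive:
                survivors += 1
        return survivors

    # 60 particles with growing initial amplitudes, tracked for 10,000 turns
    bunch = [(0.01 * i, 0.0) for i in range(60)]
    print(track(bunch, turns=10_000), "of", len(bunch), "particles survive")
    ```

    Particles below some amplitude (the "dynamic aperture") circulate indefinitely, while larger-amplitude ones are driven out by the nonlinearity; mapping that boundary over millions of turns is what makes the real problem so computationally expensive.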

    [Images: CERN LHC map; the LHC tunnel; particle collisions; the LHC at CERN]

    “The results of SixTrack played an essential role in the design of the LHC, and the high-luminosity upgrades will naturally require additional development work on SixTrack,” explains Frank Schmidt, who works in CERN’s Accelerators and Beam Physics Group of the Beams Department and is the main author of the SixTrack code. “In addition to its use in the design stage, SixTrack is also a key tool for the interpretation of data taken during the first run of the LHC,” adds Massimo Giovannozzi, who also works in CERN’s Accelerators and Beam Physics Group. “We use it to improve our understanding of particle dynamics, which will help us to push the LHC performance even further over the coming years of operation.” He continues: “Managing a project like SixTrack within LHC@home requires resources and competencies that are not easy to find: Igor Zacharov, a senior scientist at the Particle Accelerator Physics Laboratory (LPAP) of the Swiss Federal Institute of Technology in Lausanne (EPFL), provides valuable support for SixTrack by helping with BOINC integration.”

    Volunteer computing is a type of distributed computing through which members of the public donate computing resources (usually processing power) to aid research projects. Image courtesy Eduardo Diez Viñuela, Flickr (CC BY-SA 2.0).

    Before LHC@home was created, SixTrack was run only on desktop computers at CERN, using a platform called the Compact Physics Screen Saver (CPSS). This proved to be a useful tool for a proof of concept, but it was only with the launch of the LHC@home platform in 2004 that things really took off. “I am surprised and delighted by the support from our volunteers,” says Eric McIntosh, who formerly worked in CERN’s IT Department and is now an honorary member of the Beams Department. “We now have over 100,000 users all over the world and many more hosts. Every contribution is welcome, however small, as our strength lies in numbers.”

    Virtualization to the rescue

    Building on the success of SixTrack, the Virtual LHC@home project (formerly known as Test4Theory) was launched in 2011. It enables users to run simulations of high-energy particle physics using their home computers, with the results submitted to a database used as a common resource by both experimental and theoretical scientists working on the LHC.

    Whereas the code for SixTrack was ported for running on Windows, OS X, and Linux, the high-energy-physics code used by each of the LHC experiments is far too large to port in a similar way. It is also being constantly updated. “The experiments at CERN have their own libraries and they all run on Linux, while the majority of people out there have common-or-garden variety Windows machines,” explains Ben Segal, honorary staff member of CERN’s IT department and chief technology officer of the Citizen Cyberscience Centre. “Virtualization is the way to solve this problem.”

    The birth of the LHC@home platform

    In 2004, Ben Segal and François Grey, who were both members of CERN’s IT department at the time, were asked to plan an outreach event for CERN’s 50th anniversary that would help people around the world to get an impression of the computational challenges facing the LHC. “I had been an early volunteer for SETI@home after it was launched in 1999,” explains Grey. “Volunteer computing was often used as an illustration of what distributed computing means when discussing grid technology. It seemed to me that it ought to be feasible to do something similar for LHC computing and perhaps even combine volunteer computing and grid computing this way.”

    “I contacted David Anderson, the person behind SETI@Home, and it turned out the timing was good, as he was working on an open-source platform called BOINC to enable many projects to use the SETI@home approach,” Grey continues. BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley in the US to manage the SETI@Home project, and uses the unused CPU and GPU cycles on a computer to support scientific research.
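    The mechanics of a BOINC-style project can be sketched as a simple loop: a server hands out work units, volunteers compute them, and results are accepted only after validation by replication. The snippet below is a deliberately simplified toy model in plain Python, not the actual BOINC protocol or API; all function names are invented.

    ```python
    def make_work_units(n):
        """Split a computation into independent work units."""
        return [{"id": i, "data": i} for i in range(n)]

    def volunteer_compute(unit):
        """Stand-in for the science code a volunteer's machine runs."""
        return unit["data"] ** 2

    def run_project(units, replication=2):
        """Send each unit to several volunteers; accept a result only
        when the replicas agree (a quorum-style validation)."""
        accepted = {}
        for unit in units:
            results = [volunteer_compute(unit) for _ in range(replication)]
            if len(set(results)) == 1:  # replicas agree -> accept the result
                accepted[unit["id"]] = results[0]
        return accepted

    print(run_project(make_work_units(5)))
    # {0: 0, 1: 1, 2: 4, 3: 9, 4: 16}
    ```

    In the real system, this redundancy also guards against faulty or malicious hosts: credit is granted only for results that pass validation.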

    “I vividly remember the day we phoned up David Anderson in Berkeley to see if we could make a SETI-like computing challenge for CERN,” adds Segal. “We needed a CERN application that ran on Windows, as over 90% of BOINC volunteers used that. The SixTrack people had ported their code to Windows and had already built a small CERN-only desktop grid to run it on, as they needed lots of CPU power. So we went with that.”

    A runaway success

    “I was worried that no one would find the LHC as interesting as SETI. Bear in mind that this was well before the whole LHC craziness started with the Angels and Demons movie, and news about possible mini black holes destroying the planet making headlines,” says Grey. “We made a soft launch, without any official announcements, in 2004. To our astonishment, the SETI@home community immediately jumped in, having heard about LHC@home by word of mouth. We had over 1,000 participants in 24 hours, and over 7,000 by the end of the week — our server’s maximum capacity.” He adds: “We’d planned to run the volunteer computing challenge for just three months, at the time of the 50th anniversary. But the accelerator physicists were hooked and insisted the project should go on.”

    Predrag Buncic, who is now coordinator of the offline group within the ALICE experiment, led work to create the CERN Virtual Machine in 2008. He, Artem Harutyunyan (former architect and lead developer of CernVM Co-Pilot), and Segal subsequently adopted this virtualization technology for use within Virtual LHC@home. This has made it significantly easier for the experiments at CERN to create their own volunteer computing applications, since it is no longer necessary for them to port their code. The long-term vision for Virtual LHC@home is to support volunteer-computing applications for each of the large LHC experiments.

    Growth of the platform

    The ATLAS experiment recently launched a project that simulates the creation and decay of supersymmetric bosons and fermions. “ATLAS@Home offers the chance for the wider public to participate in the massive computation required by the ATLAS experiment and to contribute to the greater understanding of our universe,” says David Cameron, a researcher at the University of Oslo in Norway. “ATLAS also gains a significant computing resource at a time when even more resources will be required for the analysis of data from the second run of the LHC.”



    Meanwhile, the LHCb experiment has been running a limited test prototype for over a year now, with an application running Beauty physics simulations set to be launched for the Virtual LHC@home project in the near future. The CMS and ALICE experiments also have plans to launch similar applications.

    [Images: the LHCb and CMS detectors]


    An army of volunteers

    “LHC@home allows CERN to get additional computing resources for simulations that cannot easily be accommodated on regular batch or grid resources,” explains Nils Høimyr, the member of the CERN IT department responsible for running the platform. “Thanks to LHC@home, thousands of CPU years of accelerator beam dynamics simulations for LHC upgrade studies have been done with SixTrack, and billions of events have been simulated with Virtual LHC@home.” He continues: “Furthermore, the LHC@home platform has been an outreach channel, giving publicity to LHC and high-energy physics among the general public.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

  • richardmitnick 10:57 am on July 14, 2014 Permalink | Reply
    Tags: LHC@home

    From Test4Theory: ” Server migration completed” 

    LHC@home 2.0


    The migration of Test4Theory to the new vLHC@home server is now complete. In case of problems, we advise detaching from and re-attaching to the project.

    Thanks for your understanding and for your contributions!

    See the full article here.


    ScienceSprings is powered by MAINGEAR computers

  • richardmitnick 1:58 pm on October 15, 2012 Permalink | Reply
    Tags: LHC@home

    SymmetryMag Catches Up with LHC@home 

    Symmetry is a joint Fermilab/SLAC publication.

    October 15, 2012
    Signe Brewster

    A summer of (physics) code
    This summer, seven young coding whizzes contributed to CERN projects through the Google Summer of Code program.


    Anyone in the world with a computer can contribute to research at CERN. Through the LHC@Home project, volunteers can offer up spare computing power to simulate and process collisions happening inside the Large Hadron Collider.

    [Image: The Grand Tube]

    CERN recently improved the program with a new feature that helps scientists monitor the system that distributes work among volunteers’ computers. But the new feature is not the work of a CERN employee; it is the work of a college undergraduate who had the chance to work with CERN through the 2012 Google Summer of Code.

    ‘It’s a really nice concept,’ says Josip Lisec, a third-year student at the University of Zagreb in Croatia who worked on the improvement, which is known as Co-Pilot Monitor. ‘You’re pretty sure that the code you wrote during the summer will be put to use. And you learn a lot working with people on large projects.’

    ‘The things that I appreciated most about the students were their open-mindedness as well as willingness to learn,’ says Artem Harutyunyan, a former CERN fellow who mentored Lisec. ‘They proved to be talented developers who managed to get up to speed fairly quickly and started making contributions to our codebase right away.'”

    View the full article http://www.symmetrymagazine.org/article/october-2012/a-summer-of-physics-code

    BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing, developed at UC Berkeley.

    Visit the BOINC web page, click on Choose projects, and check out some of the very worthwhile studies you will find. Then click on Download and run BOINC software/All Versions. Download and install the current software for your 32-bit or 64-bit system, for Windows, Mac, or Linux. When you install BOINC, it will install its screen savers on your system as a default. You can choose to run the various project screen savers or you can turn them off. Once BOINC is installed, in BOINC Manager/Tools, click on “Add project or account manager” to attach to projects. Many BOINC projects are listed there, but not all, and maybe not the one(s) in which you are interested. You can get the proper URL for attaching at each project’s web page. BOINC will never interfere with any other work on your computer.


    SETI@home The search for extraterrestrial intelligence. “SETI (Search for Extraterrestrial Intelligence) is a scientific area whose goal is to detect intelligent life outside Earth. One approach, known as radio SETI, uses radio telescopes to listen for narrow-bandwidth radio signals from space. Such signals are not known to occur naturally, so a detection would provide evidence of extraterrestrial technology.

    Radio telescope signals consist primarily of noise (from celestial sources and the receiver’s electronics) and man-made signals such as TV stations, radar, and satellites. Modern radio SETI projects analyze the data digitally. More computing power enables searches to cover greater frequency ranges with more sensitivity. Radio SETI, therefore, has an insatiable appetite for computing power.

    Previous radio SETI projects have used special-purpose supercomputers, located at the telescope, to do the bulk of the data analysis. In 1995, David Gedye proposed doing radio SETI using a virtual supercomputer composed of large numbers of Internet-connected computers, and he organized the SETI@home project to explore this idea. SETI@home was originally launched in May 1999.”

    SETI@home is the birthplace of BOINC software. Originally, it only ran in a screensaver when the computer on which it was installed was doing no other work. With the power and memory available today, BOINC can run 24/7 without in any way interfering with other ongoing work.

    The famous SETI@home screen saver, a beauteous thing to behold.

    einstein@home The search for pulsars. “Einstein@Home uses your computer’s idle time to search for weak astrophysical signals from spinning neutron stars (also called pulsars) using data from the LIGO gravitational-wave detectors, the Arecibo radio telescope, and the Fermi gamma-ray satellite. Einstein@Home volunteers have already discovered more than a dozen new neutron stars, and we hope to find many more in the future. Our long-term goal is to make the first direct detections of gravitational-wave emission from spinning neutron stars. Gravitational waves were predicted by Albert Einstein almost a century ago, but have never been directly detected. Such observations would open up a new window on the universe, and usher in a new era in astronomy.”

    MilkyWay@Home “Milkyway@Home uses the BOINC platform to harness volunteered computing resources, creating a highly accurate three-dimensional model of the Milky Way galaxy using data gathered by the Sloan Digital Sky Survey. This project enables research in both astroinformatics and computer science.”

    Leiden Classical “Join in and help to build a Desktop Computer Grid dedicated to general Classical Dynamics for any scientist or science student!”

    World Community Grid (WCG) World Community Grid is a special case at BOINC. WCG is part of the social initiative of IBM Corporation and the Smarter Planet. WCG has under its umbrella currently eleven disparate projects at globally wide ranging institutions and universities. Most projects relate to biological and medical subject matter. There are also projects for Clean Water and Clean Renewable Energy. WCG projects are treated respectively and respectably on their own at this blog. Watch for news.

    Rosetta@home “Rosetta@home needs your help to determine the 3-dimensional shapes of proteins in research that may ultimately lead to finding cures for some major human diseases. By running the Rosetta program on your computer while you don’t need it you will help us speed up and extend our research in ways we couldn’t possibly attempt without your help. You will also be helping our efforts at designing new proteins to fight diseases such as HIV, Malaria, Cancer, and Alzheimer’s….”

    GPUGrid.net “GPUGRID.net is a distributed computing infrastructure devoted to biomedical research. Thanks to the contribution of volunteers, GPUGRID scientists can perform molecular simulations to understand the function of proteins in health and disease.” GPUGrid is a special case in that all processor work done by the volunteers is GPU processing; there is no CPU processing, which is the more common kind. Other projects (Einstein, SETI, Milky Way) also feature GPU processing, but they offer CPU processing for those not able to do work on GPUs.


    These projects are just the oldest and most prominent projects. There are many others from which you can choose.

    There are currently some 300,000 users with about 480,000 computers working on BOINC projects. That is in a world of over one billion computers. We sure could use your help.




  • richardmitnick 2:55 pm on October 13, 2011 Permalink | Reply
    Tags: LHC@home

    Citizen Scientists, Unite!! Around two LHC@home Projects 

    Join your colleagues at the LHC in two Public Distributed Computing projects working for the Large Hadron Collider at CERN.

    LHC@home/Sixtrack works on magnet tuning so that there is less lost effort in the beam line.


    LHC@home 2.0 simulates collision events


    So, what do you do? First, if you are not already “crunching” for other worthwhile scientific projects, you go to the BOINC UC Berkeley Space Science Lab web site for a wee bit of software, which you install on your computer(s). Then, you visit the above two web sites, and attach to the two projects. For LHC@home 2.0 you might need to register and await an invitation. This project is very new and is in “beta”. LHC@home/Sixtrack is a mature project, so no waiting. At each project web site, you can find explanations of what is happening, the science. You can find forums in which you can participate.

    Historically, projects running on BOINC software used the idle CPU cycles of your computer. In fact, the granddaddy of all projects, SETI@home, actually ran in a screen saver. Today, with greater power, protected memory, and “sandboxed” technology, projects run all the time and in no way interfere with whatever else you are doing on your computer.

    We need your help. There are over 1 billion computers in the world. BOINC counts all of its current users at about 295,000. The LHC is the largest and most complex basic scientific research experiment ever mounted by Modern Man. It ranks with the pyramids, penicillin, and Starbucks.

    SETI@home, the largest and oldest project, based at the birthplace of BOINC, currently has 147,000 crunchers on 222,000 computers. To paraphrase Seth Shostak of the SETI Institute (no relation to SETI@home), they haven’t found pond scum. Not even dead pond scum. But they are processing at 423 teraFLOPS. That’s nearly half a petaFLOPS, which would put them at about #15 in the TOP500 list of world supercomputers, except, of course, they are distributed.
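    That conversion is easy to check (1 petaFLOPS = 1,000 teraFLOPS, and note that FLOPS already means floating-point operations per second):

    ```python
    tflops = 423            # SETI@home throughput quoted above
    pflops = tflops / 1000  # 1 petaFLOPS = 1000 teraFLOPS
    print(pflops)           # 0.423, i.e. a bit under half a petaFLOPS
    ```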

    We deserve to be that big. We need to scale up and vie with our friends at SETI@home (I crunch for SETI@home on four of my six computers).

    So, please, visit the sites. Take a look. Tell your colleagues. Be part of something great.

    [If you are at any school, university, institution of any sort, never ever install BOINC software on their computers without written permission of an authority.]

    Meet CERN in a variety of places:

    CERN Courier

  • richardmitnick 2:34 pm on July 28, 2011 Permalink | Reply
    Tags: LHC@home

    From isgtw: “Virtual atom smasher in LHC@Home 2.0” 

    Wonderful article about a new project being birthed from the ground up.

    Here is a beginning to entice the scientifically inclined:

    Jacqui Hayes
    July 27, 2011

    “When I learned that if successful, LHC@home 2.0 would allow BOINC users to participate in physics experiments being done at CERN – and ultimately perhaps even the search for the Higgs Boson itself – I jumped at the chance to be a part of it,” said Tony De Bari.


    De Bari, who is from New Jersey, USA, joined the project in February 2011, after finding out about it from forum posts online and emailing the developers to ask for an invitation code.

    De Bari now has three computers, at work and at home, contributing time to the LHC@Home 2.0 project. “Even though I do not have a formal physics background, I am a self-proclaimed ‘science geek’ with a particular interest in physics. While pursuing my computer science degree, I took as many elective physics courses as time would permit, and I try to read as much about the subject as I can.”

    The developers hope LHC@Home 2.0 will be used for several research projects in the future. The first project, called Test4Theory, has been in the alpha testing phase since October 2010, and now has more than 100 volunteers just like De Bari. Even at this early stage, volunteers have already provided about 10% of the total computing resources currently available to theoretical physicists at CERN, according to Anton Karneyeu, one of the developers on the project who also works on the CMS experiment.”

    My personal story is close to De Bari’s, even to my location in New Jersey. I have six machines on the project. I have been on the original LHC@home project for quite some time. I asked for the invitation and was allowed to participate.

    What is BOINC?

    BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley to manage the SETI@Home project, and uses the unused CPU and GPU cycles on a computer for scientific computing.

    Please see the full article here. There is a lot to read, and much to stimulate the curious toward this incredible project.

  • richardmitnick 6:16 pm on April 25, 2011 Permalink | Reply
    Tags: LHC@home

    BOINC Project LHC@home is back up but for Linux only??? 


    So, I watched a video put up by Dave Anderson from Asia@home. It was a talk by one Daniel Lombrano Gonzalez about the BOINC project LHC@home. Apparently it has been re-started, but only for Linux! I could not be more unhappy. This is my favorite project.

    I found Mr. (Dr.?) Gonzalez’ email address and sent him a couple of polite notes. In one of them, I included a link to the stats for LHC@home at BOINCStats:

    Look at the figures: historically, over 88,000 people and 229,000 machines, mostly Windows. Now, only 2,120 people on 2,383 machines.

    I can only hope that they will look at the figures. Maybe once they have digested getting up on Linux they will see the wisdom of getting the project up for Windows machines. And Macs. Hey, I am a liberal.

    Here is the irony for me. I got interested in CERN and the LHC in 1985 when I saw the Timothy Ferris video The Creation of the Universe on PBS. I have been fascinated by what goes on there, and also at Fermilab and the Tevatron, also a big part of the video. More recently, I got copies of two more videos: The Atom Smashers, centered mostly on Fermilab; and The Big Bang Machine, centered mostly on CERN. Each video did include material on the other center.

    A few years ago, I started “crunching” for a Cancer project at Oxford University. I knew of no other centers for this kind of distributed computing. Then, a friend told me about BOINC, BOINC projects, and WCG, World Community Grid. I have been an obsessed cruncher ever since. I am now running six machines, mostly 64 bit Win7 hyper threaded quads, 24/7.

    But, LHC@home, as little work as I got, as little work as anyone got, this was my favorite project.

    More recently, I started this blog to raise the visibility of U.S. basic research and the U.S. contribution to scientific research all around the world. Someone at CERN told me that 30% of the people at CERN are from the U.S. They put me on to the “Graybook,” which lists all of the institutions in the world which contribute to the processing of data from the LHC. This included all of the D.O.E. labs and U.S. universities involved in the work. So, I built up this blog by getting all of the RSS feeds, Twitter feeds, and Facebook pages. Basically, I try to take their news out to a wider world, a consumer world. Stuff you do not see on CNN, or in the New York Times or Wall Street Journal, Time, Newsweek, etc. I actually write very little. I am not a scientist. I am an obsessed onlooker. Let me tell you, all of these institutions use every bit of social media at their disposal. If you look back on the past posts here, you will see that CERN dominates above all other sources. I do posts for BOINC projects and WCG projects whenever there is material about which to write.

    So, that’s it. I am an obsessed cruncher and also obsessed with the LHC.

    I will never be a Linux guy. I can only hope that they do something in LHC@home to take advantage of the Windows population, which far exceeds and will ALWAYS far exceed the Linux population.

    Maybe someone there will see this post.
