Tagged: BOINC

• richardmitnick 7:51 pm on December 17, 2014
Tags: BOINC

    From HCC at WCG: “New imaging tools accelerate cancer research” 

    New WCG Logo

    15 Dec 2014
    Help Conquer Cancer research team

    Summary
The Help Conquer Cancer research team at the Ontario Cancer Institute continues to analyze the millions of protein-crystallization images processed by World Community Grid volunteers, building new classifiers that combine Grid-processed image features with deep features learned directly from image pixels. Improvements in image classification, along with new data provided by our collaborators, increase the possibilities for discovering useful and interesting patterns in protein crystallization.

    Dear World Community Grid volunteers,

    Since our last Help Conquer Cancer (HCC) project update, we have continued to analyze the results that you generated. Here, we provide an update on that analysis work, and new research directions the project is taking.

    Analyzing HCC Results

    Volunteers for the HCC project received raw protein crystallization images and processed each image into a set of over 12,000 numeric image features. These features were implemented by a combination of image-processing algorithms, and refined over several generations of image-processing research leading up to the launch of HCC. The features (HCC-processed images) were then used to train a classifier that would convert each image’s features into a label describing the crystallization reaction captured in the image.

Importantly, these thousands of features were human-designed. Most protein crystals have straight edges, for example, and so certain features were incorporated into HCC that search for straight lines. This traditional method of building an image classifier involves two kinds of learners: the crystallographer or image-processing expert (human), who studies the images and designs features, and the classifier (computer model), which learns to predict image labels from the designed features. The image classifier itself never sees the pixels; any improvement to the feature design must come from the human expert.

    More recently, we have applied a powerful computer-vision/machine-learning technology that improves this process by closing the feedback loop between pixels, features and the classifier: deep convolutional neural networks (CNNs). These models learn their own features directly from the image pixels; thus, they could complement human-designed features.

    CrystalNet

We call our deep convolutional neural network (CNN) CrystalNet. Our preliminary results suggest that it is an accurate and efficient classifier for protein crystallization images.

In a CNN, multiple filters act like pattern detectors that are applied across the input image. Each layer-1 feature map shows the activation responses of a single filter. “Deep” CNNs are CNNs with many layers: higher-level filters stacked upon lower-level filters. Information from image pixels at the bottom of the network rises upwards through layers of filters until the “deep” features emerge from the top. Although the example shown in Figure 1 (below) has only 6 layers, more layers can easily be added. Including other image preprocessing and normalization layers, CrystalNet has 13 layers in total.

Fig. 1: Diagram of the standard convolutional neural network. For a single feature map, the convolution operation applies the inner product of the same filter across the input image. 2D topography is preserved in the feature-map representation. Spatial pooling down-samples the feature maps by a factor of 2. Fully connected layers are the same as standard neural-network layers. Outputs are discrete random variables, or “1-of-K” codes. Element-wise nonlinearity is applied at every layer of the network.
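
To make the operations in Fig. 1 concrete, here is a minimal Python/NumPy sketch. It is an illustration only, not the CrystalNet code: one filter is swept across an image via inner products, an element-wise nonlinearity is applied, and 2x2 spatial pooling down-samples the result, yielding a single layer-1 feature map.

import numpy as np

def convolve2d(image, filt):
    """Slide one filter across the image, taking the inner product
    at each position (the convolution of Fig. 1, no padding)."""
    ih, iw = image.shape
    fh, fw = filt.shape
    out = np.empty((ih - fh + 1, iw - fw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + fh, x:x + fw] * filt)
    return out

def relu(x):
    """Element-wise nonlinearity applied at every layer."""
    return np.maximum(x, 0.0)

def pool2x2(fmap):
    """Spatial pooling: down-sample the feature map by a factor of 2."""
    h, w = fmap.shape[0] // 2 * 2, fmap.shape[1] // 2 * 2
    f = fmap[:h, :w]
    return f.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# A toy straight-edge detector applied to a random image.
image = np.random.rand(64, 64)
edge_filter = np.array([[ 1.,  1.,  1.],
                        [ 0.,  0.,  0.],
                        [-1., -1., -1.]])
feature_map = pool2x2(relu(convolve2d(image, edge_filter)))
print(feature_map.shape)  # (31, 31): one pooled layer-1 feature map

Stacking such layers, filter upon filter, is exactly what makes the network “deep.”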

Figure 2 shows examples of the first-layer filters after training. These filters extract features useful for protein-crystallography classification. Note that some of them look like segments of straight lines; others resemble the microcrystal-detecting filters previously designed for HCC.
Fig. 2: Selected examples of the first-layer filters learned by our deep convolutional neural net. These filters resemble human-designed feature extractors such as edge (top row), microcrystal (bottom), texture, and other detectors from HCC and computer vision generally.

Figure 3 shows CrystalNet’s crystal-detection performance across 10 image classes in the test set. CrystalNet achieves an area under the ROC curve (AUC) of 0.9894 for the crystal class. At a 5% false-positive rate, the model accurately detects 98% of the positive cases.
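
For anyone who wants to compute these metrics for their own classifier, here is a short Python sketch using scikit-learn. The scores below are synthetic stand-ins, not HCC data; only the metric computation mirrors the evaluation reported above.

import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Synthetic scores standing in for classifier outputs; y_true marks
# whether each image truly contains a crystal.
rng = np.random.default_rng(0)
y_true = np.concatenate([np.ones(500), np.zeros(500)])
y_score = np.concatenate([rng.normal(2.0, 1.0, 500),   # positives score higher
                          rng.normal(0.0, 1.0, 500)])  # negatives score lower

print("AUC:", roc_auc_score(y_true, y_score))

# True-positive rate at a fixed 5% false-positive operating point.
fpr, tpr, _ = roc_curve(y_true, y_score)
print("TPR at 5% FPR:", np.interp(0.05, fpr, tpr))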

CrystalNet can effectively label images generated during the high-throughput process, with a low miss rate and high precision for crystal detection. Moreover, CrystalNet operates in real time: labeling the 1,536 images from a single plate requires only about 2 seconds. This combination of accuracy and efficiency makes a fully automated high-throughput crystallography pipeline possible, substantially reducing labor-intensive screening.

    New data from collaborators

    Our collaborators at the High-Throughput Screening Lab at the Hauptman-Woodward Medical Research Institute (HWI) supplied the original protein-crystallization image data. They continue to generate more, and are using versions of the image classifiers derived from the HCC project.

    Our research on the predictive science of protein crystallization has been limited by the information we have about the proteins being crystallized. Our research partners at HWI run crystallization trials on proteins supplied by labs all over the world. Often, protein samples are missing the identifying information that allows us to link these samples to global protein databases (e.g., Uniprot). Missing protein identifiers prevent us from integrating these samples into our data-mining system, and thereby linking the protein’s physical and chemical properties to each cocktail and corresponding crystallization response.

Recently, however, HWI crystallographers were able to compile and share with us a complete record of all crystallization-trial proteins produced by the Northeast Structural Genomics (NESG) consortium. This dataset represents approximately 25% of all proteins processed by HCC volunteers on World Community Grid. Now all our NESG protein records are complete with each protein’s Uniprot ID, amino-acid sequence, and domain signatures.

    With more complete protein/cocktail information, combined with more accurate image labels from improved deep neural-net image classifiers, we anticipate greater success mining our protein-crystallization database. Work is ongoing.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

“World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.”

    WCG projects run on BOINC software from UC Berkeley.

BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-
Outsmart Ebola Together

    Mapping Cancer Markers

Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation

    IBM – Smarter Planet

     
• richardmitnick 10:16 am on December 17, 2014
Tags: BOINC, TheSkyNet

    BOINC New Project – The SkyNet 


BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing, developed at UC Berkeley.

    New Project for BOINC

    TheSkyNet

    theSkyNet

    Have a computer? Want to help astronomers make awesome discoveries and understand our Universe? theSkyNet needs you!

By connecting hundreds and thousands of computers together through the Internet, it’s possible to simulate a single machine capable of doing some pretty amazing stuff. theSkyNet is a community computing project dedicated to radio astronomy. Astronomers use telescopes to observe the Universe at many different wavelengths. All day, every day, signals from distant galaxies, stars and other cosmic bits and pieces arrive at the Earth in the form of visible (optical) light, radio waves, infrared radiation, ultraviolet radiation and many other types of waves. Once detected by a telescope, the signal is processed by computers and used by scientists to support a theory or inspire a new one.

When you join theSkyNet your computer will help astronomers process information and answer some of the big questions we have about the Universe. As a part of theSkyNet community your computer will be called upon to process small packets of data, but you won’t even notice it’s going on. The key to theSkyNet is to have lots of computers connected, with each doing only a little, but it all adding up to a lot.

At the heart of theSkyNet is this website, theSkyNet.org, where you can see how the alliances you’ve joined stack up against others. The more data you and your alliances process, the higher you’ll climb in the rankings. But that’s not all: as theSkyNet project evolves we’ll be adding more features for you to explore. In the pipeline we have visualisation tools to help you understand the data you’re processing and even an opportunity to help identify and catalogue radio wave sources in the sky.

    At the moment theSkyNet has two main science projects for you to contribute to, theSkyNet SourceFinder and theSkyNet POGS. You can find out more about theSkyNet’s science and our two projects at the Science Portals.

    theSkyNet SourceFinder

    TheSkyNet SourceFinder was the first science project on theSkyNet. It’s based on a Java distributed computing technology called Nereus.

    Right now SourceFinder is busy processing an absolutely huge simulated cube of data that’s over one terabyte (1TB) in size.

    Automatically working through a data set this big has never been done before, and ICRAR’s astronomers are eagerly awaiting the results from SourceFinder to prove it’s possible.

The next generation of amazing radio telescopes, such as CSIRO’s Australian Square Kilometre Array Pathfinder (ASKAP), will produce data cubes just like the one that SourceFinder is currently working on. When ASKAP starts collecting data from the sky in 2015, astronomers need to be ready to process the information it collects – and to find the location of radio sources, like galaxies, within it.

Australian Square Kilometre Array Pathfinder Project
    ASKAP

ASKAP will produce many thousands of data sets like the cube SourceFinder is working on, so figuring out how to process them automatically, with maximum efficiency and accuracy, is one of the challenges astronomers are facing.

SourceFinder uses code called Duchamp that has been built to automatically tell the difference between background noise and real radio sources in data from a radio telescope (a toy sketch of the underlying noise-thresholding idea follows the list below). By processing this 1TB chunk of data on theSkyNet SourceFinder, astronomers are:

    working out the best way to run Duchamp on big data in the future,
    proving that distributed computing is a real solution to process such detailed (and large) radio astronomy data, and
    learning more about how they’re going to get the most information out of telescopes like ASKAP and the SKA.
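
Here is that minimal Python sketch of the core thresholding idea behind source finding. It is an illustration only: Duchamp itself is considerably more sophisticated, adding smoothing, wavelet reconstruction and merging of neighbouring detections, and the function and numbers here are invented for the example.

import numpy as np

def find_sources(cube, nsigma=4.0):
    """Flag voxels brighter than nsigma times the noise level.
    A robust noise estimate (median absolute deviation) keeps the
    bright sources themselves from inflating the threshold."""
    noise = 1.4826 * np.median(np.abs(cube - np.median(cube)))  # MAD -> sigma
    return np.argwhere(cube > nsigma * noise)

# A toy data cube: Gaussian noise plus one bright, compact 'galaxy'.
rng = np.random.default_rng(1)
cube = rng.normal(0.0, 1.0, (32, 64, 64))
cube[16, 30:33, 30:33] += 10.0
print(len(find_sources(cube)), "voxels flagged")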

    The SkyNet POGS


    TheSkyNet POGS is a research project that uses Internet-connected computers to do research in astronomy. We will combine the spectral coverage of GALEX, Pan-STARRS1, and WISE to generate a multi-wavelength UV-optical-NIR galaxy atlas for the nearby Universe. We will measure physical parameters (such as stellar mass surface density, star formation rate surface density, attenuation, and first-order star formation history) on a resolved pixel-by-pixel basis using spectral energy distribution (SED) fitting techniques in a distributed computing mode. You can participate by downloading and running a free program on your computer.
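
To make “pixel-by-pixel SED fitting” concrete, here is a toy Python sketch. Everything in it is a simplifying assumption: the fluxes and templates are random numbers, and the real POGS analysis fits physical galaxy models with several parameters rather than selecting a template index. Only the shape of the computation, a chi-squared search per pixel, is illustrative.

import numpy as np

n_bands, ny, nx = 5, 16, 16                 # 5 UV/optical/NIR bands, 16x16 pixels
rng = np.random.default_rng(2)
templates = rng.random((100, n_bands))      # 100 hypothetical model SEDs
fluxes = rng.random((n_bands, ny, nx))      # observed multi-band image stack
errors = 0.1 * np.ones_like(fluxes)         # per-pixel flux uncertainties

pix = fluxes.reshape(n_bands, -1).T         # (npix, n_bands)
err = errors.reshape(n_bands, -1).T

# chi^2 of every template against every pixel, fully vectorised.
chi2 = (((pix[:, None, :] - templates[None, :, :]) / err[:, None, :]) ** 2).sum(axis=2)
best = chi2.argmin(axis=1).reshape(ny, nx)  # best-fitting template per pixel
print(best.shape)                           # (16, 16): a resolved parameter map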

    NASA Galaxy Telescope
    NASA/GALEX

Pan-STARRS1 Telescope
Pan-STARRS1 interior
Pan-STARRS1

NASA WISE Telescope
    NASA/WISE

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

Visit the BOINC web page, click on Choose projects and check out some of the very worthwhile studies you will find. Then click on Download and run BOINC software/All Versions. Download and install the current software for your 32-bit or 64-bit system, for Windows, Mac or Linux. When you install BOINC, it will install its screen savers on your system as a default; you can choose to run the various project screen savers or turn them off. Once BOINC is installed, in BOINC Manager/Tools, click on “Add project or account manager” to attach to projects. Many BOINC projects are listed there, but not all, and maybe not the one(s) in which you are interested; you can get the proper URL for attaching at each project’s web page. BOINC will never interfere with any other work on your computer.
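
If you prefer to script the attachment step, BOINC also ships with a command-line tool, boinccmd. Below is a minimal Python sketch wrapping it; the project URL and account key are placeholder values, and the exact options available may vary with your BOINC version (check boinccmd --help).

import subprocess

def attach_project(url, account_key):
    """Attach the local BOINC client to a project via boinccmd,
    the command-line companion to BOINC Manager."""
    subprocess.run(["boinccmd", "--project_attach", url, account_key],
                   check=True)

if __name__ == "__main__":
    # Placeholder values only; use your own project URL and the
    # account key from the project's web page.
    attach_project("http://www.worldcommunitygrid.org/", "YOUR_ACCOUNT_KEY")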

    MAJOR PROJECTS RUNNING ON BOINC SOFTWARE

    SETI@home The search for extraterrestrial intelligence. “SETI (Search for Extraterrestrial Intelligence) is a scientific area whose goal is to detect intelligent life outside Earth. One approach, known as radio SETI, uses radio telescopes to listen for narrow-bandwidth radio signals from space. Such signals are not known to occur naturally, so a detection would provide evidence of extraterrestrial technology.

    Radio telescope signals consist primarily of noise (from celestial sources and the receiver’s electronics) and man-made signals such as TV stations, radar, and satellites. Modern radio SETI projects analyze the data digitally. More computing power enables searches to cover greater frequency ranges with more sensitivity. Radio SETI, therefore, has an insatiable appetite for computing power.

    Previous radio SETI projects have used special-purpose supercomputers, located at the telescope, to do the bulk of the data analysis. In 1995, David Gedye proposed doing radio SETI using a virtual supercomputer composed of large numbers of Internet-connected computers, and he organized the SETI@home project to explore this idea. SETI@home was originally launched in May 1999.”


SETI@home is the birthplace of BOINC software. Originally, it only ran in a screensaver when the computer on which it was installed was doing no other work. With the power and memory available today, BOINC can run 24/7 without in any way interfering with other ongoing work.

The famous SETI@home screen saver, a beauteous thing to behold.

    einstein@home The search for pulsars. “Einstein@Home uses your computer’s idle time to search for weak astrophysical signals from spinning neutron stars (also called pulsars) using data from the LIGO gravitational-wave detectors, the Arecibo radio telescope, and the Fermi gamma-ray satellite. Einstein@Home volunteers have already discovered more than a dozen new neutron stars, and we hope to find many more in the future. Our long-term goal is to make the first direct detections of gravitational-wave emission from spinning neutron stars. Gravitational waves were predicted by Albert Einstein almost a century ago, but have never been directly detected. Such observations would open up a new window on the universe, and usher in a new era in astronomy.”

MilkyWay@Home “MilkyWay@Home uses the BOINC platform to harness volunteered computing resources, creating a highly accurate three-dimensional model of the Milky Way galaxy using data gathered by the Sloan Digital Sky Survey. This project enables research in both astroinformatics and computer science.”

    Leiden Classical “Join in and help to build a Desktop Computer Grid dedicated to general Classical Dynamics for any scientist or science student!”

World Community Grid (WCG) World Community Grid is a special case at BOINC. WCG is part of IBM Corporation’s Smarter Planet social initiative. WCG currently has eleven disparate projects under its umbrella, hosted at wide-ranging institutions and universities around the globe. Most projects relate to biological and medical subject matter; there are also projects for clean water and clean renewable energy. WCG projects are each treated on their own at this blog. Watch for news.

    Rosetta@home “Rosetta@home needs your help to determine the 3-dimensional shapes of proteins in research that may ultimately lead to finding cures for some major human diseases. By running the Rosetta program on your computer while you don’t need it you will help us speed up and extend our research in ways we couldn’t possibly attempt without your help. You will also be helping our efforts at designing new proteins to fight diseases such as HIV, Malaria, Cancer, and Alzheimer’s….”

GPUGrid.net “GPUGRID.net is a distributed computing infrastructure devoted to biomedical research. Thanks to the contribution of volunteers, GPUGRID scientists can perform molecular simulations to understand the function of proteins in health and disease.” GPUGrid is a special case in that all processor work done by the volunteers is GPU processing. There is no CPU processing, which is the more common kind. Other projects (Einstein, SETI, Milky Way) also feature GPU processing, but they offer CPU processing for those not able to do work on GPUs.


These are just the oldest and most prominent projects. There are many others from which you can choose.

There are currently some 300,000 users with about 480,000 computers working on BOINC projects. That is in a world of over one billion computers. We sure could use your help.

    My BOINC


     
• richardmitnick 10:28 pm on December 3, 2014
Tags: BOINC

    From isgtw: “Volunteer computing: 10 years of supporting CERN through LHC@home” 


    international science grid this week

    December 3, 2014
    Andrew Purcell

    LHC@home recently celebrated a decade since its launch in 2004. Through its SixTrack project, the LHC@home platform harnesses the power of volunteer computing to model the progress of sub-atomic particles traveling at nearly the speed of light around the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland. It typically simulates about 60 particles whizzing around the collider’s 27km-long ring for ten seconds, or up to one million loops. Results from SixTrack were used to help the engineers and physicists at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t cause damage by flying off course into the walls of the vacuum tube. It’s now also being used to carry out simulations relevant to the design of the next phase of the LHC, known as the High-Luminosity LHC.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

“The results of SixTrack played an essential role in the design of the LHC, and the high-luminosity upgrades will naturally require additional development work on SixTrack,” explains Frank Schmidt, who works in the Accelerator and Beam Physics Group of CERN’s Beams Department and is the main author of the SixTrack code. “In addition to its use in the design stage, SixTrack is also a key tool for the interpretation of data taken during the first run of the LHC,” adds Massimo Giovannozzi, who also works in the Accelerator and Beam Physics Group. “We use it to improve our understanding of particle dynamics, which will help us to push the LHC performance even further over the coming years of operation.” He continues: “Managing a project like SixTrack within LHC@home requires resources and competencies that are not easy to find: Igor Zacharov, a senior scientist at the Particle Accelerator Physics Laboratory (LPAP) of the Swiss Federal Institute of Technology in Lausanne (EPFL), provides valuable support for SixTrack by helping with BOINC integration.”

    c
    Volunteer computing is a type of distributed computing through which members of the public donate computing resources (usually processing power) to aid research projects. Image courtesy Eduardo Diez Viñuela, Flickr (CC BY-SA 2.0).

Before LHC@home was created, SixTrack was run only on desktop computers at CERN, using a platform called the Compact Physics Screen Saver (CPSS). This proved to be a useful tool for a proof of concept, but it was only with the launch of the LHC@home platform in 2004 that things really took off. “I am surprised and delighted by the support from our volunteers,” says Eric McIntosh, who formerly worked in CERN’s IT Department and is now an honorary member of the Beams Department. “We now have over 100,000 users all over the world and many more hosts. Every contribution is welcome, however small, as our strength lies in numbers.”

    Virtualization to the rescue

    Building on the success of SixTrack, the Virtual LHC@home project (formerly known as Test4Theory) was launched in 2011. It enables users to run simulations of high-energy particle physics using their home computers, with the results submitted to a database used as a common resource by both experimental and theoretical scientists working on the LHC.

Whereas the code for SixTrack was ported for running on Windows, OS X, and Linux, the high-energy-physics code used by each of the LHC experiments is far too large to port in a similar way. It is also being constantly updated. “The experiments at CERN have their own libraries and they all run on Linux, while the majority of people out there have common-or-garden variety Windows machines,” explains Ben Segal, honorary staff member of CERN’s IT department and chief technology officer of the Citizen Cyberscience Centre. “Virtualization is the way to solve this problem.”

    The birth of the LHC@home platform

In 2004, Ben Segal and François Grey, who were both members of CERN’s IT department at the time, were asked to plan an outreach event for CERN’s 50th anniversary that would help people around the world to get an impression of the computational challenges facing the LHC. “I had been an early volunteer for SETI@home after it was launched in 1999,” explains Grey. “Volunteer computing was often used as an illustration of what distributed computing means when discussing grid technology. It seemed to me that it ought to be feasible to do something similar for LHC computing and perhaps even combine volunteer computing and grid computing this way.”

“I contacted David Anderson, the person behind SETI@Home, and it turned out the timing was good, as he was working on an open-source platform called BOINC to enable many projects to use the SETI@home approach,” Grey continues. BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley in the US to manage the SETI@Home project, and uses the unused CPU and GPU cycles on a computer to support scientific research.

    “I vividly remember the day we phoned up David Anderson in Berkeley to see if we could make a SETI-like computing challenge for CERN,” adds Segal. “We needed a CERN application that ran on Windows, as over 90% of BOINC volunteers used that. The SixTrack people had ported their code to Windows and had already built a small CERN-only desktop grid to run it on, as they needed lots of CPU power. So we went with that.”

    A runaway success

    “I was worried that no one would find the LHC as interesting as SETI. Bear in mind that this was well before the whole LHC craziness started with the Angels and Demons movie, and news about possible mini black holes destroying the planet making headlines,” says Grey. “We made a soft launch, without any official announcements, in 2004. To our astonishment, the SETI@home community immediately jumped in, having heard about LHC@home by word of mouth. We had over 1,000 participants in 24 hours, and over 7,000 by the end of the week — our server’s maximum capacity.” He adds: “We’d planned to run the volunteer computing challenge for just three months, at the time of the 50th anniversary. But the accelerator physicists were hooked and insisted the project should go on.”

    Predrag Buncic, who is now coordinator of the offline group within the ALICE experiment, led work to create the CERN Virtual Machine in 2008. He, Artem Harutyunyan (former architect and lead developer of CernVM Co-Pilot), and Segal subsequently adopted this virtualization technology for use within Virtual LHC@home. This has made it significantly easier for the experiments at CERN to create their own volunteer computing applications, since it is no longer necessary for them to port their code. The long-term vision for Virtual LHC@home is to support volunteer-computing applications for each of the large LHC experiments.

Growth of the platform

    The ATLAS experiment recently launched a project that simulates the creation and decay of supersymmetric bosons and fermions. “ATLAS@Home offers the chance for the wider public to participate in the massive computation required by the ATLAS experiment and to contribute to the greater understanding of our universe,” says David Cameron, a researcher at the University of Oslo in Norway. “ATLAS also gains a significant computing resource at a time when even more resources will be required for the analysis of data from the second run of the LHC.”

    CERN ATLAS New
    ATLAS

    ATLAS@home

    Meanwhile, the LHCb experiment has been running a limited test prototype for over a year now, with an application running Beauty physics simulations set to be launched for the Virtual LHC@home project in the near future. The CMS and ALICE experiments also have plans to launch similar applications.

    CERN LHCb New
    LHCb

    CERN CMS New
    CMS

    CERN ALICE New
    ALICE

    An army of volunteers

    “LHC@home allows CERN to get additional computing resources for simulations that cannot easily be accommodated on regular batch or grid resources,” explains Nils Høimyr, the member of the CERN IT department responsible for running the platform. “Thanks to LHC@home, thousands of CPU years of accelerator beam dynamics simulations for LHC upgrade studies have been done with SixTrack, and billions of events have been simulated with Virtual LHC@home.” He continues: “Furthermore, the LHC@home platform has been an outreach channel, giving publicity to LHC and high-energy physics among the general public.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
• richardmitnick 5:23 pm on November 28, 2014
Tags: BOINC

    From CERN: “ATLAS@Home looks for CERN volunteers” 

ATLAS@home

    Mon 01 Dec 2014
    Rosaria Marraffino

    ATLAS@Home is a CERN volunteer computing project that runs simulated ATLAS events. As the project ramps up, the project team is looking for CERN volunteers to test the system before planning a bigger promotion for the public.

    The ATLAS@home outreach website.

    ATLAS@Home is a large-scale research project that runs ATLAS experiment simulation software inside virtual machines hosted by volunteer computers. “People from all over the world offer up their computers’ idle time to run simulation programmes to help physicists extract information from the large amount of data collected by the detector,” explains Claire Adam Bourdarios of the ATLAS@Home project. “The ATLAS@Home project aims to extrapolate the Standard Model at a higher energy and explore what new physics may look like. Everything we’re currently running is preparation for next year’s run.”

    ATLAS@Home became an official BOINC (Berkeley Open Infrastructure for Network Computing) project in May 2014. After a beta test with SUSY events and Z decays, real production started in the summer with inelastic proton-proton interaction events. Since then, the community has grown remarkably and now includes over 10,000 volunteers spread across five continents. “We’re running the full ATLAS simulation and the resulting output files containing the simulated events are integrated with the experiment standard distributed production,” says Bourdarios.

    Compared to other LHC@Home projects, ATLAS@Home is heavier in terms of network traffic and memory requirements. “From the start, we have been successfully challenging the underlying infrastructure of LHC@Home,” says Bourdarios. “Now we’re looking for CERN volunteers to go one step further before doing a bigger public promotion.”

    This simulated event display is created using ATLAS data.

    If you want to join the community and help the ATLAS experiment, you just need to download and run the necessary free software, VirtualBox and BOINC, which are available on NICE. Find out more about the project and how to join on the ATLAS@Home outreach website.

    “This project has huge outreach potential,” adds Bourdarios. “We hope to demonstrate how big discoveries are often unexpected deviations from existing models. This is why we need simulations. We’re also working on an event display, so that people can learn more about the events they have been producing and capture an image of what they have done.”

If you have any questions about the ATLAS@Home project, e-mail atlas-comp-contact-home@cern.ch.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ATLAS@Home is a research project that uses volunteer computing to run simulations of the ATLAS experiment at CERN. You can participate by downloading and running a free program on your computer.

ATLAS is a particle physics experiment taking place at the Large Hadron Collider at CERN that searches for new particles and processes using head-on collisions of protons of extraordinarily high energy. Petabytes of data were recorded, processed and analyzed during the first three years of data taking, leading to up to 300 publications covering all aspects of the Standard Model of particle physics, including the discovery of the Higgs boson in 2012.

    Large scale simulation campaigns are a key ingredient for physicists, who permanently compare their data with both “known” physics and “new” phenomena predicted by alternative models of the universe, particles and interactions. This simulation runs on the WLCG Computing Grid and at any one point there are around 150,000 tasks running. You can help us run even more simulation by using your computer’s idle time to run these same tasks.

    No knowledge of particle physics is required, but for those interested in more details, at the moment we simulate the creation and decay of supersymmetric bosons and fermions, new types of particles that we would love to discover next year, as they would help us to shed light on the dark matter mystery!

    This project runs on BOINC software from UC Berkeley.
    Visit BOINC, download and install the software and attach to the project.


     
• richardmitnick 3:22 pm on November 18, 2014
Tags: BOINC

    From NOVA: “Why There’s No HIV Cure Yet” 

    [After the NOVA article, I tell you how you and your family, friends, and colleagues can help to find a cure for AIDS and other diseases]

    PBS NOVA

    NOVA

    27 Aug 2014
    Alison Hill

    Over the past two years, the phrase “HIV cure” has flashed repeatedly across newspaper headlines. In March 2013, doctors from Mississippi reported that the disease had vanished in a toddler who was infected at birth. Four months later, researchers in Boston reported a similar finding in two previously HIV-positive men. All three were no longer required to take any drug treatments. The media heralded the breakthrough, and there was anxious optimism among HIV researchers. Millions of dollars of grant funds were earmarked to bring this work to more patients.

    But in December 2013, the optimism evaporated. HIV had returned in both of the Boston men. Then, just this summer, researchers announced the same grim results for the child from Mississippi. The inevitable questions mounted from the baffled public. Will there ever be a cure for this disease? As a scientist researching HIV/AIDS, I can tell you there’s no straightforward answer. HIV is a notoriously tricky virus, one that’s eluded promising treatments before. But perhaps just as problematic is the word “cure” itself.

    Science has its fair share of trigger words. Biologists prickle at the words “vegetable” and “fruit”—culinary terms which are used without a botanical basis—chemists wrinkle their noses at “chemical free,” and physicists dislike calling “centrifugal” a force—it’s not; it only feels like one. If you ask an HIV researcher about a cure for the disease, you’ll almost certainly be chastised. What makes “cure” such a heated word?

    HIV hijacks the body’s immune system by attacking T cells.

    It all started with a promise. In the early 1980s, doctors and public health officials noticed large clusters of previously healthy people whose immune systems were completely failing. The new condition became known as AIDS, for “acquired immunodeficiency syndrome.” A few years later, in 1984, researchers discovered the cause—the human immunodeficiency virus, now known commonly as HIV. On the day this breakthrough was announced, health officials assured the public that a vaccine to protect against the dreaded infection was only two years away. Yet here we are, 30 years later, and there’s still no vaccine. This turned out to be the first of many overzealous predictions about controlling the HIV epidemic or curing infected patients.

    The progression from HIV infection to AIDS and eventual death occurs in over 99% of untreated cases—making it more deadly than Ebola or the plague. Despite being identified only a few decades ago, AIDS has already killed 25 million people and currently infects another 35 million, and the World Health Organization lists it as the sixth leading cause of death worldwide.

HIV disrupts the body’s natural disease-fighting mechanisms, which makes it particularly deadly and complicates efforts to develop a vaccine against it. Like all viruses, HIV gets inside individual cells in the body and hijacks their machinery to make thousands of copies of itself. HIV replication is especially hard for the body to control because the white blood cells it infects, and eventually kills, are a critical part of the immune system. Additionally, when HIV copies its genes, it does so sloppily. This causes it to quickly mutate into many different strains. As a result, the virus easily outwits the body’s immune defenses, eventually throwing the immune system into disarray. That gives other obscure or otherwise innocuous infections a chance to flourish in the body—a defining feature of AIDS.

    Early Hope

In 1987, the FDA approved AZT as the first drug to treat HIV. With only two years between when the drug was identified in the lab and when it was available for doctors to prescribe, it was—and remains—the fastest approval process in the history of the FDA. AZT was widely heralded as a breakthrough. But as the movie Dallas Buyers Club poignantly retells, AZT was not the miracle drug many hoped. Early prescriptions often elicited toxic side effects and only offered a temporary benefit, as the virus quickly mutated to become resistant to the treatment. (Today, the toxicity problems have been significantly reduced, thanks to lower doses.) AZT remains a shining example of scientific bravura and is still an important tool to slow the infection, but it is far from the cure the world had hoped for.


    Then, in the mid-1990s, some mathematicians began probing the data. Together with HIV scientists, they suggested that by taking three drugs together, we could avoid the problem of drug resistance. The chance that the virus would have enough mutations to allow it to avoid all drugs at once, they calculated, would simply be too low to worry about. When the first clinical trials of these “drug cocktails” began, both mathematical and laboratory researchers watched the levels of virus drop steadily in patients until they were undetectable. They extrapolated this decline downwards and calculated that, after two to three years of treatment, all traces of the virus should be gone from a patient’s body. When that happened, scientists believed, drugs could be withdrawn, and finally, a cure achieved. But when the time came for the first patients to stop their drugs, the virus again seemed to outwit modern medicine. Within a few weeks of the last pill, virus levels in patients’ blood sprang up to pre-treatment levels—and stayed there.
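
A back-of-the-envelope calculation shows why combining drugs changes the odds so dramatically. The numbers in this Python sketch are purely illustrative, not the figures used in the original studies:

# Rough odds of multi-drug resistance arising by chance. Illustrative
# figures only; the 1990s analyses used measured mutation rates and
# the genetics of specific resistance mutations.
p_single = 1e-5          # chance a new virion resists any one drug
virions_per_day = 1e10   # rough daily virion production, untreated

for n_drugs in (1, 2, 3):
    p_all = p_single ** n_drugs  # independent mutations, one per drug
    print(f"{n_drugs} drug(s): expected resistant virions per day = "
          f"{virions_per_day * p_all:.0e}")
# With one drug, ~1e+05 resistant virions appear every day, so
# resistance is inevitable; with three, ~1e-05 per day, so a fully
# resistant strain essentially never arises by chance.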

    In the three decades since, over 25 more highly-potent drugs have been developed and FDA-approved to treat HIV. When two to five of them are combined into a drug cocktail, the mixture can shut down the virus’s replication, prevent the onset of AIDS, and return life expectancy to a normal level. However, patients must continue taking these treatments for their entire lives. Though better than the alternative, drug regimens are still inconvenient and expensive, especially for patients living in the developing world.

    Given modern medicine’s success in curing other diseases, what makes HIV different? By definition, an infection is cured if treatment can be stopped without the risk of it resurfacing. When you take a week-long course of antibiotics for strep throat, for example, you can rest assured that the infection is on track to be cleared out of your body. But not with HIV.

    A Bad Memory

    The secret to why HIV is so hard to cure lies in a quirk of the type of cell it infects. Our immune system is designed to store information about infections we have had in the past; this property is called “immunologic memory.” That’s why you’re unlikely to be infected with chickenpox a second time or catch a disease you were vaccinated against. When an infection grows in the body, the white blood cells that are best able to fight it multiply repeatedly, perfecting their infection-fighting properties with each new generation. After the infection is cleared, most of these cells will die off, since they are no longer needed. However, to speed the counter-attack if the same infection returns, some white blood cells will transition to a hibernation state. They don’t do much in this state but can live for an extremely long time, thereby storing the “memory” of past infections. If provoked by a recurrence, these dormant cells will reactivate quickly.

    This near-immortal, sleep-like state allows HIV to persist in white blood cells in a patient’s body for decades. White blood cells infected with HIV will occasionally transition to the dormant state before the virus kills them. In the process, the virus also goes temporarily inactive. By the time drugs are started, a typical infected person contains millions of these cells with this “latent” HIV in them. Drug cocktails can prevent the virus from replicating, but they do nothing to the latent virus. Every day, some of the dormant white blood cells wake up. If drug treatment is halted, the latent virus particles can restart the infection.


    HIV researchers call this huge pool of latent virus the “barrier to a cure.” Everyone’s looking for ways to get rid of it. It’s a daunting task, because although a million HIV-infected cells may seem like a lot, there are around a million times that many dormant white blood cells in the whole body. Finding the ones that contain HIV is a true needle-in-a-haystack problem. All that remains of a latent virus is its DNA, which is extremely tiny compared to the entire human genome inside every cell (about 0.001% of the size).

Defining a Cure

    Around a decade ago, scientists began to talk amongst themselves about what a hypothetical cure could look like. They settled on two approaches. The first would involve purging the body of latent virus so that if drugs were stopped, there would be nothing left to restart the infection. This was often called a “sterilizing cure.” It would have to be done in a more targeted and less toxic way than previous attempts of the late 1990s, which, because they attempted to “wake up” all of the body’s dormant white blood cells, pushed the immune system into a self-destructive overdrive. The second approach would instead equip the body with the ability to control the virus on its own. In this case, even if treatment was stopped and latent virus reemerged, it would be unable to produce a self-sustaining, high-level infection. This approach was referred to as a “functional cure.”

    The functional cure approach acknowledged that latency alone was not the barrier to a cure for HIV. There are other common viruses that have a long-lived latent state, such as the Epstein-Barr virus that causes infectious mononucleosis (“mono”), but they rarely cause full-blown disease when reactivated. HIV is, of course, different because the immune system in most people is unable to control the infection.

    The first hint that a cure for HIV might be more than a pipe-dream came in 2008 in a fortuitous human experiment later known as the “Berlin patient.” The Berlin patient was an HIV-positive man who had also developed leukemia, a blood cancer to which HIV patients are susceptible. His cancer was advanced, so in a last-ditch effort, doctors completely cleared his bone marrow of all cells, cancerous and healthy. They then transplanted new bone marrow cells from a donor.

    Fortunately for the Berlin patient, doctors were able to find a compatible bone marrow donor who carried a unique HIV-resistance mutation in a gene known as CCR5. They completed the transplant with these cells and waited.

    For the last five years, the Berlin patient has remained off treatment without any sign of infection. Doctors still cannot detect any HIV in his body. While the Berlin patient may be cured, this approach cannot be used for most HIV-infected patients. Bone marrow transplants are extremely risky and expensive, and they would never be conducted in someone who wasn’t terminally ill—especially since current anti-HIV drugs are so good at keeping the infection in check.

    Still, the Berlin patient was an important proof-of-principle case. Most of the latent virus was likely cleared out during the transplant, and even if the virus remained, most strains couldn’t replicate efficiently given the new cells with the CCR5 mutation. The Berlin patient case provides evidence that at least one of the two cure methods (sterilizing or functional), or perhaps a combination of them, is effective.

    Researchers have continued to try to find more practical ways to rid patients of the latent virus in safe and targeted ways. In the past five years, they have identified multiple anti-latency drug candidates in the lab. Many have already begun clinical trials. Each time, people grow optimistic that a cure will be found. But so far, the results have been disappointing. None of the drugs have been able to significantly lower levels of latent virus.

    In the meantime, doctors in Boston have attempted to tease out which of the two cure methods was at work in the Berlin patient. They conducted bone marrow transplants on two HIV-infected men with cancer—but this time, since HIV-resistant donor cells were not available, they just used typical cells. Both patients continued their drug cocktails during and after the transplant in the hopes that the new cells would remain HIV-free. After the transplants, no HIV was detectable, but the real test came when these patients volunteered to stop their drug regimens. When they remained HIV-free a few months later, the results were presented at the International AIDS Society meeting in July 2013. News outlets around the world declared that two more individuals had been cured of HIV.


It quickly became clear that everyone had spoken too soon. Six months later, researchers reported that the virus had suddenly and rapidly returned in both individuals. Latent virus had likely escaped the detection methods available—which are not sensitive enough—and persisted at low, but significant, levels. Disappointment was widespread. The findings showed that even very small amounts of latent virus could restart an infection. It also meant that the anti-latency drugs in development would need to be extremely potent to give any hope of a cure.

    But there was one more hope—the “Mississippi baby.” A baby was born to an HIV-infected mother who had not received any routine prenatal testing or treatment. Tests revealed high levels of HIV in the baby’s blood, so doctors immediately started the infant on a drug cocktail, to be continued for life.

The mother and child soon lost touch with their health care providers. When doctors located them again a few years later, they learned that the mother had stopped giving drugs to the child several months prior. The doctors administered all possible tests to look for signs of the virus, both latent and active, but they didn’t find any evidence. They chose not to re-administer drugs, and a year later, when the virus was still nowhere to be found, they presented the findings to the public. It was once again heralded as a cure.

    Again, it was not to be. Just last month, the child’s doctors announced that the virus had sprung back unexpectedly. It seemed that even starting drugs as soon as infection was detected in the newborn could not prevent the infection from returning over two years later.

Hope Remains

    Despite our grim track record with the disease, HIV is probably not incurable. Although we don’t have a cure yet, we’ve learned many lessons along the way. Most importantly, we should be extremely careful about using the word “cure,” because for now, we’ll never know if a person is cured until they’re not cured.

    Clearing out latent virus may still be a feasible approach to a cure, but the purge will have to be extremely thorough. We need drugs that can carefully reactivate or remove latent HIV, leaving minimal surviving virus while avoiding the problems that befell earlier tests that reactivated the entire immune system. Scientists have proposed multiple, cutting-edge techniques to engineer “smart” drugs for this purpose, but we don’t yet know how to deliver this type of treatment safely or effectively.

    As a result, most investigations focus on traditional types of drugs. Researchers have developed ways to rapidly scan huge repositories of existing medicines for their ability to target latent HIV. These methods have already identified compounds that were previously used to treat alcoholism, cancer, and epilepsy, and researchers are repurposing them to be tested in HIV-infected patients.

    Mathematicians are also helping HIV researchers evaluate new treatments. My colleagues and I use math to take data collected from just a few individuals and fill in the gaps. One question we’re focusing on is exactly how much latent virus must be removed to cure a patient, or at least to let them stop their drug cocktails for a few years. Each cell harboring latent virus is a potential spark that could restart the infection. But we don’t know when the virus will reactivate. Even once a single latent virus awakens, there are still many barriers it must overcome to restart a full-blown infection. The less latent virus that remains, the less chance there is that the virus will win this game of chance. Math allows us to work out these odds very precisely.
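
As a rough illustration of the kind of odds calculation described above, the Python sketch below models rebound as a Poisson process whose rate scales with the size of the remaining latent reservoir. All of the rates are invented for the example; they are not published estimates.

import numpy as np

# Suppose the full reservoir produces successful "sparks" (reactivations
# that restart infection) at some rate per year; purging the reservoir
# shrinks that rate proportionally, so the waiting time to rebound
# behaves like a Poisson process. The spark rate is a made-up figure.
sparks_per_year_full = 1500.0   # hypothetical rate for an untouched reservoir

for reduction in (1, 100, 1000, 10_000):       # fold-reduction from therapy
    rate = sparks_per_year_full / reduction
    p_no_rebound = np.exp(-rate * 1.0)         # P(no rebound within 1 year)
    print(f"{reduction:>6}x purge: P(no rebound in 1 yr) = {p_no_rebound:.2g}")
# Only very deep purges (1000x and beyond) give a realistic chance of
# staying off drugs for a year, which is why anti-latency drugs must be
# extremely potent.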

Our calculations show that “apparent cures”—in which patients harbor latent virus at levels low enough to escape detection for months or years without treatment—are not a medical anomaly. In fact, math tells us that they are an expected result of these chance dynamics. It can also help researchers determine how good an anti-latency drug should be before it’s worth testing in a clinical trial.

    Many researchers are working to augment the body’s ability to control the infection, providing a functional cure rather than a sterilizing one. Studies are underway to render anyone’s immune cells resistant to HIV, mimicking the CCR5 mutation that gives some people natural resistance. Vaccines that could be given after infection, to boost the immune response or protect the body from the virus’s ill effects, are also in development.

    In the meantime, treating all HIV-infected individuals—which has the added benefit of preventing new transmissions—remains the best way to control the epidemic and reduce mortality. But the promise of “universal treatment” has also not materialized. Currently, even in the U.S., only 25% of HIV-positive people have their viral levels adequately suppressed by treatment. Worldwide, for every two individuals starting treatment, three are newly infected. While there’s no doubt that we’ve made tremendous progress in fighting the virus, we have a long way to go before the word “cure” is not taboo when it comes to HIV/AIDS.

    See the full article here.

Did you know that you can help in the fight against AIDS? By donating time on your computer to the FightAIDS@Home project of World Community Grid, you can become part of the solution. The work is called “crunching” because you are crunching computational data, the results of which are then fed back into the necessary lab work. We save researchers literally millions of hours of lab time in this process.
Visit World Community Grid (WCG) or the Berkeley Open Infrastructure for Network Computing (BOINC). Download the BOINC software and install it on your computer. Then visit WCG and attach to the FAAH project. The project will send you computational work units. Your computer will process them and send the results back to the project; the project will then send you more work units. It is that simple. You do nothing, unless you want to get into the nuts and bolts of the BOINC software. If you take up this work, and if you see it as valuable, please tell your family, friends and colleagues – anyone with a computer, even an Android tablet. We found out that my wife’s oncologist’s father in Brazil is a cruncher on two projects from WCG.

This is the project’s web site. Take a look.

While you are visiting BOINC and WCG, look around at all of the very valuable projects being conducted at some of the world’s most distinguished universities and scientific institutions. You can attach to as many as you like, on one or a number of computers. You can only be a help here, participating in citizen science.

    This is a look at the present and past projects at WCG:

    Please visit the project pages-

    Mapping Cancer Markers

Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation

    IBM – Smarter Planet

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

    ScienceSprings relies on technology from

    MAINGEAR computers

Lenovo

Dell

     
• richardmitnick 1:47 pm on November 11, 2014
Tags: BOINC

    From DDDT at WCG: “Discovering Dengue Drugs – Together” 

    New WCG Logo

    10 Nov 2014
By Stan Watowich, PhD
    University of Texas Medical Branch (UTMB) in Galveston, Texas

    Summary
    For week five of our decade of discovery celebrations we’re looking back at the Discovering Dengue Drugs – Together project, which helped researchers at the University of Texas Medical Branch at Galveston search for drugs to help combat dengue – a debilitating tropical disease that threatens 40% of the world’s population. Thanks to World Community Grid volunteers, researchers have identified a drug lead that has the potential to stop the virus in its tracks.


    Dengue fever, also known as “breakbone fever”, causes excruciating joint and muscle pain, high fever and headaches. Severe dengue, known as “dengue hemorrhagic fever”, has become a leading cause of hospitalization and death among children in many Asian and Latin American countries. According to the World Health Organization (WHO), over 40% of the world’s population is at risk from dengue; another study estimated there were 390 million cases in 2010 alone.

    The disease is a mosquito-borne infection found in tropical and sub-tropical regions – primarily in the developing world. It belongs to the flavivirus family of viruses, together with Hepatitis C, West Nile and Yellow Fever.

Despite the fact that dengue represents a critical global health concern, it has received limited attention from affluent countries until recently and is widely considered to be a neglected tropical disease. Currently, no approved vaccines or treatments exist for the disease. We launched Discovering Dengue Drugs – Together on World Community Grid in 2007 to search for drugs to treat dengue infections using a computer-based discovery approach.

    In the first phase of the project, we aimed to identify compounds that could be used to develop dengue drugs. Thanks to the computing power donated by World Community Grid volunteers, my fellow researchers and I at the University of Texas Medical Branch in Galveston, Texas, screened around three million chemical compounds to determine which ones would bind to the dengue virus and disable it.

    By 2009 we had found several thousand promising compounds to take to the next stage of testing. We began identifying the strongest compounds from the thousands of potentials, with the goal of turning these into molecules that could be suitable for human clinical trials.
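
    To give a feel for that winnowing step, here is a small, self-contained Python sketch: given predicted binding energies for a large compound library, keep only the strongest predicted binders. The compound IDs, energies and cutoff are invented for illustration; they are not the project’s actual data or selection criteria.

        # Toy virtual-screening triage: keep the best-scoring compounds.
        import heapq
        import random

        random.seed(0)
        # Pretend library: compound ID -> predicted binding energy (kcal/mol);
        # more negative means tighter predicted binding. All values invented.
        library = {"cmpd-%07d" % i: random.gauss(-6.0, 1.5) for i in range(100_000)}

        def top_binders(scores, n):
            """Return the n compounds with the lowest (best) predicted energies."""
            return heapq.nsmallest(n, scores.items(), key=lambda kv: kv[1])

        hits = top_binders(library, 2000)  # "several thousand promising compounds"
        print(hits[:3])                    # the strongest predicted binders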

    We have recently made an exciting discovery using insights from Discovering Dengue Drugs – Together to guide additional calculations on our web portal for advanced computer-based drug discovery, DrugDiscovery@TACC. A molecule has demonstrated success in binding to and disabling a key dengue enzyme that is necessary for the virus to replicate.

    Furthermore, it also shows signs of being able to effectively disable related flaviviruses, such as the West Nile virus. Importantly, our newly discovered drug lead also demonstrates no negative side effects such as adverse toxicity, carcinogenicity or mutagenicity risks, making it a promising antiviral drug candidate for dengue and potentially other flaviviruses. We are working with medicinal chemists to synthesize variants of this exciting candidate molecule with the goal of improving its activity for planned pre-clinical and clinical trials.

    I’d like to express my gratitude for the dedication of World Community Grid volunteers. The advances we are making, and our improved understanding of drug discovery software and its current limitations, would not have been possible without your donated computing power.

    If you’d like to help researchers make more ground-breaking discoveries like this – and have the chance of winning some fantastic prizes – take part in our decade of discovery competition by encouraging your friends to sign up for World Community Grid. There’s a week left and the field is wide open – get started today!

    See the full article here.

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.

    WCG projects run on BOINC software from UC Berkeley.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages:

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation

    IBM – Smarter Planet

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 3:13 pm on November 7, 2014 Permalink | Reply
    Tags: , , BOINC, , ,   

    From WCG: “Decade of discovery: New precision tools to diagnose and treat cancer” 

    New WCG Logo

    3 Nov 2014
    By: Dr. David J. Foran, PhD
    Rutgers Cancer Institute of New Jersey

    Summary
    It’s week four of our 10th anniversary celebrations, and we’re following up last week’s childhood cancer feature by spotlighting another cancer project that’s helped researchers develop powerful new tools to diagnose cancer and tailor treatments to individual patients, using big data and analytics.


    When it comes to cancer, a doctor’s diagnosis affects how aggressively a patient is treated, which medications might be appropriate and what levels of risk are justified. New precision medicine techniques are enabling physicians and scientists to refine diagnoses by identifying changes and patterns in individual cancers at unprecedented levels of granularity – ultimately improving treatment outcomes for patients.

    A key tool for precision medicine is tissue microarray analysis. This enables investigators to analyze large batches of tissue sample images simultaneously, so they can look for patterns and identify cancer signatures. It also provides them with a deeper understanding of cancer biology and uncovers new sub-classifications of cancer and likely patient responses – all of which influence new courses of treatment and future drug design.

    Tissue microarray analysis shows great promise, but it is not without its limitations. Pathologists typically examine the specimens visually, resulting in subjective interpretations and variations in diagnoses.

    We realized that if this method of analysis could be automated using digital pattern recognition algorithms, we could improve accuracy and reveal new patterns across large sets of data. This would make it possible for researchers to determine a patient’s type and stage of cancer more precisely, meaning they can prescribe therapies or combinations of treatments that are most likely to be effective.

    To study the feasibility of automating tissue microarray analysis, we partnered with IBM’s World Community Grid in 2006 to launch the Help Defeat Cancer project. At the time, we were pioneering a new approach that nobody else was investigating, and it was met with tremendous skepticism by many of our colleagues.

    However, with the support of more than 200,000 World Community Grid volunteers from around the globe who donated over 2,900 years of their computing time, we were able to study over 100,000 patient tissue samples to search for cancer signatures.

    Access to this vast computing power enabled our team to rapidly conduct this research under a much wider range of environmental conditions and to perform specimen analysis at much greater degrees of sensitivity.

    Thanks to World Community Grid and the Help Defeat Cancer project, we demonstrated the success of using computer-based analysis to automatically investigate and classify cancer specimens based on expression signature patterns. We were able to develop a reference library of cancer signatures that can be used to systematically analyze and compare tissue samples across large patient cohorts.
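
    To make the idea of a signature library concrete, here is a minimal Python sketch: it scores a new sample’s marker measurements against each stored signature by Pearson correlation and reports the closest match. The subtype names, marker values and choice of correlation are illustrative assumptions, not the researchers’ actual library or matching method.

        # Toy signature-library lookup by Pearson correlation.
        from math import sqrt

        signature_library = {               # invented reference signatures
            "subtype-A": [0.9, 0.1, 0.8, 0.2, 0.7],
            "subtype-B": [0.2, 0.8, 0.1, 0.9, 0.3],
        }

        def pearson(xs, ys):
            """Pearson correlation between two equal-length vectors."""
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            sx = sqrt(sum((x - mx) ** 2 for x in xs))
            sy = sqrt(sum((y - my) ** 2 for y in ys))
            return cov / (sx * sy)

        def best_match(sample):
            """Score a sample against every signature; return the closest."""
            scored = {name: pearson(sample, sig)
                      for name, sig in signature_library.items()}
            best = max(scored, key=scored.get)
            return best, scored[best]

        print(best_match([0.85, 0.15, 0.75, 0.25, 0.65]))  # -> ('subtype-A', ...)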

    Leveraging these experimental results, our team secured competitive funding from the National Institutes of Health (NIH) to build a clinical decision support system to automatically analyze and classify cancer specimens with improved diagnostic and prognostic accuracy. We used the core reference library of expression signatures generated through the Help Defeat Cancer project to demonstrate the proof-of-concept for the system.

    These decision support tools are now being tested and refined by investigators from the Rutgers Cancer Institute of New Jersey, Stony Brook University School of Medicine, University of Pittsburgh Medical Center and Emory University. They are exploring how the tools can aid clinical decision-making and are pursuing further investigative research. Together, our ultimate aim is to refine these tools sufficiently so that they can be certified for routine clinical use in diagnosing and treating patients.

    Although the Help Defeat Cancer project has completed its research on World Community Grid, we continue to investigate the findings and they have contributed to some significant new beginnings. At Rutgers Cancer Institute of New Jersey, physicians and scientists – aided by high-performance computing resources – are analyzing genomes and human tissues, and identifying cancer patterns, faster than ever before.

    In collaboration with our research partners at the Rutgers Discovery Informatics Institute (RDI2) and RUCDR Infinite Biologics (the world’s largest university-based biorepository, located within the Human Genetics Institute of New Jersey), the Rutgers Cancer Institute is shaping a revolution in how best to determine cancer therapy for patients – a vast improvement from the time-intensive, trial-and-error approach that doctors have faced for years. To date, only a fraction of known cancer biomarkers have been examined. The long-term goal is to create a library of biomarkers and their expression patterns so that, in the future, physicians can consult the library to help diagnose cancer patients and provide them with the most effective treatment.

    I would like to express my gratitude to Stanley Litow, Robin Willner, and Jen Crozier from IBM and to World Community Grid’s Advisory Board for supporting the Help Defeat Cancer project. I’d also like to extend my special thanks to the IBM World Community Grid team members who contributed to the success of the project – I hope to have the opportunity to work with them again in the near future.

    Additionally, I would like to acknowledge the NIH, Department of Defense and IBM for supporting this research – and give credit to those individuals from my laboratory and partnering institutions who were involved in the early experiments and the initial design and development of the imaging and computational tools, which we then used throughout the project. And, of course, a very big thank you to all the World Community Grid volunteers – without their support, our accomplishments with Help Defeat Cancer would not have been possible.

    The Help Defeat Cancer project has completed its analysis on World Community Grid – but another innovative project, Mapping Cancer Markers, is currently running and needs your help. Help us celebrate a decade of discovery on World Community Grid by sharing this story and encouraging your friends to donate their unused computing power to cutting-edge cancer research.

    Here’s to another decade of discovery.

    See the full article here.


     
  • richardmitnick 4:52 pm on November 6, 2014 Permalink | Reply
    Tags: , BOINC, , ,   

    From FAAH at WCG: “Teamwork yields experimental support for FightAIDS@Home calculations” 

    New WCG Logo

    By: The FightAIDS@Home research team
    6 Nov 2014

    Summary
    Imaging studies have now confirmed some of the computational predictions made during FightAIDS@Home, providing important confirmation of our methodology and the value of your computational results. This work is ongoing, but promises to increase our understanding of how HIV protease can be disrupted.

    The “exo-site” discovered in HIV protease (shown here in green), showing the original bound 4d9 fragment (shown as red and orange sticks) and the volume (shown as the orange mesh) that is being targeted by FightAIDS@Home. (Image credit: Stefano Forli, TSRI)

    Our lab at the Scripps Research Institute, La Jolla, is part of the HIV Interaction and Viral Evolution (HIVE) Center – a group of investigators with expertise in HIV crystallography, virology, molecular biology, biochemistry, synthetic chemistry and computational biology. This means that we have world-class resources available to verify and build upon our computational work, including the nuclear magnetic resonance (NMR) facility at the Scripps Research Institute, Florida. NMR is a technique for determining the molecular structure of a chemical sample, and therefore is very useful for validating some of the predictions made during the computational phase of FightAIDS@Home.

    We’re excited to announce that our collaborators at Scripps Florida have now optimized their NMR experiments and have been able to characterize the binding of promising ligands to the prospective allosteric sites on the HIV protease. These sites represent new footholds in the search for therapies that defeat viral drug resistance. NMR allows us to detect where the candidate inhibitors interact with the protein, and unlike X-ray crystallography experiments, these interactions are measured in solution, which better represents the biological environment.

    In fact, the first results from the NMR experiments validated the exo-site we so thoroughly investigated in FightAIDS@Home. As a result, we now have experimental evidence that a small molecule binds to the exo-site in solution, with structural effects that seem to perturb the dynamic behavior of the protease, even with a known inhibitor in the active site.

    There are many more NMR experiments still to run, but another advantage of NMR over crystallography is that it does not require the lengthy step of growing diffraction-quality crystals. This allows higher experimental throughput, so we look forward to experimental confirmation of many more compounds in much less time. So far we have shipped 15 compounds for testing, and another batch will be sent this week. The new compounds will help to validate another potential interaction site on one of HIV protease’s two movable “flaps”.

    Once the validation is completed, we will proceed to test a number of compounds that we identified in different FightAIDS@Home experiments for all of the target protease allosteric sites.

    As always, thank you for your support! This research would not be possible without your valuable computing time.

    The Scripps research team needs your help to continue making progress on developing new treatments for AIDS! Take part in our decade of discovery competition by encouraging your friends to sign up for World Community Grid and start donating their computer or mobile device’s computing power to FightAIDS@Home. There’s just over a week left and some great prizes are up for grabs – get started today!

    Here’s to another decade of discovery.

    See the full article here.


     
  • richardmitnick 7:38 pm on October 2, 2014 Permalink | Reply
    Tags: , , BOINC, ,   

    From WCG: “Global PC network gives researchers supercomputer power” 


    Sep 21 2014
    Joseph Hall

    Igor Jurisica wants you to help him conquer cancer.

    Oh, don’t worry, the Princess Margaret Cancer Centre scientist is not looking for money.

    But he would like to borrow your computer.

    In the age of molecular medicine, with its staggering genetic complexity, much cutting-edge cancer research has become a game of brute computational number crunching.

    And with access to laboratory supercomputers scarce and expensive, Jurisica has turned to a massive network of home and business PCs to run his research algorithms.

    “It’s basically a network of workstations around the globe,” says Jurisica, a computational biologist at the hospital and a University of Toronto professor.

    “When you’re not using your machine (it) can be donated for the project.”

    Known as the World Community Grid, the IBM-run network has gathered some 676,000 businesses and individuals globally who have volunteered about 2.9 million computers of varying capacities to help run scientific studies.

    Some 13,000 Canadian volunteers are currently donating time on about 67,000 devices.

    Begun last November, Jurisica’s Mapping Cancer Markers project has been granted access to about one-third of the machines worldwide, which gives him some 258 central processing unit (CPU) years’ worth of computing power to run his data each day.

    Igor Jurisica is using a global network of computers to discover more precise cancer treatments. Andrew Francis Wallace / Toronto Star

    That means a typical computer would have to run continuously for 258 years to process the data the network can work through in 24 hours.
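
    A quick back-of-the-envelope check unpacks those units; this is only a sketch, since real grid accounting weights each machine by its speed and availability.

        # One always-on machine contributes 24 CPU-hours per day, so a daily
        # output of N CPU-years is equivalent to N * 365 dedicated machines.
        HOURS_PER_YEAR = 24 * 365

        def always_on_machines(cpu_years_per_day):
            """How many dedicated machines would match this daily output?"""
            cpu_hours_per_day = cpu_years_per_day * HOURS_PER_YEAR
            return cpu_hours_per_day / 24

        print(always_on_machines(258))  # ~94,170 machines for the project's share
        print(always_on_machines(400))  # ~146,000 machines for the full grid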

    In aggregate, the full grid can generate more than 400 CPU years each day, which would rank it among the world’s 15 largest supercomputers, said Viktors Berstis, the senior IBM software engineer who runs the network.

    “When you have these big data problems, you have big processing problems to go with them,” Berstis said.

    “And so these kinds of projects that take many tens of thousands of years of CPU time are so massive that only the biggest supercomputers can handle them.”

    Again, the problem, Berstis said, is that an institution with a supercomputer must typically divvy up access to it among hundreds or thousands of competing researchers.

    “So no one researcher gets that supercomputer 24/7 for several years on end which is the equivalent of what we’re giving (them),” he said.

    “They are getting something extremely rare and they are getting it for free.”

    The grid, which is eager for more volunteers, is run through a Toronto-based central processor that accesses home and business computers when the donors are not using them, Berstis said.

    Anyone with a computer running Windows, Mac or Linux, or with an Android device, can join by going to the grid site and clicking the join link.

    That downloads a program to the home or business computer, which will run in the device’s background at the lowest priority, Berstis said.

    “The instant your computer has nothing else left to do for you, then it can work a piece of this big research problem,” he said.

    “We try to make this software very unobtrusive so it doesn’t bother anything else.”

    Volunteers can donate their unused capacity in a number of ways, even allowing project computing to be done in the microseconds between key strokes.

    Member machines contact the central Toronto processor when they’re ready for work and are sent a tiny portion of a project problem.

    The worked information is then sent back to the server where it is checked for accuracy and cobbled together with all the other incoming data.
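
    The sketch below is a toy model of that round trip: each work unit is handed to two simulated clients, and a result is accepted only when the replicas agree. Redundant computation with cross-checking is the general idea; WCG’s actual server logic is, of course, far more elaborate.

        # Toy redundant-result validation for a volunteer grid.
        from collections import defaultdict

        REPLICATION = 2                    # send each unit to two machines

        def compute(unit):
            return unit * unit             # stand-in for the real science code

        results = defaultdict(list)        # work unit -> answers from replicas
        assignments = [(u, r) for u in range(5) for r in range(REPLICATION)]

        for unit, _replica in assignments: # each tuple = one client's task
            results[unit].append(compute(unit))

        validated = {unit: answers[0]
                     for unit, answers in results.items()
                     if len(set(answers)) == 1}  # accept only agreeing replicas
        print(validated)                   # {0: 0, 1: 1, 2: 4, 3: 9, 4: 16}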

    Berstis said the grid code has been scoured line by line by IBM programmers for potential security problems and is likely to be the safest piece of software on any machine.

    He said the network also boasts environmental benefits.

    “When you have a supercomputer centre you have to have an air conditioning system that is almost as powerful as the computer to cool it back down so that the building doesn’t melt,” he said.

    The IBM grid is similar to one used by the earlier SETI — or Search for Extraterrestrial Intelligence — project, which linked millions of home computers to help scan the heavens for alien signals.

    Grid volunteers can also download screensavers related to the science projects (there are currently three) that their computers are helping to crunch.

    Jurisica’s cancer marker project is the largest of these and is looking to discover the genetic and molecular signatures of lung, prostate, ovarian and sarcoma cancers — a search of stupefying complexity.

    When the Human Genome Project released its map of our species’ DNA more than a decade ago, it opened the door to the possibility of personalized medicine, where an individual’s cancer or heart disease could be diagnosed and treated according to its specific genetic signatures.

    Unfortunately, the genome project also opened a Pandora’s box of complexity in medicine with the realization that any single gene could be regulated or influenced by a mesmerizing array of other genetic materials and their protein products.

    And an individual’s complex cancer signatures, for example, would determine whether their disease could be detected early or would respond to given therapies.

    Jurisica said, however, that one cancer biopsy may now generate some 40,000 potentially involved variables. That means finding a set of signatures for any particular cancer — and there may be dozens across the patient population — could be a daunting exercise.

    In its search for such signatures — or markers — the Princess Margaret project has so far used up more than 81,000 CPU years of computation.

    Berstis said IBM began building the service a decade ago as one of its “Good Citizen’s Projects” and that researchers are selected on the scientific value of their proposals.

    [Corrections to certain inaccuracies in the article: First, SETI@home is still running. Second, no mention was made that all of WCG runs on BOINC software from the Space Sciences Laboratory at U.C. Berkeley. Most important, long past is the day when WCG ran only when a computer was idle, or took last place among running processes. That was true when BOINC and WCG were much younger and home computers had little of today’s power. While you can calibrate down how much CPU and memory are used, there is little need to with quad-core and hyper-threaded dual-core processors. Just know that the BOINC process generates a great deal of heat, which must be dissipated.]
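
    For volunteers who do want to calibrate usage, the BOINC client honors a local preferences override file. The sketch below writes one; <max_ncpus_pct> and <cpu_usage_limit> are standard BOINC preference elements, but the data-directory path varies by operating system and the percentages shown are examples, not recommendations.

        # Sketch: throttle BOINC by writing global_prefs_override.xml.
        from pathlib import Path

        OVERRIDE = """<global_preferences>
          <max_ncpus_pct>50.0</max_ncpus_pct>      <!-- use at most half the cores -->
          <cpu_usage_limit>80.0</cpu_usage_limit>  <!-- run each busy core at 80% -->
        </global_preferences>
        """

        data_dir = Path("/var/lib/boinc-client")   # example Linux location
        (data_dir / "global_prefs_override.xml").write_text(OVERRIDE)
        # Then tell the running client to re-read preferences:
        #   boinccmd --read_global_prefs_override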

    See the full article here.


     
  • richardmitnick 4:19 pm on September 18, 2014 Permalink | Reply
    Tags: , , BOINC,   

    From WCG: New Team 

    September 18, 2014

    World Community Grid is pleased to welcome Vivere Ateneo as a new partner! Vivere Ateneo is associated with the Polytechnic School at the University of Palermo, Italy, and is committed to supporting World Community Grid as one of its philanthropic projects. You can learn more about their team here: http://ow.ly/BEArd

    Team Information
    Name: BOINC – Vivere Ateneo – Scuola Politecnica
    Created: 08/24/2014
    Captain: Ivan Marchese
    Country: ITALY
    Type: University or department
    Description: The team BOINC – Vivere Ateneo is a collaboration between Vivere Ateneo, the Polytechnic School of the University of Palermo and the IBM Foundation Italy to support World Community Grid’s distributed computing platform. The team aims to be a point of reference for anyone who wants to join volunteer computing projects dedicated to developing treatments for AIDS and cancer and to the clean-energy innovations of the future.

    BOINC Team Id: 31488


     