Tagged: Citizen Science

  • richardmitnick 10:28 pm on December 3, 2014 Permalink | Reply
    Tags: Citizen Science

    From isgtw: “Volunteer computing: 10 years of supporting CERN through LHC@home” 


    international science grid this week

    December 3, 2014
    Andrew Purcell

    LHC@home recently celebrated a decade since its launch in 2004. Through its SixTrack project, the LHC@home platform harnesses the power of volunteer computing to model the progress of sub-atomic particles traveling at nearly the speed of light around the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland. It typically simulates about 60 particles whizzing around the collider’s 27km-long ring for ten seconds, or up to one million loops. Results from SixTrack were used to help the engineers and physicists at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t cause damage by flying off course into the walls of the vacuum tube. It’s now also being used to carry out simulations relevant to the design of the next phase of the LHC, known as the High-Luminosity LHC.
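
    To give a feel for what such a tracking computation involves, here is a minimal toy sketch. It is not SixTrack itself, and the tune, aperture, and initial offsets are made-up illustrative values: each particle’s transverse position and angle are advanced one full revolution at a time by a linear one-turn map, repeated turn after turn.

    ```python
    import numpy as np

    # Toy turn-by-turn tracking (NOT SixTrack): each particle's transverse state
    # (x, x') is advanced one full ring revolution at a time by a linear one-turn
    # map, here a simple phase-space rotation. Tune and aperture are made up.
    def track(particles, tune=0.28, aperture=0.04, turns=10_000):
        """particles: (N, 2) array of (x, x')."""
        mu = 2.0 * np.pi * tune                          # betatron phase advance per turn
        one_turn = np.array([[np.cos(mu),  np.sin(mu)],
                             [-np.sin(mu), np.cos(mu)]])
        for turn in range(turns):
            particles = particles @ one_turn.T           # apply the one-turn map
            if np.any(np.abs(particles[:, 0]) > aperture):
                print(f"a particle reached the aperture on turn {turn}")
                break
        return particles

    # ~60 particles with small random initial offsets, as in the article's example;
    # a production SixTrack study would follow them for up to a million turns.
    rng = np.random.default_rng(1)
    final_state = track(rng.normal(scale=1e-3, size=(60, 2)))
    print("largest final offset:", np.abs(final_state[:, 0]).max(), "m")
    ```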

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    “The results of SixTrack played an essential role in the design of the LHC, and the high-luminosity upgrades will naturally require additional development work on SixTrack,” explains Frank Schmidt, who works in CERN’s Accelerators and Beam Physics Group of the Beams Department and is the main author of the SixTrack code. “In addition to its use in the design stage, SixTrack is also a key tool for the interpretation of data taken during the first run of the LHC,” adds Massimo Giovannozzi, who also works in CERN’s Accelerators and Beams Physics Group. “We use it to improve our understanding of particle dynamics, which will help us to push the LHC performance even further over the coming years of operation.” He continues: “Managing a project like SixTrack within LHC@home requires resources and competencies that are not easy to find: Igor Zacharov, a senior scientist at the Particle Accelerator Physics Laboratory (LPAP) of the Swiss Federal Institute of Technology in Lausanne (EPFL), provides valuable support for SixTrack by helping with BOINC integration.”

    Volunteer computing is a type of distributed computing through which members of the public donate computing resources (usually processing power) to aid research projects. Image courtesy Eduardo Diez Viñuela, Flickr (CC BY-SA 2.0).

    Before LHC@home was created, SixTrack was run only on desktop computers at CERN, using a platform called the Compact Physics Screen Saver (CPSS). This proved to be a useful tool for a proof of concept, but it was only with the launch of the LHC@home platform in 2004 that things really took off. “I am surprised and delighted by the support from our volunteers,” says Eric McIntosh, who formerly worked in CERN’s IT Department and is now an honorary member of the Beams Department. “We now have over 100,000 users all over the world and many more hosts. Every contribution is welcome, however small, as our strength lies in numbers.”

    Virtualization to the rescue

    Building on the success of SixTrack, the Virtual LHC@home project (formerly known as Test4Theory) was launched in 2011. It enables users to run simulations of high-energy particle physics using their home computers, with the results submitted to a database used as a common resource by both experimental and theoretical scientists working on the LHC.

    Whereas the code for SixTrack was ported for running on Windows, OS X, and Linux, the high-energy-physics code used by each of the LHC experiments is far too large to port in a similar way. It is also being constantly updated. “The experiments at CERN have their own libraries and they all run on Linux, while the majority of people out there have common-or-garden variety Windows machines,” explains Ben Segal, an honorary staff member of CERN’s IT department and chief technology officer of the Citizen Cyberscience Centre. “Virtualization is the way to solve this problem.”

    The birth of the LHC@home platform

    In 2004, Ben Segal and François Grey, who were both members of CERN’s IT department at the time, were asked to plan an outreach event for CERN’s 50th anniversary that would help people around the world to get an impression of the computational challenges facing the LHC. “I had been an early volunteer for SETI@home after it was launched in 1999,” explains Grey. “Volunteer computing was often used as an illustration of what distributed computing means when discussing grid technology. It seemed to me that it ought to be feasible to do something similar for LHC computing and perhaps even combine volunteer computing and grid computing this way.”

    “I contacted David Anderson, the person behind SETI@Home, and it turned out the timing was good, as he was working on an open-source platform called BOINC to enable many projects to use the SETI@home approach,” Grey continues. BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley in the US to manage the SETI@Home project, and uses otherwise idle CPU and GPU cycles on a computer to support scientific research.
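
    Conceptually, each volunteer machine repeats a simple fetch, compute, and report cycle that the BOINC client automates. The sketch below is purely illustrative: the functions are stand-ins rather than the real BOINC API, and the “science application” here is just a toy Monte Carlo calculation.

    ```python
    import random
    import time

    # Purely conceptual sketch of the fetch-compute-report cycle that the BOINC
    # client automates on every volunteer machine. All functions are stand-ins:
    # this is not the real BOINC API, and the "science app" is a toy Monte Carlo.
    def fetch_work_unit():
        """Stand-in for downloading a packaged unit of work from a project server."""
        return {"id": random.randint(0, 10**6), "samples": 100_000}

    def run_science_app(work_unit):
        """Stand-in for the science application run on idle CPU/GPU cycles."""
        hits = sum(random.random() ** 2 + random.random() ** 2 < 1.0
                   for _ in range(work_unit["samples"]))
        return {"id": work_unit["id"], "pi_estimate": 4.0 * hits / work_unit["samples"]}

    def report_result(result):
        """Stand-in for uploading the result for server-side validation."""
        print("returning result:", result)

    for _ in range(3):       # a real client keeps looping whenever the host is idle
        report_result(run_science_app(fetch_work_unit()))
        time.sleep(1)
    ```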

    “I vividly remember the day we phoned up David Anderson in Berkeley to see if we could make a SETI-like computing challenge for CERN,” adds Segal. “We needed a CERN application that ran on Windows, as over 90% of BOINC volunteers used that. The SixTrack people had ported their code to Windows and had already built a small CERN-only desktop grid to run it on, as they needed lots of CPU power. So we went with that.”

    A runaway success

    “I was worried that no one would find the LHC as interesting as SETI. Bear in mind that this was well before the whole LHC craziness started with the Angels and Demons movie, and news about possible mini black holes destroying the planet making headlines,” says Grey. “We made a soft launch, without any official announcements, in 2004. To our astonishment, the SETI@home community immediately jumped in, having heard about LHC@home by word of mouth. We had over 1,000 participants in 24 hours, and over 7,000 by the end of the week — our server’s maximum capacity.” He adds: “We’d planned to run the volunteer computing challenge for just three months, at the time of the 50th anniversary. But the accelerator physicists were hooked and insisted the project should go on.”

    Predrag Buncic, who is now coordinator of the offline group within the ALICE experiment, led work to create the CERN Virtual Machine in 2008. He, Artem Harutyunyan (former architect and lead developer of CernVM Co-Pilot), and Segal subsequently adopted this virtualization technology for use within Virtual LHC@home. This has made it significantly easier for the experiments at CERN to create their own volunteer computing applications, since it is no longer necessary for them to port their code. The long-term vision for Virtual LHC@home is to support volunteer-computing applications for each of the large LHC experiments.

    Growth of the platform

    The ATLAS experiment recently launched a project that simulates the creation and decay of supersymmetric bosons and fermions. “ATLAS@Home offers the chance for the wider public to participate in the massive computation required by the ATLAS experiment and to contribute to the greater understanding of our universe,” says David Cameron, a researcher at the University of Oslo in Norway. “ATLAS also gains a significant computing resource at a time when even more resources will be required for the analysis of data from the second run of the LHC.”

    CERN ATLAS New
    ATLAS

    ATLAS@home

    Meanwhile, the LHCb experiment has been running a limited test prototype for over a year now, with an application running Beauty physics simulations set to be launched for the Virtual LHC@home project in the near future. The CMS and ALICE experiments also have plans to launch similar applications.

    CERN LHCb New
    LHCb

    CERN CMS New
    CMS

    CERN ALICE New
    ALICE

    An army of volunteers

    “LHC@home allows CERN to get additional computing resources for simulations that cannot easily be accommodated on regular batch or grid resources,” explains Nils Høimyr, the member of the CERN IT department responsible for running the platform. “Thanks to LHC@home, thousands of CPU years of accelerator beam dynamics simulations for LHC upgrade studies have been done with SixTrack, and billions of events have been simulated with Virtual LHC@home.” He continues: “Furthermore, the LHC@home platform has been an outreach channel, giving publicity to LHC and high-energy physics among the general public.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 5:23 pm on November 28, 2014 Permalink | Reply
    Tags: Citizen Science

    From CERN: “ATLAS@Home looks for CERN volunteers” 

    ATLAS@home

    Mon 01 Dec 2014
    Rosaria Marraffino

    ATLAS@Home is a CERN volunteer computing project that runs simulated ATLAS events. As the project ramps up, the project team is looking for CERN volunteers to test the system before planning a bigger promotion for the public.

    The ATLAS@home outreach website.

    ATLAS@Home is a large-scale research project that runs ATLAS experiment simulation software inside virtual machines hosted by volunteer computers. “People from all over the world offer up their computers’ idle time to run simulation programmes to help physicists extract information from the large amount of data collected by the detector,” explains Claire Adam Bourdarios of the ATLAS@Home project. “The ATLAS@Home project aims to extrapolate the Standard Model at a higher energy and explore what new physics may look like. Everything we’re currently running is preparation for next year’s run.”

    ATLAS@Home became an official BOINC (Berkeley Open Infrastructure for Network Computing) project in May 2014. After a beta test with SUSY events and Z decays, real production started in the summer with inelastic proton-proton interaction events. Since then, the community has grown remarkably and now includes over 10,000 volunteers spread across five continents. “We’re running the full ATLAS simulation and the resulting output files containing the simulated events are integrated with the experiment standard distributed production,” says Bourdarios.

    Compared to other LHC@Home projects, ATLAS@Home is heavier in terms of network traffic and memory requirements. “From the start, we have been successfully challenging the underlying infrastructure of LHC@Home,” says Bourdarios. “Now we’re looking for CERN volunteers to go one step further before doing a bigger public promotion.”

    This simulated event display is created using ATLAS data.

    If you want to join the community and help the ATLAS experiment, you just need to download and run the necessary free software, VirtualBox and BOINC, which are available on NICE. Find out more about the project and how to join on the ATLAS@Home outreach website.

    “This project has huge outreach potential,” adds Bourdarios. “We hope to demonstrate how big discoveries are often unexpected deviations from existing models. This is why we need simulations. We’re also working on an event display, so that people can learn more about the events they have been producing and capture an image of what they have done.”

    If you have any questions about the ATLAS@Home project, e-mail atlas-comp-contact-home@cern.ch.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ATLAS@Home is a research project that uses volunteer computing to run simulations of the ATLAS experiment at CERN. You can participate by downloading and running a free program on your computer.

    ATLAS is a particle physics experiment at the Large Hadron Collider at CERN that searches for new particles and processes using head-on collisions of protons of extraordinarily high energy. Petabytes of data were recorded, processed and analyzed during the first three years of data taking, leading to up to 300 publications covering all aspects of the Standard Model of particle physics, including the discovery of the Higgs boson in 2012.

    Large scale simulation campaigns are a key ingredient for physicists, who permanently compare their data with both “known” physics and “new” phenomena predicted by alternative models of the universe, particles and interactions. This simulation runs on the WLCG Computing Grid and at any one point there are around 150,000 tasks running. You can help us run even more simulation by using your computer’s idle time to run these same tasks.

    No knowledge of particle physics is required, but for those interested in more details, at the moment we simulate the creation and decay of supersymmetric bosons and fermions, new types of particles that we would love to discover next year, as they would help us to shed light on the dark matter mystery!

    This project runs on BOINC software from UC Berkeley.
    Visit BOINC, download and install the software and attach to the project.


     
  • richardmitnick 3:22 pm on November 18, 2014 Permalink | Reply
    Tags: Citizen Science

    From NOVA: “Why There’s No HIV Cure Yet” 

    [After the NOVA article, I tell you how you and your family, friends, and colleagues can help to find a cure for AIDS and other diseases]

    PBS NOVA

    NOVA

    27 Aug 2014
    Alison Hill

    Over the past two years, the phrase “HIV cure” has flashed repeatedly across newspaper headlines. In March 2013, doctors from Mississippi reported that the disease had vanished in a toddler who was infected at birth. Four months later, researchers in Boston reported a similar finding in two previously HIV-positive men. All three were no longer required to take any drug treatments. The media heralded the breakthrough, and there was anxious optimism among HIV researchers. Millions of dollars of grant funds were earmarked to bring this work to more patients.

    But in December 2013, the optimism evaporated. HIV had returned in both of the Boston men. Then, just this summer, researchers announced the same grim results for the child from Mississippi. The inevitable questions mounted from the baffled public. Will there ever be a cure for this disease? As a scientist researching HIV/AIDS, I can tell you there’s no straightforward answer. HIV is a notoriously tricky virus, one that’s eluded promising treatments before. But perhaps just as problematic is the word “cure” itself.

    Science has its fair share of trigger words. Biologists prickle at the words “vegetable” and “fruit”—culinary terms which are used without a botanical basis—chemists wrinkle their noses at “chemical free,” and physicists dislike calling “centrifugal” a force—it’s not; it only feels like one. If you ask an HIV researcher about a cure for the disease, you’ll almost certainly be chastised. What makes “cure” such a heated word?

    HIV hijacks the body’s immune system by attacking T cells.

    It all started with a promise. In the early 1980s, doctors and public health officials noticed large clusters of previously healthy people whose immune systems were completely failing. The new condition became known as AIDS, for “acquired immunodeficiency syndrome.” A few years later, in 1984, researchers discovered the cause—the human immunodeficiency virus, now known commonly as HIV. On the day this breakthrough was announced, health officials assured the public that a vaccine to protect against the dreaded infection was only two years away. Yet here we are, 30 years later, and there’s still no vaccine. This turned out to be the first of many overzealous predictions about controlling the HIV epidemic or curing infected patients.

    The progression from HIV infection to AIDS and eventual death occurs in over 99% of untreated cases—making it more deadly than Ebola or the plague. Despite being identified only a few decades ago, AIDS has already killed 25 million people and currently infects another 35 million, and the World Health Organization lists it as the sixth leading cause of death worldwide.

    HIV disrupts the body’s natural disease-fighting mechanisms, which makes it particularly deadly and complicates efforts to develop a vaccine against it. Like all viruses, HIV gets inside individual cells in the body and hijacks their machinery to make thousands of copies of itself. HIV replication is especially hard for the body to control because the white blood cells it infects, and eventually kills, are a critical part of the immune system. Additionally, when HIV copies its genes, it does so sloppily. This causes it to quickly mutate into many different strains. As a result, the virus easily outwits the body’s immune defenses, eventually throwing the immune system into disarray. That gives other obscure or otherwise innocuous infections a chance to flourish in the body—a defining feature of AIDS.

    Early Hope

    In 1987, the FDA approved AZT as the first drug to treat HIV. With only two years between when the drug was identified in the lab and when it was available for doctors to prescribe, it was—and remains—the fastest approval process in the history of the FDA. AZT was widely heralded as a breakthrough. But as the movie Dallas Buyers Club poignantly retells, AZT was not the miracle drug many hoped. Early prescriptions often elicited toxic side-effects and only offered a temporary benefit, as the virus quickly mutated to become resistant to the treatment. (Today, the toxicity problems have been significantly reduced, thanks to lower doses.) AZT remains a shining example of scientific bravura and is still an important tool to slow the infection, but it is far from the cure the world had hoped for.

    Then, in the mid-1990s, some mathematicians began probing the data. Together with HIV scientists, they suggested that by taking three drugs together, we could avoid the problem of drug resistance. The chance that the virus would have enough mutations to allow it to avoid all drugs at once, they calculated, would simply be too low to worry about. When the first clinical trials of these “drug cocktails” began, both mathematical and laboratory researchers watched the levels of virus drop steadily in patients until they were undetectable. They extrapolated this decline downwards and calculated that, after two to three years of treatment, all traces of the virus should be gone from a patient’s body. When that happened, scientists believed, drugs could be withdrawn, and finally, a cure achieved. But when the time came for the first patients to stop their drugs, the virus again seemed to outwit modern medicine. Within a few weeks of the last pill, virus levels in patients’ blood sprang up to pre-treatment levels—and stayed there.
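
    That combination argument can be made concrete with a back-of-the-envelope calculation. The numbers below (the per-mutation probability and the virions produced per day) are illustrative assumptions rather than figures from the article, but they show why resistance to one drug emerges almost immediately while simultaneous resistance to three is vanishingly unlikely.

    ```python
    # Back-of-the-envelope version of the "drug cocktail" argument. All numbers
    # are illustrative assumptions, not values from the article.
    mutation_rate = 3e-5      # assumed chance of one specific point mutation per new virion
    virions_per_day = 1e10    # assumed daily virus production in an untreated patient

    def resistant_virions_per_day(n_drugs):
        """Expected new virions per day carrying a resistance mutation for every drug,
        assuming one specific mutation per drug and independent mutations."""
        return (mutation_rate ** n_drugs) * virions_per_day

    for n in (1, 2, 3):
        print(f"{n} drug(s): ~{resistant_virions_per_day(n):.1e} fully resistant virions per day")
    # 1 drug  -> ~3e+05 per day: resistance is essentially inevitable
    # 2 drugs -> ~9e+00 per day: still appears within days
    # 3 drugs -> ~3e-04 per day: essentially never, which is why cocktails work
    ```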

    In the three decades since, over 25 more highly-potent drugs have been developed and FDA-approved to treat HIV. When two to five of them are combined into a drug cocktail, the mixture can shut down the virus’s replication, prevent the onset of AIDS, and return life expectancy to a normal level. However, patients must continue taking these treatments for their entire lives. Though better than the alternative, drug regimens are still inconvenient and expensive, especially for patients living in the developing world.

    Given modern medicine’s success in curing other diseases, what makes HIV different? By definition, an infection is cured if treatment can be stopped without the risk of it resurfacing. When you take a week-long course of antibiotics for strep throat, for example, you can rest assured that the infection is on track to be cleared out of your body. But not with HIV.

    A Bad Memory

    The secret to why HIV is so hard to cure lies in a quirk of the type of cell it infects. Our immune system is designed to store information about infections we have had in the past; this property is called “immunologic memory.” That’s why you’re unlikely to be infected with chickenpox a second time or catch a disease you were vaccinated against. When an infection grows in the body, the white blood cells that are best able to fight it multiply repeatedly, perfecting their infection-fighting properties with each new generation. After the infection is cleared, most of these cells will die off, since they are no longer needed. However, to speed the counter-attack if the same infection returns, some white blood cells will transition to a hibernation state. They don’t do much in this state but can live for an extremely long time, thereby storing the “memory” of past infections. If provoked by a recurrence, these dormant cells will reactivate quickly.

    This near-immortal, sleep-like state allows HIV to persist in white blood cells in a patient’s body for decades. White blood cells infected with HIV will occasionally transition to the dormant state before the virus kills them. In the process, the virus also goes temporarily inactive. By the time drugs are started, a typical infected person contains millions of these cells with this “latent” HIV in them. Drug cocktails can prevent the virus from replicating, but they do nothing to the latent virus. Every day, some of the dormant white blood cells wake up. If drug treatment is halted, the latent virus particles can restart the infection.

    HIV researchers call this huge pool of latent virus the “barrier to a cure.” Everyone’s looking for ways to get rid of it. It’s a daunting task, because although a million HIV-infected cells may seem like a lot, there are around a million times that many dormant white blood cells in the whole body. Finding the ones that contain HIV is a true needle-in-a-haystack problem. All that remains of a latent virus is its DNA, which is extremely tiny compared to the entire human genome inside every cell (about 0.001% of the size).

    Defining a Cure

    Around a decade ago, scientists began to talk amongst themselves about what a hypothetical cure could look like. They settled on two approaches. The first would involve purging the body of latent virus so that if drugs were stopped, there would be nothing left to restart the infection. This was often called a “sterilizing cure.” It would have to be done in a more targeted and less toxic way than previous attempts of the late 1990s, which, because they attempted to “wake up” all of the body’s dormant white blood cells, pushed the immune system into a self-destructive overdrive. The second approach would instead equip the body with the ability to control the virus on its own. In this case, even if treatment was stopped and latent virus reemerged, it would be unable to produce a self-sustaining, high-level infection. This approach was referred to as a “functional cure.”

    The functional cure approach acknowledged that latency alone was not the barrier to a cure for HIV. There are other common viruses that have a long-lived latent state, such as the Epstein-Barr virus that causes infectious mononucleosis (“mono”), but they rarely cause full-blown disease when reactivated. HIV is, of course, different because the immune system in most people is unable to control the infection.

    The first hint that a cure for HIV might be more than a pipe-dream came in 2008 in a fortuitous human experiment later known as the “Berlin patient.” The Berlin patient was an HIV-positive man who had also developed leukemia, a blood cancer to which HIV patients are susceptible. His cancer was advanced, so in a last-ditch effort, doctors completely cleared his bone marrow of all cells, cancerous and healthy. They then transplanted new bone marrow cells from a donor.

    Fortunately for the Berlin patient, doctors were able to find a compatible bone marrow donor who carried a unique HIV-resistance mutation in a gene known as CCR5. They completed the transplant with these cells and waited.

    For the last five years, the Berlin patient has remained off treatment without any sign of infection. Doctors still cannot detect any HIV in his body. While the Berlin patient may be cured, this approach cannot be used for most HIV-infected patients. Bone marrow transplants are extremely risky and expensive, and they would never be conducted in someone who wasn’t terminally ill—especially since current anti-HIV drugs are so good at keeping the infection in check.

    Still, the Berlin patient was an important proof-of-principle case. Most of the latent virus was likely cleared out during the transplant, and even if the virus remained, most strains couldn’t replicate efficiently given the new cells with the CCR5 mutation. The Berlin patient case provides evidence that at least one of the two cure methods (sterilizing or functional), or perhaps a combination of them, is effective.

    Researchers have continued to try to find more practical ways to rid patients of the latent virus in safe and targeted ways. In the past five years, they have identified multiple anti-latency drug candidates in the lab. Many have already begun clinical trials. Each time, people grow optimistic that a cure will be found. But so far, the results have been disappointing. None of the drugs have been able to significantly lower levels of latent virus.

    In the meantime, doctors in Boston have attempted to tease out which of the two cure methods was at work in the Berlin patient. They conducted bone marrow transplants on two HIV-infected men with cancer—but this time, since HIV-resistant donor cells were not available, they just used typical cells. Both patients continued their drug cocktails during and after the transplant in the hopes that the new cells would remain HIV-free. After the transplants, no HIV was detectable, but the real test came when these patients volunteered to stop their drug regimens. When they remained HIV-free a few months later, the results were presented at the International AIDS Society meeting in July 2013. News outlets around the world declared that two more individuals had been cured of HIV.

    It quickly became clear that everyone had spoken too soon. Six months later, researchers reported that the virus had suddenly and rapidly returned in both individuals. Latent virus had likely escaped the detection methods available—which are not sensitive enough—and persisted at low, but significant levels. Disappointment was widespread. The findings showed that even very small amounts of latent virus could restart an infection. It also meant that the anti-latency drugs in development would need to be extremely potent to give any hope of a cure.

    But there was one more hope—the “Mississippi baby.” A baby was born to an HIV-infected mother who had not received any routine prenatal testing or treatment. Tests revealed high levels of HIV in the baby’s blood, so doctors immediately started the infant on a drug cocktail, to be continued for life.

    The mother and child soon lost touch with their health care providers. When they were located again a few years later, doctors learned that the mother had stopped giving drugs to the child several months prior. The doctors administered all possible tests to look for signs of the virus, both latent and active, but they didn’t find any evidence. They chose not to re-administer drugs, and a year later, when the virus was still nowhere to be found, they presented the findings to the public. It was once again heralded as a cure.

    Again, it was not to be. Just last month, the child’s doctors announced that the virus had sprung back unexpectedly. It seemed that even starting drugs as soon as infection was detected in the newborn could not prevent the infection from returning over two years later.

    Hope Remains

    Despite our grim track record with the disease, HIV is probably not incurable. Although we don’t have a cure yet, we’ve learned many lessons along the way. Most importantly, we should be extremely careful about using the word “cure,” because for now, we’ll never know if a person is cured until they’re not cured.

    Clearing out latent virus may still be a feasible approach to a cure, but the purge will have to be extremely thorough. We need drugs that can carefully reactivate or remove latent HIV, leaving minimal surviving virus while avoiding the problems that befell earlier tests that reactivated the entire immune system. Scientists have proposed multiple, cutting-edge techniques to engineer “smart” drugs for this purpose, but we don’t yet know how to deliver this type of treatment safely or effectively.

    As a result, most investigations focus on traditional types of drugs. Researchers have developed ways to rapidly scan huge repositories of existing medicines for their ability to target latent HIV. These methods have already identified compounds that were previously used to treat alcoholism, cancer, and epilepsy, and researchers are repurposing them to be tested in HIV-infected patients.

    Mathematicians are also helping HIV researchers evaluate new treatments. My colleagues and I use math to take data collected from just a few individuals and fill in the gaps. One question we’re focusing on is exactly how much latent virus must be removed to cure a patient, or at least to let them stop their drug cocktails for a few years. Each cell harboring latent virus is a potential spark that could restart the infection. But we don’t know when the virus will reactivate. Even once a single latent virus awakens, there are still many barriers it must overcome to restart a full-blown infection. The less latent virus that remains, the less chance there is that the virus will win this game of chance. Math allows us to work out these odds very precisely.
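
    As a rough illustration of that kind of calculation (a toy model with made-up parameters, not the actual analysis behind this work), one can treat each latent cell as reactivating at some small daily rate, with only a fraction of reactivations managing to reignite the infection; shrinking the reservoir then stretches the expected time to rebound.

    ```python
    import numpy as np

    # Toy model of rebound after stopping treatment: each latent cell reactivates
    # at some small daily rate, and each reactivation has some chance of reigniting
    # a full infection. All parameter values are made up for illustration only.
    rng = np.random.default_rng(0)

    def median_rebound_days(n_latent_cells, reactivations_per_cell_per_day=1e-4,
                            p_establish_infection=0.01, n_trials=10_000):
        """Median days until some reactivating cell restarts the infection,
        treating successful reactivations as a Poisson process."""
        rate = n_latent_cells * reactivations_per_cell_per_day * p_establish_infection
        return float(np.median(rng.exponential(1.0 / rate, size=n_trials)))

    for n_cells in (1_000_000, 1_000, 1):   # shrink the reservoir a thousandfold each step
        print(f"{n_cells:>9} latent cells -> median rebound ~ {median_rebound_days(n_cells):,.0f} days")
    ```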

    Our calculations show that “apparent cures”—in which a patient’s latent virus levels are low enough to escape detection for months or years without treatment—are not a medical anomaly. In fact, math tells us that they are an expected result of these chance dynamics. It can also help researchers determine how good an anti-latency drug should be before it’s worth testing in a clinical trial.

    Many researchers are working to augment the body’s ability to control the infection, providing a functional cure rather than a sterilizing one. Studies are underway to render anyone’s immune cells resistant to HIV, mimicking the CCR5 mutation that gives some people natural resistance. Vaccines that could be given after infection, to boost the immune response or protect the body from the virus’s ill effects, are also in development.

    In the meantime, treating all HIV-infected individuals—which has the added benefit of preventing new transmissions—remains the best way to control the epidemic and reduce mortality. But the promise of “universal treatment” has also not materialized. Currently, even in the U.S., only 25% of HIV-positive people have their viral levels adequately suppressed by treatment. Worldwide, for every two individuals starting treatment, three are newly infected. While there’s no doubt that we’ve made tremendous progress in fighting the virus, we have a long way to go before the word “cure” is not taboo when it comes to HIV/AIDS.

    See the full article here.

    Did you know that you can help in the fight against AIDS? By donating time on your computer to the FightAIDS@Home project of World Community Grid, you can become a part of the solution. The work is called “crunching” because your computer crunches computational data, the results of which are then fed back into the necessary lab work. We save researchers literally millions of hours of lab time in this process.
    Visit World Community Grid (WCG) or Berkeley Open Infrastructure for Network Computing (BOINC). Download the BOINC software and install it on your computer. Then visit WCG and attach to the FAAH project. The project will send you computational work units. Your computer will process them and send the results back to the project, which will then send you more work units. It is that simple. You do nothing, unless you want to get into the nuts and bolts of the BOINC software. If you take up this work, and if you see it as valuable, please tell your family, friends and colleagues, anyone with a computer, even an Android tablet. We found out that my wife’s oncologist’s father in Brazil is a cruncher on two projects from WCG.

    This is the project’s website. Take a look.

    While you are visiting BOINC and WCG, look around at all of the very valuable projects being conducted at some of the world’s most distinguished universities and scientific institutions. You can attach to as many as you like, on one or a number of computers. You can only be a help here, participating in Citizen Science.

    This is a look at the present and past projects at WCG:

    Please visit the project pages-

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation.

    IBM – Smarter Planet

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 1:47 pm on November 11, 2014 Permalink | Reply
    Tags: Citizen Science

    From DDDT at WCG: “Discovering Dengue Drugs – Together” 

    New WCG Logo

    10 Nov 2014
    By: Dr. Stan Watowich, PhD
    University of Texas Medical Branch (UTMB) in Galveston, Texas

    Summary
    For week five of our decade of discovery celebrations we’re looking back at the Discovering Dengue Drugs – Together project, which helped researchers at the University of Texas Medical Branch at Galveston search for drugs to help combat dengue – a debilitating tropical disease that threatens 40% of the world’s population. Thanks to World Community Grid volunteers, researchers have identified a drug lead that has the potential to stop the virus in its tracks.


    Dengue fever, also known as “breakbone fever”, causes excruciating joint and muscle pain, high fever and headaches. Severe dengue, known as “dengue hemorrhagic fever”, has become a leading cause of hospitalization and death among children in many Asian and Latin American countries. According to the World Health Organization (WHO), over 40% of the world’s population is at risk from dengue; another study estimated there were 390 million cases in 2010 alone.

    The disease is a mosquito-borne infection found in tropical and sub-tropical regions – primarily in the developing world. It belongs to the flavivirus family of viruses, together with Hepatitis C, West Nile and Yellow Fever.

    Despite the fact dengue represents a critical global health concern, it has received limited attention from affluent countries until recently and is widely considered to be a neglected tropical disease. Currently, no approved vaccines or treatments exist for the disease. We launched Discovering Dengue Drugs – Together on World Community Grid in 2007 to search for drugs to treat dengue infections using a computer-based discovery approach.

    In the first phase of the project, we aimed to identify compounds that could be used to develop dengue drugs. Thanks to the computing power donated by World Community Grid volunteers, my fellow researchers and I at the University of Texas Medical Branch in Galveston, Texas, screened around three million chemical compounds to determine which ones would bind to the dengue virus and disable it.
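
    The triage step of such a virtual screen can be sketched in a few lines: every compound gets a predicted binding score, and only the tightest predicted binders go forward to the next stage. The scores below are random placeholders rather than real docking output, and the library is scaled down from millions of compounds to keep the example quick.

    ```python
    import random

    # Schematic triage step of a virtual screen: rank candidate compounds by a
    # predicted binding energy and keep only the strongest predicted binders.
    # The "scores" are random placeholders, not output of a real docking engine,
    # and the library is scaled down from ~3 million compounds for a quick demo.
    random.seed(42)
    library = {f"compound_{i}": random.gauss(-6.0, 1.5) for i in range(300_000)}

    threshold = -10.0   # kcal/mol; more negative means tighter predicted binding
    hits = sorted((name for name, score in library.items() if score < threshold),
                  key=library.get)

    print(f"{len(hits)} of {len(library)} compounds pass the {threshold} kcal/mol cut")
    print("top candidates for follow-up:", hits[:5])
    ```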

    By 2009 we had found several thousand promising compounds to take to the next stage of testing. We began identifying the strongest compounds from the thousands of potentials, with the goal of turning these into molecules that could be suitable for human clinical trials.

    We have recently made an exciting discovery using insights from Discovering Dengue Drugs – Together to guide additional calculations on our web portal for advanced computer-based drug discovery, DrugDiscovery@TACC. A molecule has demonstrated success in binding to and disabling a key dengue enzyme that is necessary for the virus to replicate.

    Furthermore, it also shows signs of being able to effectively disable related flaviviruses, such as the West Nile virus. Importantly, our newly discovered drug lead also demonstrates no negative side effects such as adverse toxicity, carcinogenicity or mutagenicity risks, making it a promising antiviral drug candidate for dengue and potentially other flaviviruses. We are working with medicinal chemists to synthesize variants of this exciting candidate molecule with the goal of improving its activity for planned pre-clinical and clinical trials.

    I’d like to express my gratitude for the dedication of World Community Grid volunteers. The advances we are making, and our improved understanding of drug discovery software and its current limitations, would not have been possible without your donated computing power.

    If you’d like to help researchers make more ground-breaking discoveries like this – and have the chance of winning some fantastic prizes – take part in our decade of discovery competition by encouraging your friends to sign up to World Community Grid today. There’s a week left and the field is wide open – get started today!

    See the full article here.

    “World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.”

    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation.

    IBM – Smarter Planet

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 4:52 pm on November 6, 2014 Permalink | Reply
    Tags: Citizen Science

    From FAAH at WCG: “Teamwork yields experimental support for FightAIDS@Home calculations” 

    New WCG Logo

    By: The FightAIDS@Home research team
    6 Nov 2014

    Summary
    Imaging studies have now confirmed some of the computational predictions made during FightAIDS@Home, providing important confirmation of our methodology and the value of your computational results. This work is ongoing, but promises to increase our understanding of how HIV protease can be disrupted.

    The “exo-site” discovered in HIV protease (shown here in green), showing the original bound 4d9 fragment (shown here as red and orange sticks) and the volume (shown as the orange mesh) that is being targeted by FightAIDS@Home. (image credit: Stefano Forli, TSRI)

    Our lab at the Scripps Research Institute, La Jolla, is part of the HIV Interaction and Viral Evolution (HIVE) Center – a group of investigators with expertise in HIV crystallography, virology, molecular biology, biochemistry, synthetic chemistry and computational biology. This means that we have world-class resources available to verify and build upon our computational work, including the nuclear magnetic resonance (NMR) facility at the Scripps Research Institute, Florida. NMR is a technique for determining the molecular structure of a chemical sample, and therefore is very useful for validating some of the predictions made during the computational phase of FightAIDS@Home.

    We’re excited to announce that our collaborators at Scripps Florida have now optimized their NMR experiments and have been able to characterize the binding of promising ligands with the prospective allosteric sites on the HIV protease. These sites represent new footholds in the search for therapies that defeat viral drug resistance. The NMR experiment allows us to detect the location of the interactions between the candidate inhibitors and the protein, but unlike X-ray crystallography experiments, these interactions are measured in solution, which better represents the biological environment.

    In fact, the first results from the NMR experiments validated the exo site we so thoroughly investigated in FightAIDS@Home. As a result, we now have experimental evidence that a small molecule binds to the exo site in solution with structural effects that seem to perturb the dynamic behavior of protease, even with a known inhibitor in the active site.

    There are many more NMR experiments still to run, but another advantage of NMR over crystallography is that it does not require the lengthy step of growing diffraction-quality crystals. This allows higher experimental throughput, so we look forward to experimental confirmation of many more compounds in much shorter time. So far we have shipped 15 compounds to test and another batch is going to be sent this week. The new compounds will help to validate another potential interaction site on one of HIV protease’s two movable “flaps”.

    Once the validation is completed, we will proceed to test a number of compounds that we identified in different FightAIDS@Home experiments for all of the target protease allosteric sites.

    As always, thank you for your support! This research would not be possible without your valuable computing time.

    The Scripps research team needs your help to continue making progress on developing new treatments for AIDS! Take part in our decade of discovery competition by encouraging your friends to sign up to World Community Grid today to start donating their computer or mobile device’s computing power to FightAIDS@Home. There’s just over a week left and some great prizes are up for grabs – get started today!

    Here’s to another decade of discovery.

    See the full article here.

    “World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.”

    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation.

    IBM – Smarter Planet

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 8:26 am on November 1, 2014 Permalink | Reply
    Tags: Citizen Science

    From RAS: “When did galaxies settle down?” 

    Royal Astronomical Society

    30 October 2014
    Media contact
    Dr Robert Massey
    Royal Astronomical Society
    Tel: +44 (0)20 7734 3307 / 4582
    Mob: +44 (0)794 124 8035
    rm@ras.org.uk

    Science contact
    Dr Brooke Simmons
    University of Oxford
    Tel: +44 (0)1865 273637
    brooke.simmons@astro.ox.ac.uk

    Astronomers have long sought to understand exactly how the universe evolved from its earliest history to the cosmos we see around us in the present day. In particular, the way that galaxies form and develop is still a matter for debate. Now a group of researchers have used the collective efforts of the hundreds of thousands of people that volunteer for the Galaxy Zoo project to shed some light on this problem. They find that galaxies may have settled into their current form some two billion years earlier than previously thought.

    A Hubble Space Telescope image of a spiral galaxy seen when the Universe was less than a third of its current age, yet showing the same barred feature as much older, settled disk galaxies. Credit: NASA, ESA, J. Kartaltepe (NOAO), C. Lintott (Oxford), H. Ferguson (STScI), S. Faber (UCO).

    Dr Brooke Simmons of the University of Oxford and her collaborators describe the work in a paper in Monthly Notices of the Royal Astronomical Society. The team set Zoo volunteers the task of classifying the shapes of tens of thousands of galaxies observed by the Hubble Space Telescope. These objects are typically very distant, so we see them as they appeared more than 10 billion years ago, when the universe was about 3 billion years old, less than a quarter of its present age.

    NASA Hubble Telescope
    NASA Hubble schematic
    NASA/ESA Hubble

    The newly classified galaxies are striking in that they look a lot like those in today’s universe, with disks, bars and spiral arms. But theorists predict that these should have taken another 2 billion years to begin to form, so things seem to have been settling down a lot earlier than expected.

    A European Southern Observatory image of the barred spiral galaxy NGC 1365, rotated to match the orientation of the first image. NGC 1365 is about 56 million light years away, so we see it as it appears 56 million years ago, or 10 billion years later than the galaxy in the HST image. Credit: ESO/IDA/Danish 1.5 m/ R. Gendler, J-E. Ovaldsen, C. Thöne, and C. Feron.

    Brooke comments: “When we started looking for these galaxies, we didn’t really know what we’d find. We had predictions from galaxy simulations that we shouldn’t find any of the barred features that we see in nearby, evolved galaxies, because very young galaxies might be too agitated for them to form.”

    “But we now know that isn’t the case. With the public helping us search through many thousands of images of distant galaxies, we discovered that some galaxies settle very early on in the Universe.”

    See the full article here.

    The Royal Astronomical Society (RAS), founded in 1820, encourages and promotes the study of astronomy, solar-system science, geophysics and closely related branches of science.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 7:44 am on November 1, 2014 Permalink | Reply
    Tags: Citizen Science

    From Science Daily: “Planet discovered that won’t stick to a schedule” 

    ScienceDaily Icon

    Science Daily

    October 30, 2014
    From materials by Jim Shelton, Yale University

    For their latest discovery, Yale astronomers and the Planet Hunters program have found a low-mass, low-density planet with a punctuality problem.

    Silhouette of telescope (stock image). Credit: © Tryfonov / Fotolia

    The new planet, called PH3c, is located 2,300 light years from Earth and has an atmosphere loaded with hydrogen and helium. It is described in the Oct. 29 online edition of The Astrophysical Journal.

    The elusive orb nearly avoided detection. This is because PH3c has a highly inconsistent orbit time around its sun, due to the gravitational influence of other planets in its system. “On Earth, these effects are very small, only on the scale of one second or so,” said Joseph Schmitt, a Yale graduate student and first author of the paper. “PH3c’s orbital period changed by 10.5 hours in just 10 orbits.”

    That inconsistency kept it from being picked up by automated computer algorithms that search stellar light curves and identify regular dips caused by objects passing in front of stars.
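
    A rough way to see why is to fold a light curve at the planet’s mean period: a punctual planet’s transits stack up into a clear dip, while one whose transit times wander by many hours gets smeared out. The sketch below uses a synthetic light curve with illustrative numbers, not real Kepler data.

    ```python
    import numpy as np

    # Why a strictly periodic search can miss a planet with large transit-timing
    # variations: fold a synthetic light curve at the mean period and compare the
    # stacked transit depth for a punctual planet versus one whose transit times
    # wander by several hours. All numbers here are illustrative only.
    rng = np.random.default_rng(0)
    period, depth, duration, n_orbits = 66.0, 1e-3, 0.3, 10          # days

    def folded_transit_depth(timing_jitter_hours):
        t = np.arange(0.0, period * n_orbits, 0.01)                   # ~14-minute cadence
        flux = 1.0 + rng.normal(0.0, 2e-4, t.size)                    # photometric noise
        for k in range(n_orbits):                                     # inject the transits
            mid = (k + 0.5) * period + rng.normal(0.0, timing_jitter_hours / 24.0)
            flux[np.abs(t - mid) < duration / 2] -= depth
        phase = t % period
        in_window = np.abs(phase - period / 2) < duration / 2         # fold at the mean period
        return 1.0 - flux[in_window].mean()

    print(f"punctual planet : folded depth ~ {folded_transit_depth(0.0):.1e}")
    print(f"wandering planet: folded depth ~ {folded_transit_depth(10.0):.1e}")
    ```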

    Luckily, Planet Hunters came to the rescue. The program, which has found more than 60 planet candidates since 2010, enlists citizen scientists to check survey data from the Kepler spacecraft. Planet Hunters recently unveiled a new website and an expanded scientific mission.

    NASA Kepler Telescope
    NASA/Kepler

    “It harnesses the human dimension of science,” said Debra Fischer, who leads the exoplanets group at Yale and is a co-author of the paper. “Computers can’t find the unexpected, but people can, when they eyeball the data.”

    More than 300,000 volunteers are part of Planet Hunters, which is coordinated by Yale and the University of Oxford. The program’s revamped website will allow Planet Hunters to analyze data more quickly than before, Fischer said. In addition, Planet Hunters is launching an effort to see if there is a correlation between types of stars and the planets that form around them.

    “I think we’ll be able to contribute some really unique science this way,” Fischer said.

    Not only did Planet Hunters spot PH3c, but the discovery also enabled astronomers to better characterize two other planets — one on each side of PH3c. An outer planet, PH3d, is slightly larger and heavier than Saturn, for example. An inner planet, PH3b, may have a rocky composition, like Earth.

    “Finding the middle planet was key to confirming the others and allowing us to find their masses,” Schmitt said. “The outer planet’s orbital period also changes slightly, by about 10 minutes. You need to see both planets’ changing orbital periods in order to find out the masses of the planets. One planet doesn’t give enough information.”

    There’s also a quirky aspect of the planetary trio, Schmitt added. The outer planet’s year is 1.91 times longer than the middle planet’s year, and the middle planet’s year is 1.91 times longer than the inner planet’s year.

    “We’re not sure if this is just a coincidence or whether this might tell us something about how the planets were formed,” Schmitt said.

    See the full article here.

    ScienceDaily is one of the Internet’s most popular science news web sites. Since starting in 1995, the award-winning site has earned the loyalty of students, researchers, healthcare professionals, government agencies, educators and the general public around the world. Now with more than 3 million monthly visitors, ScienceDaily generates nearly 15 million page views a month and is steadily growing in its global audience.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 2:04 pm on August 14, 2014 Permalink | Reply
    Tags: Citizen Science

    From SPACE.com: “After Moon Flyby, Vintage NASA Spacecraft to Study the Sun” True Citizen Science 

    space-dot-com logo

    SPACE.com

    August 14, 2014
    Elizabeth Howell

    As a vintage spacecraft soars out of Earth’s vicinity, the private team working with it plans to use the probe for solar science for as long as they can stay in touch with the satellite.

    The minds behind the so-called ISEE-3 Reboot Project have been controlling the 36-year-old International Sun-Earth Explorer (ISEE-3) for the past few weeks. At first they planned to park it close to Earth, but they abandoned that plan after finding out that the probe was out of the pressurant needed to move the craft.


    At least some of the 13 science instruments are still working, however. So the old spacecraft will do one of the things it was originally tasked to do: study solar weather. Its measurements will be compared with those taken by the network of satellites closer to Earth, such as NASA’s Solar TErrestrial RElations Observatory (STEREO).

    NASA STEREO

    “By comparing the measurements between these spacecraft, we can get some idea of the scale sizes of the turbulence of the solar wind and the structure within the solar wind,” said Christopher Scott, a United Kingdom-based project scientist with STEREO, in a Google+ Hangout on ISEE-3 Sunday (Aug. 10).
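
    One standard way to extract a scale from two spacecraft (shown here only as a generic illustration, not necessarily the exact analysis the STEREO and ISEE-3 teams will use) is to cross-correlate the signals they record: the lag at which the correlation peaks, multiplied by the solar-wind speed, gives a characteristic size for the structures sweeping past both probes.

    ```python
    import numpy as np

    # Generic two-spacecraft illustration (not necessarily the teams' exact method):
    # cross-correlate the solar-wind signal seen at two probes to find the time lag
    # between them; lag x solar-wind speed gives a characteristic structure scale.
    rng = np.random.default_rng(3)
    dt = 60.0                                        # one-minute cadence, in seconds
    n = int(2 * 24 * 3600 / dt)                      # two days of synthetic data
    structure = np.cumsum(rng.normal(size=n))        # slowly varying density proxy
    true_lag_steps = 45                              # 45 minutes between the probes

    probe_a = structure + rng.normal(0.0, 0.5, n)
    probe_b = np.roll(structure, true_lag_steps) + rng.normal(0.0, 0.5, n)

    corr = np.correlate(probe_a - probe_a.mean(), probe_b - probe_b.mean(), mode="full")
    lag_seconds = abs(int(np.argmax(corr)) - (n - 1)) * dt
    print(f"recovered lag: {lag_seconds / 60:.0f} minutes")

    solar_wind_speed_km_s = 400.0                    # typical value, assumed
    print(f"implied structure scale: ~{lag_seconds * solar_wind_speed_km_s:,.0f} km")
    ```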

    He added this would be important information for space weather forecasts, which allow scientists to predict how severe a storm could be when it reaches Earth. Strong solar storms have the potential to damage satellites in orbit or even cause ill effects to power systems on the ground.

    ISEE-3 passed within about 7,500 miles (12,000 kilometers) of the moon on Sunday before continuing on its orbit around the sun. Officials on the broadcast predict they will be able to hear from the probe for about the next couple of months.

    Before going into lunar space, ISEE-3 passed through a part of Earth’s magnetic field, specifically the magnetopause (the outer limit of the magnetosphere) and the bow shock (the boundary where the incoming solar wind is abruptly slowed ahead of the magnetosphere). The University of Iowa is now examining data collected during the fly-through, said co-leader Dennis Wingo.

    Artistic rendition of the Earth’s magnetopause. The magnetopause is where the pressure from the solar wind and the planet’s magnetic field are equal. The position of the Sun would be far to the left in this image.

    “To me, it’s absolutely thrilling that we’re getting all this space weather,” Wingo said during the broadcast. Officials also noted that learning about space weather in our solar system could help researchers learn more about space weather in other solar systems.

    The founders behind the ISEE-3 project raised roughly $160,000 through crowdfunding in order to open communication with and attempt to move the spacecraft.

    During the Sunday broadcast, project co-leader Keith Cowing said that most donations were in the $10 to $50 range, and came largely from contributors who do not describe themselves as space people.

    “I tweeted a joke about disco once and I suddenly got donations from people saying, ‘Hey, I heard your comment about disco,'” Cowing said.

    ISEE-3 was launched in the 1970s to examine solar activity, and was repurposed for flying by two comets, among other tasks. NASA put the spacecraft into hibernation in 1998, where it remained until the group made contact with it again this year under a Space Act Agreement.

    See the full article here.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     
  • richardmitnick 4:04 pm on January 10, 2014 Permalink | Reply
    Tags: , , , Citizen Science, ,   

    From Symmetry: “Citizen scientists discover hidden galaxies at record speed” 

    January 10, 2014
    Kelen Tuttle

    The distant universe looks a little clearer, thanks to tens of thousands of citizen scientists who classified more than 6 million images over the past three days.

    Courtesy of VICS82, thanks to TERAPIX/CNRS/INSU/CASU

    This week, encouraged by a program on the BBC, more than 55,000 citizen scientists powered up their computers, navigated to Spacewarps.org and, over the course of just 72 hours, made a difference to the future of astrophysics.

    “I’ve never seen anything like it,” says Phil Marshall, an astrophysicist at the Kavli Institute for Particle Astrophysics and Cosmology, jointly located at SLAC and Stanford University. “The response we’ve had has been incredible and really shows what sort of impact citizen science could have on astronomy in the big data era.”

    The Space Warps website invites members of the general public to inspect images captured by some of the most powerful survey telescopes on Earth, searching for an unusual phenomenon called “gravitational lensing.” The site was conceived of by University of Oxford astronomer Aprajita Verma, University of Tokyo research fellow Anupreeta More, and Marshall, and was designed and built by the Zooniverse team at Adler Planetarium in Chicago in consultation with a team of dedicated citizen scientists.

    In a gravitational lens, the light from a distant object—such as a faraway galaxy—interacts with another galaxy on its way to our telescopes on Earth. Due to the nature of gravity and space, that intermediate galaxy bends the light rays, focusing the original light and actually making it easier to see on Earth.
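
    For the curious, the textbook point-lens relation behind that focusing effect (standard gravitational-lensing theory, not anything specific to Space Warps) sets the characteristic angular scale, the Einstein radius:

```latex
\[
\theta_E \;=\; \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{ls}}{D_l\,D_s}}
\]
```

    Here M is the mass of the intervening galaxy and D_l, D_s and D_ls are angular-diameter distances to the lens, to the source, and between the two. For galaxy-scale lenses the resulting image separations are of order an arcsecond, which is why the searches described below have to comb the data at that scale.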

    Only a few hundred gravitational lenses have been discovered by astronomers to date, but Marshall says that there are many more out there. They’re just hard to find because it’s extremely time-intensive to scan telescope data arcsecond by arcsecond for the telltale signs of a gravitational lens. Computers (or, as Marshall calls them, robots) trained to recognize patterns have difficulty identifying gravitational lenses because the lensed features tend to be faint and tricky to distinguish from the spiral arms, tidal tails and satellites of ordinary galaxies.

    “Robots can get quite confused if the lenses are not obvious,” he says. “Humans are better at finding the more difficult ones because they understand the context of the images.”

    That’s where the citizen scientists come in. Through the Space Warps website, anyone with a few extra minutes can take a quick tutorial on identifying gravitational lenses and then can click through telescope images to search for new ones.
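
    A toy sketch of how many such clicks can be folded into one answer per image (the volunteer “skill” values and the simple Bayesian update below are assumptions for illustration; this is not the actual Space Warps analysis):

```python
# Toy illustration: combine several volunteers' yes/no classifications into a
# probability that an image contains a lens, via a simple Bayesian update.
def update_lens_probability(prior, votes):
    """
    prior : initial probability the image contains a lens (e.g. how rare lenses are)
    votes : list of (said_lens, skill) pairs, where `skill` is the assumed
            probability that this volunteer classifies an image correctly
    """
    p = prior
    for said_lens, skill in votes:
        if said_lens:
            likelihood_lens, likelihood_dud = skill, 1.0 - skill
        else:
            likelihood_lens, likelihood_dud = 1.0 - skill, skill
        numerator = likelihood_lens * p
        p = numerator / (numerator + likelihood_dud * (1.0 - p))
    return p

# Hypothetical example: lenses are rare (1% prior); three volunteers say "lens",
# one says "not a lens"; the skill values are made up for illustration only.
votes = [(True, 0.8), (True, 0.7), (False, 0.6), (True, 0.9)]
print(round(update_lens_probability(0.01, votes), 3))
```

    The point of such a scheme is that even low-skill votes nudge the probability a little, while a handful of reliable classifiers can pull a rare candidate well above the initial prior.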

    This week, the first of the three daily episodes of the BBC television series Stargazing Live, which highlights astronomy research in the United Kingdom, included a challenge encouraging viewers to find undiscovered galaxies at the edge of space through SpaceWarps.org. The response was immense. Tens of thousands of people visited the website and, in just three days, made more than 6 million image classifications.

    “It was brilliant,” says Marshall. “People saw the site on their TV sets, powered up their computers or picked up their iPads, and started classifying. At the peak, they were classifying at a rate of about one million images per hour.”

    The images that viewers classified were from infrared data taken with the CFHT telescope in Hawaii and the [ESO] VISTA telescope in Chile by the VICS82 survey team, led by Jim Geach of the University of Hertfordshire. The patch of sky in question, though previously imaged with optical telescopes, had never before been searched for gravitational lenses in the infrared.

    “Because they weren’t found in optical data, these newly found galaxies will be either dusty or very far away, or both,” Marshall says. “The former is interesting because a lot of star formation is hidden behind dust, while the latter is interesting because we see them shining at a time when the universe was very young. Studying both is important for understanding how galaxies form and how they evolve, and the lenses give us a magnified view of them.”

    In addition to kick-starting research into these dusty and distant galaxies, Marshall says that the overwhelming amount of interest shown by citizen scientists in the past week may also have implications for future data-intensive experiments like the Large Synoptic Survey Telescope, which will, among other things, search for signs of dark matter and dark energy. When it turns on near the beginning of the next decade, LSST is expected to produce more than 100 petabytes of data in 10 years.

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.



    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 8:11 am on January 12, 2013 Permalink | Reply
    Tags: , , , , , Citizen Science,   

    For WCG from isgtw: “Desktop power helps map protein dance” 


    World Community Grid

    Proteins are part of a complex social network, and rarely act alone. ‘Protein–protein interaction’ is the term used for when two or more proteins ‘partner up’ and bind together to carry out a different biological function. While experimental techniques are used to identify the relationships between one protein and another in its cellular neighborhood, computational simulations are still needed to uncover the more complex web of connections among multiple protein partners.
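
    For a flavor of what a docking calculation actually evaluates, here is a deliberately crude sketch (hypothetical coordinates and a bare contact count instead of a real energy function; none of this is the Decrypthon/HCMD code):

```python
# Toy illustration only: score one rigid docking pose by counting atom pairs,
# one atom from each protein, that fall within a distance cutoff.
# Real docking codes use far richer, physically motivated energy terms.
from math import dist  # Euclidean distance between points, Python 3.8+

def contact_score(coords_a, coords_b, cutoff=4.5):
    """Count cross-protein atom pairs closer than `cutoff` (angstroms)."""
    return sum(1 for a in coords_a for b in coords_b if dist(a, b) < cutoff)

# Hypothetical coordinates (angstroms) for two tiny fragments in one trial pose.
receptor = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.5, 0.0)]
ligand = [(3.5, 0.0, 1.0), (5.0, 0.0, 1.0)]

print(contact_score(receptor, ligand))  # higher score = more interfacial contacts
```

    A real docking run evaluates a score like this (but physically meaningful) over a vast number of relative orientations per protein pair, which is where the computational cost discussed below comes from.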

    Neuromuscular disease is a generic term for a group of disorders (more than 200 in all) that impair muscle functioning, either directly through muscle damage (muscular dystrophy) or indirectly by damaging nerves. It affects one in 2,000 people. These chronic diseases lead to a decrease in muscle strength, causing serious disabilities in motor functions (moving, breathing, etc.). The most well known is muscular dystrophy, in which contraction of the muscle leads to disruption of the outer membrane of the muscle cells and eventual weakening and wasting of the muscle. Dystrophin is part of a protein complex that connects the cytoskeleton of a muscle fiber to the surrounding tissue framework through the cell membrane. This complex does not form correctly in muscular dystrophy. (Image courtesy Alessandra Carbone)

    Distributed computing power from the World Community Grid (WCG) has recently aided the Help Cure Muscular Dystrophy (HCMD) project in capturing all the possible molecular and atomic connections between 2,280 human proteins. The analyzed proteins include those known to mutate and induce different forms of neuromuscular disorders, including muscular dystrophy.


    HCMD is part of a larger-scale venture, the Decrypthon Molecular Docking Project, an alliance between AFM (French Muscular Dystrophy Association), CNRS (French National Center for Scientific Research) and IBM. The partners are using World Community Grid resources to help decipher and map the functions of interacting proteins found in humans against worldwide repositories of information such as the Research Collaboratory for Structural Bioinformatics (RCSB) protein databank.


    The first phase of the HCMD project, completed in June 2007, scrutinized relationships among 168 proteins using molecular docking simulations. The researchers estimated that it would have taken over 14,000 years of computational time on a 2 GHz PC to reveal and rule out all possible docking conformations for all 168 proteins. A distributed calculation, however, reduced the processing time considerably: with 6,000 to 8,000 donor machines contributing, the task took under 26 weeks. To test 2,280 proteins on a one-to-one basis for phase II of the project, researchers needed a method to significantly reduce the number of configurations they would have to check. Molecular docking data from analysis of the 168 proteins (known to form 84 complexes) helped them develop a fast docking algorithm to predict potential partners for this large pool of proteins.
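
    The reason a pre-filter was essential for phase II is plain combinatorics: the number of candidate protein pairs grows roughly with the square of the number of proteins. A quick back-of-envelope check, using only the counts quoted above:

```python
# Back-of-envelope sketch: how the all-against-all docking workload scales with
# the number of proteins, using the counts given in the article.
def unordered_pairs(n):
    """Number of distinct unordered protein pairs (self-pairings excluded)."""
    return n * (n - 1) // 2

print(unordered_pairs(168))   # phase I:  14,028 candidate pairs
print(unordered_pairs(2280))  # phase II: 2,598,060 candidate pairs
```

    Going from 168 to 2,280 proteins multiplies the number of candidate pairs by a factor of roughly 185, before even counting the many orientations to be tested for each pair.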

    See the full article here.

    The World Community Grid is comparable to one of the world’s top 15 supercomputers [currently at 590.673 TeraFLOPS]. Its software has been downloaded onto over two million computers, which together have completed almost 700,000 years of scientific computation.

    World Community Grid (WCG) runs on BOINC (Berkeley Open Infrastructure for Network Computing) software from the Space Sciences Laboratory at UC Berkeley.


    isgtw is an international weekly online publication that covers distributed computing and the research it enables.

    “World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.”

    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the fields of distributed computing, grid computing, and citizen cyberscience. BOINC stands for Berkeley Open Infrastructure for Network Computing.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    Computing for Sustainable Water

    World Community Grid is a social initiative of IBM Corporation


    ScienceSprings is powered by MAINGEAR computers


     