Tagged: Citizen Science

  • richardmitnick 7:56 am on April 28, 2015 Permalink | Reply
    Tags: Citizen Science

    From ANU: “Amateur stargazers find supernovas in distant galaxies” 


    Australian National University

    2 April 2015
    No Writer Credit


    More than 40,000 amateur astronomers have classified two million unidentified heavenly bodies found by the SkyMapper telescope at The Australian National University (ANU).

    ANU Skymapper telescope
    ANU Skymapper telescope interior
    ANU SkyMapper telescope

    Among the haystack of celestial data, the volunteers uncovered five sought-after supernovas, extremely bright exploding stars, which provide crucial information about the history and future of the universe.

    “It was a huge success, everyone was really excited to take part,” said Dr Richard Scalzo, from the ANU Research School of Astronomy and Astrophysics.

    “One volunteer was so determined to find a supernova that he stayed online for 25 hours. Unfortunately he didn’t find one, but he did find an unusual variable star, which we think might explode in the next 700 million years or so.”

    The SkyMapper telescope, at the Siding Spring Observatory near Coonabarabran in central New South Wales, is creating a digital survey of the entire southern sky with a detailed record of more than a billion stars and galaxies.

    Siding Spring Campus
    Siding Spring Observatory

    Under the volunteer project, amateur astronomers looked for differences in photos of the same patch of sky, taken at different times. Apart from the supernovas, they also found a number of variable stars and a raft of asteroids, some never previously discovered.
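    At its core this is image differencing. The toy sketch below is only an illustration of the idea, not the SkyMapper pipeline; the arrays, threshold, and injected source are invented. It subtracts a reference exposure from a new exposure of the same field and flags pixels whose residual rises well above the noise.

    ```python
    # Toy sketch of difference imaging (not the SkyMapper pipeline).
    # Assumes the two exposures are already aligned and photometrically matched.
    import numpy as np

    def find_transient_candidates(new_img, ref_img, n_sigma=5.0):
        """Return (row, col) positions whose residual exceeds n_sigma times the noise."""
        diff = new_img - ref_img
        noise = np.std(diff)                          # crude global noise estimate
        rows, cols = np.where(diff > n_sigma * noise)
        return list(zip(rows.tolist(), cols.tolist()))

    # Invented data: a flat reference frame plus one injected bright source.
    rng = np.random.default_rng(0)
    ref = rng.normal(100.0, 5.0, size=(64, 64))
    new = ref.copy()
    new[30, 40] += 200.0
    print(find_transient_candidates(new, ref))        # -> [(30, 40)]
    ```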

    Because they are so bright, supernovas are used as beacons to measure the most distant galaxies. Their study led to the discovery of the accelerating expansion of the universe by Professor Brian Schmidt, for which he shared the 2011 Nobel Prize in Physics.

    “When a star explodes and becomes a supernova, for approximately a month it shines more brightly than all the billions of other stars in its galaxy put together,” Dr Scalzo said.

    “The wide range of supernovas tells us how different stars evolve and end their lives in different ways,” he said.

    “Identifying them is something that human eyes are very good at. It’s hard to train a computer to do it. We had five different people classify each object, and for the borderline objects up to 20 people.”

    The scientists hope that the large data set from the program will enable them to train computers to automate the identification process.
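    As a rough illustration of how volunteer classifications could seed such automation (a hedged sketch, not the ANU team's method; the vote lists, feature values, and threshold are invented), one can collapse the multiple volunteer votes per object into a consensus label and fit a simple classifier to it:

    ```python
    # Hedged sketch: consensus labels from volunteer votes, then a simple classifier.
    from collections import Counter
    from sklearn.linear_model import LogisticRegression

    def consensus_label(votes, threshold=0.5):
        """1 if at least `threshold` of volunteers said 'supernova', else 0."""
        return int(Counter(votes)["supernova"] / len(votes) >= threshold)

    votes_per_candidate = [
        ["supernova", "supernova", "not", "supernova", "supernova"],  # 5 classifiers
        ["not", "not", "not", "supernova", "not"],
    ]
    features = [[0.9, 12.3], [0.1, 3.4]]   # hypothetical per-candidate measurements

    labels = [consensus_label(v) for v in votes_per_candidate]   # -> [1, 0]
    clf = LogisticRegression().fit(features, labels)
    print(clf.predict([[0.8, 10.0]]))      # classify an unseen candidate
    ```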

    The program was a five-day supernova hunt on the Zooniverse platform, which is run by a team based at the University of Oxford, in collaboration with the annual BBC Stargazing Live. It attracted volunteers in Britain and as far afield as the United States and New Zealand.

    “It was wonderful to work with a survey like SkyMapper,” said Professor Chris Lintott, Principal Investigator for Zooniverse and Oxford Professor of Astrophysics.

    “Our volunteers and the millions of Stargazing Live viewers will have got a real kick from hearing about discoveries made a matter of moments after our project launched. We’re looking forward to more collaboration and more discoveries soon,” he said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ANU Campus

    ANU is a world-leading university in Australia’s capital city, Canberra. Our location points to our unique history, ties to the Australian Government and special standing as a resource for the Australian people.

    Our focus on research as an asset, and an approach to education, ensures our graduates are in demand the world over for their abilities to understand, and apply vision and creativity to addressing, complex contemporary challenges.

     
  • richardmitnick 8:14 pm on April 9, 2015 Permalink | Reply
    Tags: Citizen Science, Milky Way Project - Zooniverse

    From NASA Science: “Citizen Scientists Discover Yellow ‘Space Balls'” 

    NASA Science
    Science News

    April 9, 2015
    Rachel Molina

    Citizen scientists scanning images from NASA’s Spitzer Space Telescope, an orbiting infra-red observatory, recently stumbled upon a new class of curiosities that had gone largely unrecognized before: yellow balls.

    NASA Spitzer Telescope
    Spitzer

    “The volunteers started chatting about the yellow balls they kept seeing in the images of our galaxy, and this brought the features to our attention,” said Grace Wolf-Chase of the Adler Planetarium in Chicago.

    The Milky Way Project is one of many “citizen scientist” projects making up the Zooniverse website, which relies on crowdsourcing to help process scientific data. For years, volunteers have been scanning Spitzer’s images of star-forming regions—places where clouds of gas and dust are collapsing to form clusters of young stars. Professional astronomers don’t fully understand the process of star formation; much of the underlying physics remains a mystery. Citizen scientists have been helping by looking for clues.

    Before the yellow balls popped up, volunteers had already noticed green bubbles with red centers, populating a landscape of swirling gas and dust. These bubbles are the result of massive newborn stars blowing out cavities in their surroundings. When the volunteers started reporting that they were finding objects in the shape of yellow balls, the Spitzer researchers took note.

    The rounded features captured by the telescope, of course, are not actually yellow, red, or green—they just appear that way in the infrared, color-assigned images that the telescope sends to Earth. The false colors provide a way for humans to talk about infrared wavelengths of light their eyes cannot actually see.

    “With prompting by the volunteers, we analyzed the yellow balls and figured out that they are a new way to detect the early stages of massive star formation,” said Charles Kerton of Iowa State University, Ames. “The simple question of ‘Hmm, what’s that?’ led us to this discovery.”

    A thorough analysis by the team led to the conclusion that the yellow balls precede the green bubbles, representing a phase of star formation that takes place before the bubbles form.

    “Basically, if you wind the clock backwards from the bubbles, you get the yellow balls,” said Kerton.

    An artist’s concept shows how “yellow balls” fit into the process of star formation.

    Researchers think the green bubble rims are made largely of organic molecules called polycyclic aromatic hydrocarbons (PAHs). PAHs are abundant in the dense molecular clouds where stars coalesce. Blasts of radiation and winds from newborn stars push these PAHs into spherical shells that look like green bubbles in Spitzer’s images. The red cores of the green bubbles are made of warm dust that has not yet been pushed away from the windy stars.

    How do the yellow balls fit in?

    “The yellow balls are a missing link,” says Wolf-Chase. They represent a transition “between very young embryonic stars buried in dense, dusty clouds and slightly older, newborn stars blowing the bubbles.”

    Essentially, the yellow balls mark places where the PAHs (green) and the dust (red) have not yet separated. The superposition of green and red makes yellow.
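    A minimal sketch of how such a three-band composite is assembled, using the Spitzer band-to-color mapping quoted in the JPL item further down this page (3.6 µm to blue, 8 µm to green, 24 µm to red); the pixel values here are invented:

    ```python
    # Invented pixel values; mapping: 24 um -> red, 8 um -> green, 3.6 um -> blue.
    import numpy as np

    def false_color(band_3p6, band_8, band_24):
        """Stack three normalized band images into an RGB array."""
        norm = lambda band: np.clip(band / band.max(), 0.0, 1.0)
        return np.dstack([norm(band_24), norm(band_8), norm(band_3p6)])

    # A pixel bright at both 8 um (PAHs) and 24 um (warm dust) renders as yellow.
    rgb = false_color(np.array([[0.1]]), np.array([[1.0]]), np.array([[1.0]]))
    print(rgb)   # ~[[[1.0, 1.0, 0.1]]]  red + green = yellow
    ```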

    So far, the volunteers have identified more than 900 of these compact, yellow features. The multitude gives researchers plenty of chances to test their hypotheses and learn more about the way stars form.

    Meanwhile, citizen scientists continue to scan Spitzer’s images for new finds. Green bubbles. Red cores. Yellow balls. What’s next? You could be the one who makes the next big discovery. To get involved, go to zooniverse.org and click on “The Milky Way Project.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA leads the nation on a great journey of discovery, seeking new knowledge and understanding of our planet Earth, our Sun and solar system, and the universe out to its farthest reaches and back to its earliest moments of existence. NASA’s Science Mission Directorate (SMD) and the nation’s science community use space observatories to conduct scientific studies of the Earth from space, to visit and return samples from other bodies in the solar system, and to peer out into our Galaxy and beyond. NASA’s science program seeks answers to profound questions that touch us all.

    This is NASA’s science vision: using the vantage point of space to achieve with the science community and our partners a deep scientific understanding of our planet, other planets and solar system bodies, the interplanetary environment, the Sun and its effects on the solar system, and the universe beyond. In so doing, we lay the intellectual foundation for the robotic and human expeditions of the future while meeting today’s needs for scientific information to address national concerns, such as climate change and space weather. At every step we share the journey of scientific exploration with the public and partner with others to substantially improve science, technology, engineering and mathematics (STEM) education nationwide.

    NASA

     
  • richardmitnick 5:18 pm on February 19, 2015 Permalink | Reply
    Tags: Citizen Science

    From Symmetry: “Physics for the people” 

    Symmetry

    February 19, 2015
    Manuel Gnida and Kathryn Jepsen

    Citizen scientists dive into particle physics and astrophysics research.

    Illustration by Manuel Gnida, SLAC / Images courtesy of CERN, ESA/Hubble & NASA

    Citizen science, scientific work done by the general public, is having a moment.

    In June 2014, the term “citizen science” was added to the Oxford English Dictionary. This month, the American Association for the Advancement of Science—one of the world’s largest general scientific societies—dedicated several sessions at its annual meeting to the topic. A two-day preconference organized by the year-old Citizen Science Association attracted an estimated 700 participants.

    Citizen scientists interested in taking part in particle physics research have few options at the moment, but they may have a new opportunity on the horizon with the Large Synoptic Survey Telescope.

    LSST Exterior
    LSST Interior
    LSST Camera
    LSST

    Hunting the Higgs

    Citizen science projects have helped researchers predict the structure of proteins, transcribe letters from Albert Einstein, and monitor populations of bees and invasive crabs. The citizen science portal “Zooniverse,” launched in 2007, has attracted 1.3 million users from around the world. According to a report by Oxford University astronomer Brooke Simmons, the first Zooniverse project, “Galaxy Zoo,” has so far published 57 scientific papers with the help of citizen scientists.

    Of the 27 projects on the Zooniverse portal, just one allows volunteers to help with the analysis of real data from a particle physics experiment. “Higgs Hunters,” launched in November 2014, invites citizen scientists to help physicists find evidence of strange particle behavior in images of collisions from the Large Hadron Collider.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC

    When protons collide in the LHC, their energy transfers briefly into matter, forming different types of particles, which then decay into less massive particles and eventually dissipate back into energy. Some particle collisions create Higgs bosons, particles discovered in 2012 at the LHC.

    “We don’t yet know much about how the Higgs boson decays,” says particle physicist Alan Barr at Oxford University in the UK, one of the leads of the Higgs Hunters project. “One hypothesis is that the Higgs decays into new, lighter Higgs particles, which would travel some distance from the center of our detector where LHC’s protons collide. We wouldn’t see these new particles until they decayed themselves into known particles, generating tracks that emerge ‘out of thin air,’ away from the center.”

    So far, almost 5,000 volunteers have participated in the Higgs Hunters project. Over the past three months, they have classified 600,000 particle tracks.

    Why turn to citizen science for this task?

    “It turns out that our current algorithms aren’t trained well enough to identify the tracks we’re interested in,” Barr says. “The human eye can do much better. We hope that we can use the information from our volunteers to train our algorithms and make them better for the second run of LHC.”

    Humans are also good at finding problems an algorithm might miss. Many participants flagged as “weird” an image showing what looked like a shower of particles called muons passing through the detector, Barr says. “When we looked at it in more detail, it turned out that it was a very rare detector artifact, falsely identified as a real event by the algorithms.”

    Volunteers interested in Higgs Hunters have only a couple of months left to participate. Barr estimates that by April, the project will have collected enough data for researchers to proceed with an in-depth analysis.

    Distortions in space

    Armchair astrophysicists can find their own project in the Zooniverse. “SpaceWarps” asks volunteers to look for distortions in images of faraway galaxies—evidence of gravitational lensing.

    Gravitational lensing occurs when the gravitational force of massive galaxies or galaxy clusters bends the space around them so that light rays traveling near them follow curved paths.

    Einstein predicted this effect in his Theory of General Relativity. You can see an approximation of it by looking at a light through the bottom of a wine glass. Gravitational lensing is used to determine distances in the universe—key information in measuring the expansion of the universe and understanding dark energy.
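    For a sense of the angular scale involved, the size of the lensing pattern around a point-like mass is set by the Einstein radius, theta_E = sqrt(4 G M D_ls / (c^2 D_l D_s)). The back-of-the-envelope sketch below uses invented, illustrative masses and distances (treated loosely as angular-diameter distances), not numbers from the article:

    ```python
    # Back-of-the-envelope Einstein radius for a point-mass lens (illustrative numbers).
    import math

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    C = 2.998e8          # speed of light, m/s
    M_SUN = 1.989e30     # kg
    MPC = 3.086e22       # metres per megaparsec

    def einstein_radius_arcsec(mass_kg, d_lens, d_source, d_lens_source):
        """theta_E = sqrt(4 G M D_ls / (c^2 D_l D_s)), returned in arcseconds."""
        theta_rad = math.sqrt(4 * G * mass_kg * d_lens_source /
                              (C ** 2 * d_lens * d_source))
        return math.degrees(theta_rad) * 3600

    # A ~10^13 solar-mass lens roughly halfway to a distant source galaxy:
    print(einstein_radius_arcsec(1e13 * M_SUN, 1000 * MPC, 2000 * MPC, 1000 * MPC))
    # -> a few arcseconds, the scale of the arcs volunteers look for
    ```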

    Recognizing gravitational lensing is a difficult task for a computer program, but a relatively easy one for a human, says Phil Marshall, a scientist at the Kavli Institute for Particle Astrophysics and Cosmology at Stanford University and SLAC National Accelerator Laboratory.

    Marshall, one of three principal investigators for SpaceWarps, says he sees a lot of potential in the interface between humans and machines. “They both have different skills that complement each other.”

    According to the SpaceWarps website, more than 51,000 volunteers have made more than 8 million classifications to date and have discovered dozens of candidates for gravitational lenses that were not detected by algorithms. The project is currently adding new data for people to analyze.

    The Large Synoptic Survey Telescope

    Citizen science may become particularly important for another project Marshall is interested in: the Large Synoptic Survey Telescope, to be built on a mountaintop in Chile.

    Technicians recently completed a giant double mirror for the project, and its groundbreaking will take place this spring. Beginning in 2022, LSST will take a complete image of the entire southern sky every few nights. It is scheduled to run for a decade, collecting 6 million gigabytes of data each year. The information collected may help scientists unravel cosmic mysteries such as dark matter and dark energy.

    “Nobody really knows what citizen science will look like for LSST,” Marshall says. “However, a good approach would be to make use of the fact that humans are very good at understanding confusing things. They could help us inspect images for odd features, potentially spotting new things or pointing out problems with the data.”

    Citizen scientists could also help with the LSST budget.

    Henry Sauermann at the Georgia Institute of Technology and Chiara Franzoni at the Politecnico di Milano in Italy recently studied seven Zooniverse projects started in 2010. They calculated the efforts of unpaid volunteers over just the first 180 days to be worth $1.5 million.

    But the value of citizen science to LSST may depend on whether it can attract a dedicated group of amateur researchers.

    Sauermann and Franzoni’s study showed that 10 percent of contributors to the citizen science projects they studied completed an average of almost 80 percent of all of the work.
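    The statistic itself is easy to reproduce in outline. The sketch below uses made-up classification counts, not Sauermann and Franzoni's data, and computes the share of all work contributed by the most active 10 percent of volunteers:

    ```python
    # Made-up counts, illustrating how concentrated the volunteer effort can be.
    def top_decile_share(classifications_per_volunteer):
        counts = sorted(classifications_per_volunteer, reverse=True)
        k = max(1, len(counts) // 10)          # the most active 10% of volunteers
        return sum(counts[:k]) / sum(counts)

    example = [500, 300, 200, 40, 30, 20, 10, 5, 3, 2]   # ten hypothetical volunteers
    print(top_decile_share(example))                      # -> 0.45 (45% from the top 10%)
    ```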

    “We also see that with SpaceWarps,” Marshall says. “Most Internet users have a very short attention span.”

    It’s all about how well the researchers design the project, he says.

    “It must be easy to get started and, at the same time, empower the participant enough to make serious contributions to science,” Marshall says. “It’s on us to provide volunteers with interesting things to do.”

    See the full article here.

    I am surprised that the distinguished authors of this essay forgot one of the earliest sets of Citizen Science projects, those stemming from the LHC, namely lhc@home, now named Sixtrack@home, and vLHC@home, which began life as test4theory@home.

    LHC Sixtrack

    vLHC Logo

    BOINC

    These projects run on software from BOINC at UC Berkeley. BOINC enables the home computer user to participate in a large variety of scientific projects by donating time on their computers running Windows, Mac and Linux software. Dave Anderson and his cohorts at UC Berkeley practically invented Citizen Science. There are many projects of all kinds running on BOINC software. Please visit the BOINC web site and think about helping on some projects. Also, in a related group of projects is World Community Grid (WCG), a section of the Smarter Planet social effort from IBM Corporation. WCG projects also run on BOINC software. Please visit the WCG web site and take a look.

    WCGLarge

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:14 pm on January 28, 2015 Permalink | Reply
    Tags: Citizen Science

    From WCG: “Using grid computing to understand an underwater world” 

    New WCG Logo

    SustainableWater screensaver

    28 Jan 2015
    By: Gerard P. Learmonth Sr., M.B.A., M.S., Ph.D.
    University of Virginia

    The Computing for Sustainable Water (CFSW) project focused on the Chesapeake Bay watershed in the United States. This is the largest watershed in the US and covers all or part of six states (Virginia, West Virginia, Maryland, Delaware, Pennsylvania, and New York) and Washington, D.C., the nation’s capital. The Bay has been under environmental pressure for many years. Previous efforts to address the problem have been unsuccessful. As a result, the size of the Bay’s anoxic region (dead zone) continues to affect the native blue crab (Callinectes sapidus) population.

    Callinectes sapidus – the blue crab

    The problem is largely a result of nutrient flow (nitrogen and phosphorus) into the Bay that occurs due to agricultural, industrial, and land development activities. Federal, state, and local agencies attempt to control nutrient flow through a set of incentives known as Best Management Practices (BMPs). Entities adopting BMPs typically receive payments. Each BMP is believed to be helpful in some way for controlling nutrient flow. However, the effectiveness of the various BMPs has not been studied on an appropriately large scale. Indeed, there is no clear scientific evidence for the effectiveness of some BMPs that have already been widely adopted.

    The Computing for Sustainable Water project conducted a set of large-scale simulation experiments of the impact of BMPs on nutrient flow into the Chesapeake Bay and the resulting environmental health of the Bay. Table 1 lists the 23 BMPs tested in this project. Initially, a simulation run with no BMPs was produced as a baseline case. Then each individual BMP was run separately and compared with the baseline. Table 2 shows the results of these statistical comparisons.

    Table 1. Best Management Practices employed in the Chesapeake Bay watershed

    Table 2. Statistical results comparing each BMP to a baseline (no-BMPs) simulation experiment.

    Student’s t-tests of individual BMPs compared to the base case of no BMPs. Key: * = significant at α = 0.10; ** = significant at α = 0.05; *** = significant at α = 0.01.
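    For readers unfamiliar with the test, the comparison described above boils down to something like the following sketch. The load values are invented for illustration; this is not the project's actual analysis code.

    ```python
    # Invented load values; illustrates a Student's t-test of one BMP vs. the baseline.
    from scipy import stats

    baseline_loads = [102.1, 98.4, 101.7, 99.9, 100.6]   # simulated loads, no BMPs
    bmp_loads = [91.3, 93.8, 90.2, 94.1, 92.5]           # same metric with one BMP applied

    t_stat, p_value = stats.ttest_ind(bmp_loads, baseline_loads)
    print(t_stat, p_value)
    # A p-value below 0.01 would correspond to the "***" significance level in Table 2.
    ```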

    These results identify several BMPs that are effective in reducing the corresponding nitrogen and phosphorus loads entering the Chesapeake Bay. In particular, BMPs 4, 7, and 23 are highly effective. These results are very informative for policymakers not only in the Chesapeake Bay watershed but globally as well, because many regions of the world experience similar problems and employ similar BMPs.

    In all, World Community Grid members facilitated over 19.1 million experiments, including runs that combined multiple BMPs in order to gauge their joint effectiveness. The analysis of these combination experiments is ongoing.

    We would like to once again express our gratitude to the World Community Grid community. A project of this size and scope simply would not have been possible without your help.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.

    WCG projects run on BOINC software from UC Berkeley.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-
    Outsmart Ebola Together

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    Computing for Sustainable Water

     
  • richardmitnick 3:37 pm on January 27, 2015 Permalink | Reply
    Tags: Citizen Science

    From JPL: “Citizen Scientists Lead Astronomers to Mystery Objects in Space” 

    JPL

    January 27, 2015
    Whitney Clavin
    Jet Propulsion Laboratory, Pasadena, California
    818-354-4673
    whitney.clavin@jpl.nasa.gov

    Volunteers using the web-based Milky Way Project brought star-forming features nicknamed “yellowballs” to the attention of researchers, who later showed that they are a phase of massive star formation. The yellow balls — which are several hundred to several thousand times the size of our solar system — are pictured here in the center of this image taken by NASA’s Spitzer Space Telescope. Infrared light has been assigned different colors; yellow occurs where green and red overlap. The yellow balls represent an intermediary stage of massive star formation that takes place before massive stars carve out cavities in the surrounding gas and dust (seen as green-rimmed bubbles with red interiors in this image).

    Infrared light of 3.6 microns is blue; 8-micron light is green; and 24-micron light is red.

    This series of images shows three evolutionary phases of massive star formation, as pictured in infrared images from NASA’s Spitzer Space Telescope. The stars start out in a thick cocoon of dust (left), evolve into hotter features dubbed “yellowballs” (center), and finally blow out cavities in the surrounding dust and gas, resulting in green-rimmed bubbles with red centers (right). The process shown here takes roughly a million years. Even the oldest phase shown here is fairly young, as massive stars live a few million years. Eventually, the stars will migrate away from their birth clouds.

    In this image, infrared light of 3.6 microns is blue; 8-micron light is green; and 24-micron light is red.

    NASA’s Jet Propulsion Laboratory, Pasadena, California, manages the Spitzer Space Telescope mission for NASA’s Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. Spacecraft operations are based at Lockheed Martin Space Systems Company, Littleton, Colorado. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA.

    NASA Spitzer Telescope
    Spitzer

    MilkyWay@home

    Milkyway@Home uses the BOINC platform to harness volunteered computing resources, creating a highly accurate three-dimensional model of the Milky Way galaxy using data gathered by the Sloan Digital Sky Survey (SDSS). This project enables research in both astroinformatics and computer science.

    SDSS Telescope

    BOINC

    In computer science, the project investigates optimization methods that are resilient to the fault-prone, heterogeneous, and asynchronous nature of Internet computing, such as evolutionary and genetic algorithms as well as asynchronous Newton methods. In astroinformatics, Milkyway@Home is generating highly accurate three-dimensional models of the Sagittarius stream, which provides knowledge about how the Milky Way galaxy was formed and how tidal tails are created when galaxies merge.
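    As a flavour of the evolutionary approach mentioned above — a toy example only, not MilkyWay@home's code; the fitness function and parameters are arbitrary — a minimal genetic algorithm looks like this:

    ```python
    # Toy genetic algorithm (illustrative only, not MilkyWay@home's code).
    import random

    def fitness(x):
        return -(x - 3.0) ** 2          # toy objective with its maximum at x = 3

    def evolve(pop_size=20, generations=50, mutation=0.1):
        population = [random.uniform(-10, 10) for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness, reverse=True)
            parents = population[: pop_size // 2]          # selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                child = (a + b) / 2.0                      # crossover
                child += random.gauss(0.0, mutation)       # mutation
                children.append(child)
            population = parents + children
        return max(population, key=fitness)

    print(evolve())   # converges near 3.0
    ```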

    Milkyway@Home is a joint effort between Rensselaer Polytechnic Institute‘s departments of Computer Science and Physics, Applied Physics and Astronomy. Feel free to contact us via our forums, or email astro@cs.lists.rpi.edu.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo
    jpl

     
    • academix2015 4:22 pm on January 27, 2015 Permalink | Reply

      Web based Milky Way project would open up new opportunities for amateur astronomers. Thank you.


    • academix2015 4:22 pm on January 27, 2015 Permalink | Reply

      Reblogged this on Academic Avenue and commented:
      How about studying the intricacies of the astronomical processes and phenomena in the Milky Way?


  • richardmitnick 4:02 pm on January 6, 2015 Permalink | Reply
    Tags: Citizen Science

    From NASA Earth: “Finding Floating Forests” An Amazing Story Complete with Citizen Science 

    NASA Earth Observatory

    December 19, 2014
    By Laura Rocchio; design by Paul Przyborski & Mike Carlowicz

    Giant kelp forests are among Earth’s most productive habitats, and their great diversity of plant and animal species supports many fisheries around the world. The kelp, or Macrocystis, that make up these underwater forests truly are giant. They are the world’s largest marine plants and regularly grow up to 35 meters (115 feet) tall; the largest giant kelp on record stood 65 meters (215 feet) tall. Divers have compared swimming through mature kelp forests to walking through redwood forests.


    Unlike redwoods, giant kelp are ephemeral. They live for seven years at most, and often they disappear before that because of winter storms or over-grazing by other species. As fishermen know, giant kelp forests can appear and disappear from season to season, from year to year. But is there a long-term trend or cycle at work?

    A few years ago, Jarrett Byrnes was in a bit of a quandary over these disappearing forests. As part of his postdoctoral research at the University of California–Santa Barbara (UCSB), he was studying giant kelp at four National Science Foundation-funded sites off the coast. Since 2000, biologists had been using this Long-Term Ecological Research (LTER) site to make monthly in situ measurements of giant kelp. But Byrnes and his colleagues found that they often could not make measurements in winter because rough seas made the diving unsafe.

    Kelp are the redwoods of the sea. The world’s largest marine plants regularly grow up to 35 meters (115 feet) tall. (Photograph © Phillip Colla / Oceanlight.com)

    “Storms remove quite a bit of the canopy in the winter. Sometimes they even remove whole forests if the storms are large enough,” Byrnes explained. “But getting to those sites with regularity in the winter gets very challenging.” Most of the diving had to wait until summer, and by then the kelp had largely recovered or changed, making it difficult to measure how much damage the storms had done.

    To complicate matters, kelp forests have different seasonality depending on where they are. For instance, the forests along the Central California coast are at their maximum size in the fall; in Southern California, they often reach their peak in the winter and spring. How could these dynamic habitats be monitored more frequently without putting divers at risk?

    Kyle Cavanaugh, then a UCSB graduate student, had an idea. “These forests change so rapidly and on a variety of different time scales—months to years to decades—so we needed a long record with consistent, repeated observations,” Cavanaugh said. He devised a method to use Landsat satellite data to monitor kelp forests.

    A few things made Landsat an obvious resource. Since the 1970s, the satellites have had a regular collection schedule (twice monthly). Their data and images are managed by the U.S. Geological Survey and are reliably stored in an archive that dates back more than forty years. And Landsat’s images are calibrated, or standardized, across different generations of satellites, making it possible to compare data collected across several decades.

    Landsat 8 can detect near-infrared wavelengths of light that make it easier to spot offshore kelp forests. (NASA Earth Observatory image by Mike Taylor, using Landsat data from the U.S. Geological Survey)

    Landsat measures the energy reflected and emitted from Earth at many different wavelengths. By knowing how features on Earth reflect or absorb energy at certain wavelengths, scientists can map and measure changes to the surface. The most important feature for the kelp researchers is Landsat’s near-infrared band, which measures wavelengths of light that are just outside our visual range. Healthy vegetation strongly reflects near-infrared energy, so this band is often used in plant studies. Also, water absorbs a lot of near-infrared energy and reflects little, making the band particularly good for mapping boundaries between land and water.

    “The near-infrared is key for identifying kelp from surrounding water,” Cavanaugh explained. “Like other types of photosynthesizing vegetation, giant kelp have high reflectance in the near infrared. This makes the kelp canopy really stand out from the surrounding water.”
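    The idea translates into a very small amount of code. The sketch below is a hedged illustration, not the researchers' classifier: it assumes calibrated near-infrared and red band arrays have already been loaded as NumPy arrays (for Landsat 8, bands 5 and 4, for example) and flags pixels with a high NDVI-style ratio. As the article notes further on, land vegetation passes the same test, which is one reason fully automated detection remains hard.

    ```python
    # Hedged sketch: `nir` and `red` are assumed to be calibrated Landsat band
    # reflectances already loaded as NumPy arrays (e.g. Landsat 8 bands 5 and 4).
    import numpy as np

    def kelp_canopy_mask(nir, red, threshold=0.1):
        """Boolean mask of pixels whose NDVI-style ratio suggests floating canopy."""
        ndvi = (nir - red) / (nir + red + 1e-6)   # small term avoids divide-by-zero
        return ndvi > threshold

    # Toy reflectances: one bright-NIR kelp-like pixel among dark water pixels.
    nir = np.array([[0.02, 0.30], [0.01, 0.02]])
    red = np.array([[0.02, 0.05], [0.01, 0.02]])
    print(kelp_canopy_mask(nir, red))
    ```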

    For Byrnes, the approach was a breakthrough: “This meant we could see the forests I was analyzing right after storms hit them.”

    Growing Fast and Holding Fast

    Giant kelp are fast growers, and they thrive in cold, nutrient-dense waters, particularly where there is a rocky and shallow seafloor (5 to 30 meters or 15 to 100 feet). They attach to the seafloor with small root-like structures (haptera) also called, appropriately enough, a holdfast. The holdfast supports a stipe, or stalk, and leaf-like blades that float thanks to air-filled pockets (pneumatocysts). The fronds create dense floating canopies on the water surface, yet these massive plants rely on holdfasts barely 60 centimeters (24 inches) wide to keep them rooted and alive.

    Given the right balance of conditions, giant kelp can grow as much as 50 centimeters (1.6 feet) per day, and this robust growth makes it possible for kelp fronds to be commercially harvested. Giant kelp have been plucked from California waters since the early 1900s, and they have long appeared in products like ice cream and toothpaste. At the industry’s peak, large ships using lawnmower-like machinery could harvest more than 200,000 wet tons annually.

    Kelp fronds create dense floating canopies near the water surface. Kelp have been harvested for a century for commercial products; they also pose trouble for boat propellers. (Photo courtesy of Chad King / NOAA MBNMS)

    “The satellite could definitely see the effects of harvesting, but the kelp recovery was very fast,” said Tom Bell, a UCSB researcher and collaborator with Byrnes and Cavanaugh.

    Today, only a few thousand tons of giant kelp are harvested each year, some by hand and some by mechanical harvesters. The kelp can be trimmed no lower than 4 feet below the water surface, and this sustainable harvesting is the equivalent of humans getting a haircut. Studies have shown that negative effects are negligible, although some fish populations are temporarily displaced.

    Giant kelp thrive in cold, nutrient-dense waters, particularly where there is a rocky, shallow seafloor. The California coast provides ideal habitat. (NASA Earth Observatory image by Mike Taylor, using Landsat data from the U.S. Geological Survey)

    For years, scientists debated whether it was nutrient availability or grazers (not human harvesters, but sea urchins) that had the most influence over kelp forest health, size, and longevity. After using Landsat to look at long-term trends, and comparing those trends to known differences between Central and Southern California waters, Cavanaugh and LTER lead Daniel Reed found that a third force—wave disturbance—was the kingmaker of kelp dynamics. Strong waves generated by storms uproot the kelp from their holdfasts and can devastate the forests far more than any grazer.

    Kelp Research Branches Out

    When giant kelp first brought Byrnes and Cavanaugh together at UCSB, their work was largely California-focused. The data they collected from the LTER study sites off Santa Barbara became a tremendous resource for kelp researchers. But that work covered four discrete locations for a species found all over the world.

    Giant kelp can grow anywhere there are cold, shallow, nutrient-rich waters and a rocky seafloor. Conditions for kelp growth have historically been ideal along the west coast of North America, as well as Chile, Peru, the Falkland Islands, South Africa, and around Australia, New Zealand, and the sub-Antarctic islands.

    More and more often these days, though, the conditions are less ideal. Climate change has brought a trifecta of kelp scourges: warmer waters with fewer nutrients; new invasive species; and severe storms.

    Given the right balance of conditions, giant kelp can grow as much as 50 centimeters (1.6 feet) per day. (Photograph © Phillip Colla / Oceanlight.com)

    After a recent meeting on kelp forests and climate change, Byrnes, Cavanaugh, and other colleagues set out to consolidate all of the available kelp forest data from around the world. They wanted to take a step toward understanding how climate change is affecting kelp globally, but they quickly discovered they had a sparse patchwork of information.

    Byrnes was struck with a thought. They had used Landsat to expand their studies across time, so why not use Landsat to expand their studies around the world? Could Landsat be used to establish global trends in kelp forest extent? The answer was yes, but the problem was eyeballs.

    Unlike research on terrestrial vegetation—which uses Landsat data and powerful computer processing arrays to make worldwide calculations—distinguishing kelp forests requires manual interpretation. While kelp forests pop out to the human eye in near-infrared imagery, computers looking at the data numerically can confuse kelp patches with land vegetation. Programs and coded logic that separate aquatic vegetation from land vegetation can be confounded by things like clouds, sunglint, and sea foam.

    Natural color (top) and near-infrared (bottom) images from Landsat 8 show the kelp-rich waters around California’s Channel Islands. Clouds, sunglint, and sea foam make it difficult for computer programs to detect the location of forests. So far, human eyes work better. (NASA Earth Observatory image by Mike Taylor and Jesse Allen, using Landsat data from the U.S. Geological Survey)

    “I’ve spent many, many years staring at satellite imagery trying to come up with new ways to extract the kelp signal from that imagery, and it is very time and work intensive,” said Cavanaugh, now based at the University of California–Los Angeles. “But automated classification methods just don’t produce acceptable levels of accuracy yet.”

    Byrnes, now based at the University of Massachusetts–Boston, realized that the best way to study global kelp changes was to turn to citizen scientists. Byrnes and Cavanaugh put together a science team and joined with Zooniverse, a group that connects professional scientists with citizen scientists in order to help analyze large amounts of data. The result was the Floating Forests project.

    Getting Help from a Few Thousand Friends

    The Floating Forest concept is all about getting more eyeballs on Landsat imagery. Citizen scientists—recruited via the Internet—are instructed in how to hunt for giant kelp in satellite imagery. They are then given Landsat images and asked to outline any giant kelp patches that they find. Their findings are crosschecked with those from other citizen scientists and then passed to the science team for verification. The size and location of these forests are catalogued and used to study global kelp trends.

    In addition to examining the California coast, which Byrnes and Cavanaugh know well, the Floating Forests project has also focused on the waters around Tasmania. Tom Bell and collaborators in Australia and New Zealand have noticed dramatic declines in giant kelp forests there over the past few decades. The decline has been so rapid and extensive that giant kelp are only found now in isolated patches.

    Off the east coast of Tasmania, 95 percent of the kelp has disappeared since the 1940s. False-color Landsat images from September 1999 (top) and September 2014 (bottom) provide evidence of recent kelp forest disturbance. (NASA Earth Observatory image by Mike Taylor, using Landsat data from the U.S. Geological Survey)

    Off Tasmania’s east coast, 95 percent of the kelp has disappeared since the 1940s. The loss has been so stark that the Australian government listed Tasmania’s giant kelp forests as an “endangered ecological community“— the first time the country has given protection to an entire ecological community. The loss is so stunning because this was a place where kelp forests were once so dense that they merited mention on nautical charts.

    Cool, subarctic waters once bathed Tasmania’s east coast, but warmer waters (as much as 2.5°C, or 4.5°F, warmer) have brought many invasive species that feast on giant kelp. Compounding the matter, the overfishing of rock lobsters has removed a key predator of the long-spined sea urchins (which eat kelp). The ecosystem’s new protected status could help curb overfishing and restore the lobsters, which would help diminish the threat from sea urchins.

    This U.S. Hydrographic Service chart from 1925 shows Prosser Bay, Tasmania, and the distribution of giant kelp. (Source: Edyvane et al., 2003)

    Using Landsat to monitor the kelp forests and establish trends may shed more light on what is happening off of Tasmania. “We believe the data from Floating Forests will allow us to better understand the causes of these declines,” said Cavanaugh.

    As of November 2014, more than 2,700 citizen scientists had joined Byrnes and Cavanaugh to look for kelp in 260,000 Landsat images. All combined, the citizen scientists have now made more than one million kelp classifications. The response has exceeded expectations, and the project has been expanded faster than originally planned.

    Already a discovery has been made. A citizen scientist found a large patch of giant kelp on the Cortez Bank, an underwater seamount about 160 kilometers (100 miles) off the coast of San Diego. While giant kelp on this submerged island—which comes within feet of the surface at some points—had been documented by divers and fishermen in the past, the full extent of the kelp beds was unknown.

    A citizen scientist found satellite evidence of an outlying kelp forest that was previously known only to divers and local fishermen. (NASA Earth Observatory image by Mike Taylor, using Landsat data from the U.S. Geological Survey)

    “The first few months of Floating Forests have been a huge success, and we are hopeful that we will soon be able to expand the project to other regions,” Cavanaugh said. “Our ultimate goal is to cover all the coastlines of the world that support giant kelp forests.”

    To learn how to participate in the Floating Forests project, visit their web page.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Earth Observatory’s mission is to share with the public the images, stories, and discoveries about climate and the environment that emerge from NASA research, including its satellite missions, in-the-field research, and climate models. The Earth Observatory staff is supported by the Climate and Radiation Laboratory, and the Hydrospheric and Biospheric Sciences Laboratory located at NASA Goddard Space Flight Center.

     
  • richardmitnick 10:28 pm on December 3, 2014 Permalink | Reply
    Tags: Citizen Science

    From isgtw: “Volunteer computing: 10 years of supporting CERN through LHC@home” 


    international science grid this week

    December 3, 2014
    Andrew Purcell

    LHC@home recently celebrated a decade since its launch in 2004. Through its SixTrack project, the LHC@home platform harnesses the power of volunteer computing to model the progress of sub-atomic particles traveling at nearly the speed of light around the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland. It typically simulates about 60 particles whizzing around the collider’s 27km-long ring for ten seconds, or up to one million loops. Results from SixTrack were used to help the engineers and physicists at CERN design stable beam conditions for the LHC, so today the beams stay on track and don’t cause damage by flying off course into the walls of the vacuum tube. It’s now also being used to carry out simulations relevant to the design of the next phase of the LHC, known as the High-Luminosity LHC.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    “The results of SixTrack played an essential role in the design of the LHC, and the high-luminosity upgrades will naturally require additional development work on SixTrack,” explains Frank Schmidt, who works in CERN’s Accelerators and Beam Physics Group of the Beams Department and is the main author of the SixTrack code. “In addition to its use in the design stage, SixTrack is also a key tool for the interpretation of data taken during the first run of the LHC,” adds Massimo Giovannozzi, who also works in CERN’s Accelerators and Beams Physics Group. “We use it to improve our understanding of particle dynamics, which will help us to push the LHC performance even further over the coming years of operation.” He continues: “Managing a project like SixTrack within LHC@home requires resources and competencies that are not easy to find: Igor Zacharov, a senior scientist at the Particle Accelerator Physics Laboratory (LPAP) of the Swiss Federal Institute of Technology in Lausanne (EPFL), provides valuable support for SixTrack by helping with BOINC integration.”

    c
    Volunteer computing is a type of distributed computing through which members of the public donate computing resources (usually processing power) to aid research projects. Image courtesy Eduardo Diez Viñuela, Flickr (CC BY-SA 2.0).

    Before LHC@home was created, SixTrack was run only on desktop computers at CERN, using a platform called the Compact Physics Screen Saver (CPSS). This proved to be a useful tool for a proof of concept, but it was first with the launch of the LHC@home platform in 2004 that things really took off. “I am surprised and delighted by the support from our volunteers,” says Eric McIntosh, who formerly worked in CERN’s IT Department and is now an honorary member of the Beams Department. “We now have over 100,000 users all over the world and many more hosts. Every contribution is welcome, however small, as our strength lies in numbers.”

    Virtualization to the rescue

    Building on the success of SixTrack, the Virtual LHC@home project (formerly known as Test4Theory) was launched in 2011. It enables users to run simulations of high-energy particle physics using their home computers, with the results submitted to a database used as a common resource by both experimental and theoretical scientists working on the LHC.

    Whereas the code for SixTrack was ported for running on Windows, OS X, and Linux, the high-energy-physics code used by each of the LHC experiments is far too large to port in a similar way. It is also being constantly updated. “The experiments at CERN have their own libraries and they all run on Linux, while the majority of people out there have common-or-garden variety Windows machines,” explains Ben Segal, an honorary staff member of CERN’s IT department and chief technology officer of the Citizen Cyberscience Centre. “Virtualization is the way to solve this problem.”

    The birth of the LHC@home platform

    In 2004, Ben Segal and François Grey, who were both members of CERN’s IT department at the time, were asked to plan an outreach event for CERN’s 50th anniversary that would help people around the world to get an impression of the computational challenges facing the LHC. “I had been an early volunteer for SETI@home after it was launched in 1999,” explains Grey. “Volunteer computing was often used as an illustration of what distributed computing means when discussing grid technology. It seemed to me that it ought to be feasible to do something similar for LHC computing and perhaps even combine volunteer computing and grid computing this way.”

    “I contacted David Anderson, the person behind SETI@Home, and it turned out the timing was good, as he was working on an open-source platform called BOINC to enable many projects to use the SETI@home approach,” Grey continues. BOINC (Berkeley Open Infrastructure for Network Computing) is an open-source software platform for computing with volunteered resources. It was first developed at the University of California, Berkeley in the US to manage the SETI@Home project, and uses the unused CPU and GPU cycles on a computer to support scientific research.

    “I vividly remember the day we phoned up David Anderson in Berkeley to see if we could make a SETI-like computing challenge for CERN,” adds Segal. “We needed a CERN application that ran on Windows, as over 90% of BOINC volunteers used that. The SixTrack people had ported their code to Windows and had already built a small CERN-only desktop grid to run it on, as they needed lots of CPU power. So we went with that.”

    A runaway success

    “I was worried that no one would find the LHC as interesting as SETI. Bear in mind that this was well before the whole LHC craziness started with the Angels and Demons movie, and news about possible mini black holes destroying the planet making headlines,” says Grey. “We made a soft launch, without any official announcements, in 2004. To our astonishment, the SETI@home community immediately jumped in, having heard about LHC@home by word of mouth. We had over 1,000 participants in 24 hours, and over 7,000 by the end of the week — our server’s maximum capacity.” He adds: “We’d planned to run the volunteer computing challenge for just three months, at the time of the 50th anniversary. But the accelerator physicists were hooked and insisted the project should go on.”

    Predrag Buncic, who is now coordinator of the offline group within the ALICE experiment, led work to create the CERN Virtual Machine in 2008. He, Artem Harutyunyan (former architect and lead developer of CernVM Co-Pilot), and Segal subsequently adopted this virtualization technology for use within Virtual LHC@home. This has made it significantly easier for the experiments at CERN to create their own volunteer computing applications, since it is no longer necessary for them to port their code. The long-term vision for Virtual LHC@home is to support volunteer-computing applications for each of the large LHC experiments.

    Growth of the platform

    The ATLAS experiment recently launched a project that simulates the creation and decay of supersymmetric bosons and fermions. “ATLAS@Home offers the chance for the wider public to participate in the massive computation required by the ATLAS experiment and to contribute to the greater understanding of our universe,” says David Cameron, a researcher at the University of Oslo in Norway. “ATLAS also gains a significant computing resource at a time when even more resources will be required for the analysis of data from the second run of the LHC.”

    CERN ATLAS New
    ATLAS

    ATLAS@home

    Meanwhile, the LHCb experiment has been running a limited test prototype for over a year now, with an application running Beauty physics simulations set to be launched for the Virtual LHC@home project in the near future. The CMS and ALICE experiments also have plans to launch similar applications.

    CERN LHCb New
    LHCb

    CERN CMS New
    CMS

    CERN ALICE New
    ALICE

    An army of volunteers

    “LHC@home allows CERN to get additional computing resources for simulations that cannot easily be accommodated on regular batch or grid resources,” explains Nils Høimyr, the member of the CERN IT department responsible for running the platform. “Thanks to LHC@home, thousands of CPU years of accelerator beam dynamics simulations for LHC upgrade studies have been done with SixTrack, and billions of events have been simulated with Virtual LHC@home.” He continues: “Furthermore, the LHC@home platform has been an outreach channel, giving publicity to LHC and high-energy physics among the general public.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    iSGTW is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, iSGTW is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read iSGTW via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 5:23 pm on November 28, 2014 Permalink | Reply
    Tags: Citizen Science

    From CERN: “ATLAS@Home looks for CERN volunteers” 

    ATLAS@home

    Mon 01 Dec 2014
    Rosaria Marraffino

    ATLAS@Home is a CERN volunteer computing project that runs simulated ATLAS events. As the project ramps up, the project team is looking for CERN volunteers to test the system before planning a bigger promotion for the public.

    The ATLAS@home outreach website.

    ATLAS@Home is a large-scale research project that runs ATLAS experiment simulation software inside virtual machines hosted by volunteer computers. “People from all over the world offer up their computers’ idle time to run simulation programmes to help physicists extract information from the large amount of data collected by the detector,” explains Claire Adam Bourdarios of the ATLAS@Home project. “The ATLAS@Home project aims to extrapolate the Standard Model at a higher energy and explore what new physics may look like. Everything we’re currently running is preparation for next year’s run.”

    ATLAS@Home became an official BOINC (Berkeley Open Infrastructure for Network Computing) project in May 2014. After a beta test with SUSY events and Z decays, real production started in the summer with inelastic proton-proton interaction events. Since then, the community has grown remarkably and now includes over 10,000 volunteers spread across five continents. “We’re running the full ATLAS simulation and the resulting output files containing the simulated events are integrated with the experiment standard distributed production,” says Bourdarios.

    Compared to other LHC@Home projects, ATLAS@Home is heavier in terms of network traffic and memory requirements. “From the start, we have been successfully challenging the underlying infrastructure of LHC@Home,” says Bourdarios. “Now we’re looking for CERN volunteers to go one step further before doing a bigger public promotion.”

    This simulated event display is created using ATLAS data.

    If you want to join the community and help the ATLAS experiment, you just need to download and run the necessary free software, VirtualBox and BOINC, which are available on NICE. Find out more about the project and how to join on the ATLAS@Home outreach website.

    “This project has huge outreach potential,” adds Bourdarios. “We hope to demonstrate how big discoveries are often unexpected deviations from existing models. This is why we need simulations. We’re also working on an event display, so that people can learn more about the events they have been producing and capture an image of what they have done.”

    If you have any questions about the ATLAS@Home project, e-mail atlas-comp-contact-home@cern.ch.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ATLAS@Home is a research project that uses volunteer computing to run simulations of the ATLAS experiment at CERN. You can participate by downloading and running a free program on your computer.

    ATLAS is a particle physics experiment taking place at the Large Hadron Collider at CERN that searches for new particles and processes using head-on collisions of protons of extraordinarily high energy. Petabytes of data were recorded, processed and analyzed during the first three years of data taking, leading to up to 300 publications covering all the aspects of the Standard Model of particle physics, including the discovery of the Higgs boson in 2012.

    Large-scale simulation campaigns are a key ingredient for physicists, who continually compare their data with both “known” physics and “new” phenomena predicted by alternative models of the universe, particles and interactions. This simulation runs on the WLCG Computing Grid, where around 150,000 tasks are running at any one time. You can help us run even more simulation by using your computer’s idle time to run these same tasks.

    No knowledge of particle physics is required, but for those interested in more details, at the moment we simulate the creation and decay of supersymmetric bosons and fermions, new types of particles that we would love to discover next year, as they would help us to shed light on the dark matter mystery!

    This project runs on BOINC software from UC Berkeley.
    Visit BOINC, download and install the software and attach to the project.


     
  • richardmitnick 3:22 pm on November 18, 2014 Permalink | Reply
    Tags: , , Citizen Science, ,   

    From NOVA: “Why There’s No HIV Cure Yet” 

    [After the NOVA article, I tell you how you and your family, friends, and colleagues can help to find a cure for AIDS and other diseases]

    PBS NOVA

    NOVA

    27 Aug 2014
    Alison Hill

    Over the past two years, the phrase “HIV cure” has flashed repeatedly across newspaper headlines. In March 2013, doctors from Mississippi reported that the disease had vanished in a toddler who was infected at birth. Four months later, researchers in Boston reported a similar finding in two previously HIV-positive men. All three were no longer required to take any drug treatments. The media heralded the breakthrough, and there was anxious optimism among HIV researchers. Millions of dollars of grant funds were earmarked to bring this work to more patients.

    But in December 2013, the optimism evaporated. HIV had returned in both of the Boston men. Then, just this summer, researchers announced the same grim results for the child from Mississippi. The inevitable questions mounted from the baffled public. Will there ever be a cure for this disease? As a scientist researching HIV/AIDS, I can tell you there’s no straightforward answer. HIV is a notoriously tricky virus, one that’s eluded promising treatments before. But perhaps just as problematic is the word “cure” itself.

    Science has its fair share of trigger words. Biologists prickle at the words “vegetable” and “fruit”—culinary terms which are used without a botanical basis—chemists wrinkle their noses at “chemical free,” and physicists dislike calling “centrifugal” a force—it’s not; it only feels like one. If you ask an HIV researcher about a cure for the disease, you’ll almost certainly be chastised. What makes “cure” such a heated word?

    HIV hijacks the body’s immune system by attacking T cells.

    It all started with a promise. In the early 1980s, doctors and public health officials noticed large clusters of previously healthy people whose immune systems were completely failing. The new condition became known as AIDS, for “acquired immunodeficiency syndrome.” A few years later, in 1984, researchers discovered the cause—the human immunodeficiency virus, now known commonly as HIV. On the day this breakthrough was announced, health officials assured the public that a vaccine to protect against the dreaded infection was only two years away. Yet here we are, 30 years later, and there’s still no vaccine. This turned out to be the first of many overzealous predictions about controlling the HIV epidemic or curing infected patients.

    The progression from HIV infection to AIDS and eventual death occurs in over 99% of untreated cases—making it more deadly than Ebola or the plague. Despite being identified only a few decades ago, AIDS has already killed 25 million people and currently infects another 35 million, and the World Health Organization lists it as the sixth leading cause of death worldwide.

    HIV disrupts the body’s natural disease-fighting mechanisms, which makes it particularly deadly and complicates efforts to develop a vaccine against it. Like all viruses, HIV gets inside individual cells in the body and hijacks their machinery to make thousands of copies of itself. HIV replication is especially hard for the body to control because the white blood cells it infects, and eventually kills, are a critical part of the immune system. Additionally, when HIV copies its genes, it does so sloppily. This causes it to quickly mutate into many different strains. As a result, the virus easily outwits the body’s immune defenses, eventually throwing the immune system into disarray. That gives other obscure or otherwise innocuous infections a chance to flourish in the body—a defining feature of AIDS.

    Early Hope

    In 1987, the FDA approved AZT as the first drug to treat HIV. With only two years between when the drug was identified in the lab and when it was available for doctors to prescribe, it was—and remains—the fastest approval process in the history of the FDA. AZT was widely heralded as a breakthrough. But as the movie Dallas Buyers Club poignantly retells, AZT was not the miracle drug many hoped. Early prescriptions often elicited toxic side-effects and offered only a temporary benefit, as the virus quickly mutated to become resistant to the treatment. (Today, the toxicity problems have been significantly reduced, thanks to lower doses.) AZT remains a shining example of scientific bravura and is still an important tool to slow the infection, but it is far from the cure the world had hoped for.

    In three decades, over 25 highly-potent drugs have been developed and FDA-approved to treat HIV.

    Then, in the mid-1990s, some mathematicians began probing the data. Together with HIV scientists, they suggested that by taking three drugs together, we could avoid the problem of drug resistance. The chance that the virus would have enough mutations to allow it to avoid all drugs at once, they calculated, would simply be too low to worry about. When the first clinical trials of these “drug cocktails” began, both mathematical and laboratory researchers watched the levels of virus drop steadily in patients until they were undetectable. They extrapolated this decline downwards and calculated that, after two to three years of treatment, all traces of the virus should be gone from a patient’s body. When that happened, scientists believed, drugs could be withdrawn, and finally, a cure achieved. But when the time came for the first patients to stop their drugs, the virus again seemed to outwit modern medicine. Within a few weeks of the last pill, virus levels in patients’ blood sprang up to pre-treatment levels—and stayed there.
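
    The arithmetic behind that triple-therapy argument is easy to sketch. The numbers below are purely illustrative assumptions chosen for the example (a per-drug resistance probability of about one in 100,000 per new virion, and an untreated patient producing on the order of ten billion virions a day); the point is only how quickly the joint probability collapses when three independent resistance mutations are required at once.

        # Back-of-the-envelope sketch of the drug-cocktail argument.
        # All numbers are illustrative assumptions, not measured values.
        p_single = 1e-5          # assumed chance a new virion resists any one drug
        p_triple = p_single**3   # chance it resists all three drugs simultaneously

        virions_per_day = 1e10   # assumed daily virus production in an untreated patient
        triple_resistant_per_day = p_triple * virions_per_day

        print(f"per-virion chance of triple resistance: {p_triple:.0e}")
        print(f"expected triple-resistant virions per day: {triple_resistant_per_day:.0e}")
        # With these toy numbers, a fully resistant virion appears roughly once every
        # 100,000 days, which is why simultaneous resistance was judged too rare to matter.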

    In the three decades since, over 25 more highly-potent drugs have been developed and FDA-approved to treat HIV. When two to five of them are combined into a drug cocktail, the mixture can shut down the virus’s replication, prevent the onset of AIDS, and return life expectancy to a normal level. However, patients must continue taking these treatments for their entire lives. Though better than the alternative, drug regimens are still inconvenient and expensive, especially for patients living in the developing world.

    Given modern medicine’s success in curing other diseases, what makes HIV different? By definition, an infection is cured if treatment can be stopped without the risk of it resurfacing. When you take a week-long course of antibiotics for strep throat, for example, you can rest assured that the infection is on track to be cleared out of your body. But not with HIV.

    A Bad Memory

    The secret to why HIV is so hard to cure lies in a quirk of the type of cell it infects. Our immune system is designed to store information about infections we have had in the past; this property is called “immunologic memory.” That’s why you’re unlikely to be infected with chickenpox a second time or catch a disease you were vaccinated against. When an infection grows in the body, the white blood cells that are best able to fight it multiply repeatedly, perfecting their infection-fighting properties with each new generation. After the infection is cleared, most of these cells will die off, since they are no longer needed. However, to speed the counter-attack if the same infection returns, some white blood cells will transition to a hibernation state. They don’t do much in this state but can live for an extremely long time, thereby storing the “memory” of past infections. If provoked by a recurrence, these dormant cells will reactivate quickly.

    This near-immortal, sleep-like state allows HIV to persist in white blood cells in a patient’s body for decades. White blood cells infected with HIV will occasionally transition to the dormant state before the virus kills them. In the process, the virus also goes temporarily inactive. By the time drugs are started, a typical infected person contains millions of these cells with this “latent” HIV in them. Drug cocktails can prevent the virus from replicating, but they do nothing to the latent virus. Every day, some of the dormant white blood cells wake up. If drug treatment is halted, the latent virus particles can restart the infection.

    Latent HIV’s near-immortal, sleep-like state allows it to persist in white blood cells in a patient’s body for decades.

    HIV researchers call this huge pool of latent virus the “barrier to a cure.” Everyone’s looking for ways to get rid of it. It’s a daunting task, because although a million HIV-infected cells may seem like a lot, there are around a million times that many dormant white blood cells in the whole body. Finding the ones that contain HIV is a true needle-in-a-haystack problem. All that remains of a latent virus is its DNA, which is extremely tiny compared to the entire human genome inside every cell (about 0.001% of the size).

    Defining a Cure

    Around a decade ago, scientists began to talk amongst themselves about what a hypothetical cure could look like. They settled on two approaches. The first would involve purging the body of latent virus so that if drugs were stopped, there would be nothing left to restart the infection. This was often called a “sterilizing cure.” It would have to be done in a more targeted and less toxic way than previous attempts of the late 1990s, which, because they attempted to “wake up” all of the body’s dormant white blood cells, pushed the immune system into a self-destructive overdrive. The second approach would instead equip the body with the ability to control the virus on its own. In this case, even if treatment was stopped and latent virus reemerged, it would be unable to produce a self-sustaining, high-level infection. This approach was referred to as a “functional cure.”

    The functional cure approach acknowledged that latency alone was not the barrier to a cure for HIV. There are other common viruses that have a long-lived latent state, such as the Epstein-Barr virus that causes infectious mononucleosis (“mono”), but they rarely cause full-blown disease when reactivated. HIV is, of course, different because the immune system in most people is unable to control the infection.

    The first hint that a cure for HIV might be more than a pipe-dream came in 2008 in a fortuitous human experiment later known as the “Berlin patient.” The Berlin patient was an HIV-positive man who had also developed leukemia, a blood cancer to which HIV patients are susceptible. His cancer was advanced, so in a last-ditch effort, doctors completely cleared his bone marrow of all cells, cancerous and healthy. They then transplanted new bone marrow cells from a donor.

    Fortunately for the Berlin patient, doctors were able to find a compatible bone marrow donor who carried a unique HIV-resistance mutation in a gene known as CCR5. They completed the transplant with these cells and waited.

    For the last five years, the Berlin patient has remained off treatment without any sign of infection. Doctors still cannot detect any HIV in his body. While the Berlin patient may be cured, this approach cannot be used for most HIV-infected patients. Bone marrow transplants are extremely risky and expensive, and they would never be conducted in someone who wasn’t terminally ill—especially since current anti-HIV drugs are so good at keeping the infection in check.

    Still, the Berlin patient was an important proof-of-principle case. Most of the latent virus was likely cleared out during the transplant, and even if the virus remained, most strains couldn’t replicate efficiently given the new cells with the CCR5 mutation. The Berlin patient case provides evidence that at least one of the two cure methods (sterilizing or functional), or perhaps a combination of them, is effective.

    Researchers have continued to try to find more practical ways to rid patients of the latent virus in safe and targeted ways. In the past five years, they have identified multiple anti-latency drug candidates in the lab. Many have already begun clinical trials. Each time, people grow optimistic that a cure will be found. But so far, the results have been disappointing. None of the drugs have been able to significantly lower levels of latent virus.

    In the meantime, doctors in Boston have attempted to tease out which of the two cure methods was at work in the Berlin patient. They conducted bone marrow transplants on two HIV-infected men with cancer—but this time, since HIV-resistant donor cells were not available, they just used typical cells. Both patients continued their drug cocktails during and after the transplant in the hopes that the new cells would remain HIV-free. After the transplants, no HIV was detectable, but the real test came when these patients volunteered to stop their drug regimens. When they remained HIV-free a few months later, the results were presented at the International AIDS Society meeting in July 2013. News outlets around the world declared that two more individuals had been cured of HIV.

    Latent virus had likely escaped the detection methods available.

    It quickly became clear that everyone had spoken too soon. Six months later, researchers reported that the virus had suddenly and rapidly returned in both individuals. Latent virus had likely escaped the detection methods available—which are not sensitive enough—and persisted at low, but significant levels. Disappointment was widespread. The findings showed that even very small amounts of latent virus could restart an infection. They also meant that the anti-latency drugs in development would need to be extremely potent to give any hope of a cure.

    But there was one more hope—the “Mississippi baby.” A baby was born to an HIV-infected mother who had not received any routine prenatal testing or treatment. Tests revealed high levels of HIV in the baby’s blood, so doctors immediately started the infant on a drug cocktail, to be continued for life.

    The mother and child soon lost touch with their health care providers. When they were located again a few years later, doctors learned that the mother had stopped giving drugs to the child several months prior. The doctors administered all possible tests to look for signs of the virus, both latent and active, but they didn’t find any evidence. They chose not to re-administer drugs, and a year later, when the virus was still nowhere to be found, they presented the findings to the public. It was once again heralded as a cure.

    Again, it was not to be. Just last month, the child’s doctors announced that the virus had sprung back unexpectedly. It seemed that even starting drugs as soon as infection was detected in the newborn could not prevent the infection from returning over two years later.

    Hope Remains

    Despite our grim track record with the disease, HIV is probably not incurable. Although we don’t have a cure yet, we’ve learned many lessons along the way. Most importantly, we should be extremely careful about using the word “cure,” because for now, we’ll never know if a person is cured until they’re not cured.

    Clearing out latent virus may still be a feasible approach to a cure, but the purge will have to be extremely thorough. We need drugs that can carefully reactivate or remove latent HIV, leaving minimal surviving virus while avoiding the problems that befell earlier tests that reactivated the entire immune system. Scientists have proposed multiple, cutting-edge techniques to engineer “smart” drugs for this purpose, but we don’t yet know how to deliver this type of treatment safely or effectively.

    As a result, most investigations focus on traditional types of drugs. Researchers have developed ways to rapidly scan huge repositories of existing medicines for their ability to target latent HIV. These methods have already identified compounds that were previously used to treat alcoholism, cancer, and epilepsy, and researchers are repurposing them to be tested in HIV-infected patients.

    The less latent virus that remains, the less chance there is that the virus will win the game of chance.

    Mathematicians are also helping HIV researchers evaluate new treatments. My colleagues and I use math to take data collected from just a few individuals and fill in the gaps. One question we’re focusing on is exactly how much latent virus must be removed to cure a patient, or at least to let them stop their drug cocktails for a few years. Each cell harboring latent virus is a potential spark that could restart the infection. But we don’t know when the virus will reactivate. Even once a single latent virus awakens, there are still many barriers it must overcome to restart a full-blown infection. The less latent virus that remains, the less chance there is that the virus will win this game of chance. Math allows us to work out these odds very precisely.

    Our calculations show that “apparent cures”—where patients harbor latent virus at levels low enough to escape detection and go months or years without treatment—are not a medical anomaly. In fact, math tells us that they are an expected result of these chance dynamics. It can also help researchers determine how good an anti-latency drug should be before it’s worth testing in a clinical trial.
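
    As a rough illustration of that chance calculation, suppose each latently infected cell wakes up independently at some small daily rate and each awakening has only a small probability of reigniting a full infection; then successful rebounds arrive at a rate proportional to the size of the reservoir, and the waiting time to rebound is approximately exponential. The parameter values in the sketch below are assumptions chosen only to make the scaling visible, not the fitted values from any study.

        # Toy model of the "game of chance": rebound as an exponential waiting time
        # with rate N * a * p. All parameter values are illustrative assumptions.
        import math

        a = 5e-5          # assumed per-cell reactivation rate per day
        p = 1e-3          # assumed chance one reactivated cell restarts the infection
        N_baseline = 1e6  # assumed number of latently infected cells before therapy

        def median_rebound_days(n_cells):
            # Median of an exponential distribution with rate n_cells * a * p.
            return math.log(2) / (n_cells * a * p)

        for log_reduction in range(7):
            n = N_baseline / 10**log_reduction
            print(f"{log_reduction}-log reduction of the reservoir: "
                  f"median rebound after ~{median_rebound_days(n):,.0f} days")
        # Each extra ten-fold purge of the reservoir stretches the expected drug-free
        # time about ten-fold, so a patient whose rebound happens to land years out
        # can look like an "apparent cure" even though latent virus remains.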

    Many researchers are working to augment the body’s ability to control the infection, providing a functional cure rather than a sterilizing one. Studies are underway to render anyone’s immune cells resistant to HIV, mimicking the CCR5 mutation that gives some people natural resistance. Vaccines that could be given after infection, to boost the immune response or protect the body from the virus’s ill effects, are also in development.

    In the meantime, treating all HIV-infected individuals—which has the added benefit of preventing new transmissions—remains the best way to control the epidemic and reduce mortality. But the promise of “universal treatment” has also not materialized. Currently, even in the U.S., only 25% of HIV-positive people have their viral levels adequately suppressed by treatment. Worldwide, for every two individuals starting treatment, three are newly infected. While there’s no doubt that we’ve made tremendous progress in fighting the virus, we have a long way to go before the word “cure” is not taboo when it comes to HIV/AIDS.

    See the full article here.

    Did you know that you can help in the fight against AIDS? By donating time on your computer to the FightAIDS@Home project of World Community Grid, you can become a part of the solution. The work is called “crunching” because you are crunching computational data, and the results are then fed back into the necessary lab work. We save researchers literally millions of hours of lab time in this process.
    Visit World Community Grid (WCG) or the Berkeley Open Infrastructure for Network Computing (BOINC). Download the BOINC software and install it on your computer. Then visit WCG and attach to the FAAH project. The project will send you computational work units; your computer will process them and send the results back, and the project will then send you more work units. It is that simple. You do nothing, unless you want to get into the nuts and bolts of the BOINC software. If you take up this work, and if you see it as valuable, please tell your family, friends and colleagues, anyone with a computer, even an Android tablet. We found out that my wife’s oncologist’s father in Brazil is a cruncher on two projects from WCG.
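
    For anyone comfortable scripting the attach step, here is a minimal sketch. It assumes the BOINC client is already installed so that its boinccmd command-line tool is available; the project URL and the account key are placeholders you would replace with the values shown on the WCG site and in your own account.

        # Minimal sketch: attach a running BOINC client to World Community Grid from a script.
        # Assumes boinccmd (shipped with the BOINC client) is installed and on the PATH.
        import subprocess

        PROJECT_URL = "https://www.worldcommunitygrid.org/"  # check the WCG site for the current project URL
        ACCOUNT_KEY = "YOUR_ACCOUNT_KEY_HERE"                # placeholder, not a real key

        subprocess.run(
            ["boinccmd", "--project_attach", PROJECT_URL, ACCOUNT_KEY],
            check=True,  # stop with an error if the attach request fails
        )

    Most people will simply use the Add Project dialog in the BOINC Manager instead; the script above is just the same attach step in non-interactive form.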

    This is the project’s website. Take a look.

    While you are visiting BOINC and WCG, look around at all of the very valuable projects being conducted at some of the world’s most distinguished universities and scientific institutions. You can attach to as many as you like, on one or a number of computers. You can only be a help here, participating in Citizen Science.

    This is a look at the present and past projects at WCG:

    Please visit the project pages-

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation

    IBM – Smarter Planet

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 1:47 pm on November 11, 2014 Permalink | Reply
    Tags: , , , Citizen Science, , ,   

    From DDDT at WCG: “Discovering Dengue Drugs – Together” 

    New WCG Logo

    10 Nov 2014
    By: Dr. Stan Watowich, PhD
    University of Texas Medical Branch (UTMB) in Galveston, Texas

    Summary
    For week five of our decade of discovery celebrations we’re looking back at the Discovering Dengue Drugs – Together project, which helped researchers at the University of Texas Medical Branch at Galveston search for drugs to help combat dengue – a debilitating tropical disease that threatens 40% of the world’s population. Thanks to World Community Grid volunteers, researchers have identified a drug lead that has the potential to stop the virus in its tracks.


    Dengue fever, also known as “breakbone fever”, causes excruciating joint and muscle pain, high fever and headaches. Severe dengue, known as “dengue hemorrhagic fever”, has become a leading cause of hospitalization and death among children in many Asian and Latin American countries. According to the World Health Organization (WHO), over 40% of the world’s population is at risk from dengue; another study estimated there were 390 million cases in 2010 alone.

    The disease is a mosquito-borne infection found in tropical and sub-tropical regions – primarily in the developing world. It belongs to the flavivirus family of viruses, together with Hepatitis C, West Nile and Yellow Fever.

    Despite the fact that dengue represents a critical global health concern, it has received limited attention from affluent countries until recently and is widely considered a neglected tropical disease. Currently, no approved vaccines or treatments exist for the disease. We launched Discovering Dengue Drugs – Together on World Community Grid in 2007 to search for drugs to treat dengue infections using a computer-based discovery approach.

    In the first phase of the project, we aimed to identify compounds that could be used to develop dengue drugs. Thanks to the computing power donated by World Community Grid volunteers, my fellow researchers and I at the University of Texas Medical Branch in Galveston, Texas, screened around three million chemical compounds to determine which ones would bind to the dengue virus and disable it.
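
    Conceptually, that screening step amounts to computing a predicted binding score for each compound against the viral target and keeping only the strongest binders for follow-up. The sketch below is a stand-in for that filter, not the docking software actually run on World Community Grid: the scoring function just fabricates plausible-looking numbers and the cutoff is an arbitrary illustrative threshold.

        # Conceptual stand-in for large-scale virtual screening: score every compound
        # and keep the best predicted binders. The "score" here is fabricated; the
        # real project used physics-based docking calculations on volunteers' machines.
        import random

        def fake_docking_score(compound_id):
            # Placeholder score in kcal/mol-like units (more negative = tighter binding).
            rng = random.Random(compound_id)
            return rng.gauss(-6.0, 1.5)

        CUTOFF = -10.5                 # arbitrary illustrative threshold for a "hit"
        LIBRARY_SIZE = 3_000_000       # roughly the number of compounds screened

        hits = [cid for cid in range(LIBRARY_SIZE)
                if fake_docking_score(cid) <= CUTOFF]
        print(f"kept {len(hits):,} of {LIBRARY_SIZE:,} compounds for follow-up testing")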

    By 2009 we had found several thousand promising compounds to take to the next stage of testing. We began identifying the strongest compounds from the thousands of potentials, with the goal of turning these into molecules that could be suitable for human clinical trials.

    We have recently made an exciting discovery using insights from Discovering Dengue Drugs – Together to guide additional calculations on our web portal for advanced computer-based drug discovery, DrugDiscovery@TACC. A molecule has demonstrated success in binding to and disabling a key dengue enzyme that is necessary for the virus to replicate.

    Furthermore, it also shows signs of being able to effectively disable related flaviviruses, such as the West Nile virus. Importantly, our newly discovered drug lead also demonstrates no negative side effects such as adverse toxicity, carcinogenicity or mutagenicity risks, making it a promising antiviral drug candidate for dengue and potentially other flaviviruses. We are working with medicinal chemists to synthesize variants of this exciting candidate molecule with the goal of improving its activity for planned pre-clinical and clinical trials.

    I’d like to express my gratitude for the dedication of World Community Grid volunteers. The advances we are making, and our improved understanding of drug discovery software and its current limitations, would not have been possible without your donated computing power.

    If you’d like to help researchers make more ground-breaking discoveries like this – and have the chance of winning some fantastic prizes – take part in our decade of discovery competition by encouraging your friends to sign up to World Community Grid today. There’s a week left and the field is wide open – get started today!

    See the full article here.

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.

    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the fields of distributed computing, grid computing and citizen cyberscience. BOINC stands for the Berkeley Open Infrastructure for Network Computing.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation

    IBM – Smarter Planet

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     