Tagged: Medicine

  • richardmitnick 11:33 am on March 30, 2017 Permalink | Reply
    Tags: Brian Strom, Eliminating viral hepatitis, Medicine

    From Rutgers: “Study Committee Announces Recommendations for Eliminating Viral Hepatitis” 

    Rutgers University

    March 28, 2017

    A National Academies of Sciences, Engineering, and Medicine committee is headed by Rutgers’ Brian Strom

    Renowned epidemiologist Brian L. Strom, chancellor of Rutgers Biomedical and Health Sciences, headed the committee of scientists studying the possibility of eliminating viral hepatitis. Photo: Nick Romanenko/Rutgers University

    Viral hepatitis could become rare – and approximately 20,000 deaths annually in the U.S. could be eliminated – if federal and state agencies make the disease a priority, according to recommendations of the National Academies of Sciences, Engineering, and Medicine announced today in Washington, D.C.

    Brian L. Strom, a renowned epidemiologist and the chancellor of Rutgers Biomedical and Health Sciences, headed the committee of scientists selected by the Academies, which studied the issue and developed the recommendations.

    Hepatitis, often referred to as the “silent killer,” appears mostly as hepatitis B (HBV), for which a vaccine exists, or hepatitis C (HCV), which can be eliminated with antiviral drugs in more than 90 percent of chronically infected patients.

    The Academies’ recommendations provide a framework for hepatitis elimination. The key, the Academies said, is to support prevention methods – vaccinations for HBV and antiviral drugs to treat HCV, combined with reducing exposure to the virus – with a major effort to identify and educate individuals with the virus.

    “Many people suffering from viral hepatitis are not in contact with the health system, so the elimination strategy must give as much attention to the delivery of services as to the services themselves,” Strom said. “A variety of federal and state agencies should give more explicit attention to bringing hepatitis services to these populations. A system of the same breadth and flexibility as the Ryan White Act, which was passed in response to similar issues in those with HIV, would go far to reaching marginalized viral hepatitis patients.”

    The committee’s recommendations also addressed the high cost of the antiviral drugs, which remain under patent. While these drugs are cost effective compared to other health interventions, their sheer cost has been “access prohibitive,” Strom said.

    “The committee recommends a voluntary transaction between the government and the companies producing direct-acting antivirals, in which companies compete to license a patented drug to the federal government for use in neglected populations,” he explained.

    The committee suggests licensing rights to the expensive drugs to treat vulnerable populations not currently reached through community health providers, such as prisoners and Medicaid beneficiaries. Such an effort could cost approximately $2 billion, with states paying about $140 million, to reach an estimated 700,000 hepatitis patients, the committee said.

    By comparison, currently it would cost approximately $10 billion over the next 12 years to treat only 240,000 patients among the prisoner and Medicaid populations, the Academies said.

    “It is possible to eliminate hepatitis B and C as a public health problem in the United States, averting about 90,000 deaths by 2030,” Strom predicted.

    Citing the importance of the issue, Strom noted that chronic HBV and HCV infections affect 3 to 5 times more Americans and 10 times more people worldwide than HIV. Viral hepatitis kills more people worldwide each year than HIV, road traffic injuries, or diabetes, Strom said.

    Despite being the seventh-leading cause of death in the world, viral hepatitis consumes less than 1 percent of the National Institutes of Health research budget, Strom pointed out. Approximately 1.3 million Americans have HBV and 2.7 million have HCV.

    HBV and HCV account for approximately 80 percent of the world’s liver cancer. Chronic hepatitis B increases the odds of liver cancer 50 to 100 times, and chronic hepatitis C, 15 to 20 times, Strom said, adding that viral hepatitis is a driving factor in the 38 percent increase in liver cancer in the U.S. between 2003 and 2012.

    Strom, a member of the Academies’ Institute of Medicine, has led several major institute projects, including the smallpox vaccination program implementation in 2002-2003 and the committee on dietary salt intake in 2012-2013.

    See the full article here .

    Follow Rutgers Research here .

    Please help promote STEM in your local schools.


    STEM Icon

    Stem Education Coalition

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers smaller
    Please give us back our original beautiful seal which the University stole away from us.

     
  • richardmitnick 8:05 am on March 30, 2017 Permalink | Reply
    Tags: Medicine, Ticks on the march

    From NS: “Lyme disease is set to explode, and you can’t protect yourself” 

    New Scientist

    29 March 2017
    Chelsea Whyte

    A new prediction says 2017 and 2018 will see major Lyme disease outbreaks in new areas. This could lead to lifelong health consequences, so where’s the vaccine?

    Tick tock. Mike Peres/Custom Medical Stock Photo/SPL

    BY THE time he had finished his walk through the woods in New York state, Rick Ostfeld was ready to declare a public health emergency. He could read the warning signs in the acorns that littered the forest floor – seeds of a chain of events that will culminate in an unprecedented outbreak of Lyme disease this year.

    Since that day in 2015, Ostfeld has been publicising the coming outbreak. Thanks to a changing climate it could be one of the worst on record: the ticks that carry the disease have been found in places where it has never before been a problem – and where most people don’t know how to respond. The danger zone isn’t confined to the US: similar signs are flagging potential outbreaks in Europe. Polish researchers predict a major outbreak there in 2018.

    In theory, Ostfeld’s early warning system gives public health officials a two-year window to prepare. In many other cases, this would be enough time to roll out a vaccination programme. But there is no human vaccine for Lyme disease. Why not? And what can you do to protect yourself in the meantime?

    Lyme disease is the most common tick-borne infection in the US: the Centers for Disease Control estimates that 300,000 Americans contract Lyme disease each year, calling it “a major US public health problem”. While it is easy enough to treat if caught early, we are still getting to grips with lifelong health problems that can stem from not catching it in time (see “Do I have Lyme disease?“).

    This was less of a problem while Lyme was confined to a few small areas of the US, but thanks in part to warmer winters, the disease is spreading beyond its usual territory, extending across the US (see map) and into Europe and forested areas of Asia. In Europe in particular, confirmed cases have been steadily rising for 30 years – today, the World Health Organization estimates that 65,000 people get Lyme disease each year in the region. In the UK, 2000 to 3000 cases are diagnosed each year, up tenfold from 2001, estimates the UK’s National Health Service.

    So how could a floor of acorns two years ago tell Ostfeld, a disease ecologist at the Cary Institute of Ecosystem Studies in Millbrook, New York, that 2017 would see an outbreak of Lyme disease? It’s all down to what happens next.

    A bumper crop of the seeds – “like you were walking on ball bearings” – comes along every two to five years in Millbrook. Crucially, these nutrient-packed meals swell the mouse population: “2016 was a real mouse plague of a year,” he says. And mouse plagues bring tick plagues.

    Soon after hatching, young ticks start “questing” – grasping onto grasses or leaves with their hind legs and waving their forelegs, ready to hitch a ride on whatever passes by, usually a mouse.

    Gut reaction

    Once on board, the feast begins. Just one mouse can carry hundreds of immature ticks in their post-larval nymph stage.

    This is where the problems for us start. Mouse blood carries the Lyme-causing bacterium Borrelia burgdorferi, which passes to a tick’s gut as it feeds. The tick itself is unharmed, but each time it latches onto a new host to feed, the bacteria can move from its gut to the blood – including that of any human passers-by.

    “We predict the mice population based on the acorns and we predict infected nymph ticks with the mice numbers. Each step has a one year lag,” Ostfeld says.
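
    For readers who like to see the logic spelled out, here is a minimal sketch of that lagged chain in Python. The numbers and coefficients are invented placeholders; Ostfeld’s actual model, published in PLOS Biology, is fitted to long-term field data.

    ```python
    # A toy version of the acorn -> mouse -> infected-nymph chain, with each step
    # lagged by one year. All numbers and coefficients are invented placeholders,
    # not values from Ostfeld's fitted model.

    acorn_index = {2015: 9.0, 2016: 2.5}   # hypothetical acorn-crop index per autumn

    def mice_next_year(acorns, baseline=1.0, per_acorn=1.5):
        """Toy linear link: a bigger acorn crop means more mice the following summer."""
        return baseline + per_acorn * acorns

    def infected_nymphs_next_year(mice, baseline=0.5, per_mouse=2.0):
        """Toy linear link: more mice mean more infected nymphal ticks a year later."""
        return baseline + per_mouse * mice

    for year, acorns in sorted(acorn_index.items()):
        mice = mice_next_year(acorns)
        nymphs = infected_nymphs_next_year(mice)
        print(f"{year} acorns {acorns:.1f} -> {year + 1} mice {mice:.1f} "
              f"-> {year + 2} infected nymphs {nymphs:.1f}")
    ```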

    Ostfeld published his discovery of this chain of causation in 2006 [PLOS Biology]. Last year, researchers in Poland found the same trend there, with the same implications. “Last year we had a lot of oak acorns, so we might expect 2018 will pose a high risk of Lyme,” says Jakub Szymkowiak at Adam Mickiewicz University in Poznan, Poland.

    Those who live in traditional Lyme disease zones are well versed in tick awareness – wear long trousers in the woods, check yourself thoroughly afterwards, and more. But this advice will be less familiar in places that used to sit outside Lyme zones – like Poland. “That’s sort of the perfect storm,” says Ostfeld. “The public is unaware, so they’re not looking for it and they don’t get treated.”

    It’s not obvious when you have been bitten or infected: ticks are the size of a poppy seed, and not everyone gets the classic “bullseye” rash that is supposed to tip you off. The flu-like symptoms that follow are also easy to misdiagnose. And because antibodies to Lyme disease take a few weeks to develop, early tests can miss it. “That’s when you get late-stage, untreated, supremely problematic Lyme disease,” Ostfeld says.

    The best approach would be to vaccinate people at risk – but there is currently no vaccine. We used to have one, but thanks to anti-vaccination activists, that is no longer the case.

    In the late 1990s, a race was on to make the first Lyme disease vaccine. In December 1998, the US Food and Drug Administration approved the release of Lymerix, developed by SmithKline Beecham, now GSK. But the company voluntarily withdrew the vaccine after only four years.

    This followed a series of lawsuits – including one where recipients claimed Lymerix caused chronic arthritis. Influenced by now-discredited research purporting to show a link between the MMR vaccine and autism, activists raised the question of whether the Lyme disease vaccine could cause arthritis.

    Media coverage and the anti-Lyme-vaccination groups gave a voice to those who believed their pain was due to the vaccine, and public support for the vaccine declined. “The chronic arthritis was not associated with Lyme,” says Stanley Plotkin, an adviser to pharmaceutical company Sanofi Pasteur. “When you’re dealing with adults, all kinds of things happen to them. They get arthritis, they get strokes, heart attacks. So unless you have a control group, you’re in la-la land.”

    But there was a control group – the rest of the US population. And when the FDA reviewed the vaccine’s adverse event reports in a retrospective study, they found only 905 reports for 1.4 million doses. Still, the damage was done, and the vaccine was benched.

    After that, “no one touched it”, says Thomas Lingelbach, CEO at Valneva, a biotech company based in France. Until now: Valneva has a vaccine in early human trials. It will improve on Lymerix, acting against all five strains of the disease instead of just the one most common in the US, and it will be suitable for children.

    Lingelbach knows the battles his firm will face. “It will be hard to convince anti-vax lobbyists,” he says. That fight is still some way off: any public roll-out is at least six years away.

    What makes this wait especially galling for some is that there is a vaccine for your pet. “It’s ironic that you can vaccinate your animal and you can’t vaccinate yourself,” Plotkin says.

    The animal vaccine doesn’t work by exposing Fido to a weakened version of the antigen to trigger antibodies; instead, it works within the tick, neutralising B. burgdorferi by altering the expression of a protein on the bacterium before it enters the bloodstream. A human version would work the same way. “The underlying scientific principle is not very far away from what it is in the veterinary environment,” says Lingelbach.

    Some people have suggested taking the animal vaccine, but Plotkin doesn’t recommend this: it hasn’t been tested in people, so there is insufficient safety data. “You just don’t have classical efficacy data in humans,” he says. It is also illegal in the US and UK for vets to practise medicine on humans.

    While we wait for a human vaccine, you might start keeping track of your local acorn populations – but brush up on your anti-tick measures before you hit the woods.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:44 am on March 29, 2017 Permalink | Reply
    Tags: Medicine

    From MIT: “Progress toward a Zika vaccine” A lot of Zika News Lately 

    MIT News

    March 29, 2017
    Anne Trafton

    MIT researchers have devised a new vaccine candidate for the Zika virus. “It functions almost like a synthetic virus, except it’s not pathogenic and it doesn’t spread,” says postdoc Omar Khan. Image: Jose-Luis Olivares/MIT

    Researchers program RNA nanoparticles that could protect against the virus.

    Using a new strategy that can rapidly generate customized RNA vaccines, MIT researchers have devised a new vaccine candidate for the Zika virus.

    The vaccine consists of strands of genetic material known as messenger RNA, which are packaged into a nanoparticle that delivers the RNA into cells. Once inside cells, the RNA is translated into proteins that provoke an immune response from the host, but the RNA does not integrate itself into the host genome, making it potentially safer than a DNA vaccine or vaccinating with the virus itself.

    “It functions almost like a synthetic virus, except it’s not pathogenic and it doesn’t spread,” says Omar Khan, a postdoc at MIT’s Koch Institute for Integrative Cancer Research and an author of the new study. “We can control how long it’s expressed, and it’s RNA so it will never integrate into the host genome.”

    This research also yielded a new benchmark for evaluating the effectiveness of other Zika vaccine candidates, which could help others who are working toward the same goal.

    Jasdave Chahal, a postdoc at MIT’s Whitehead Institute for Biomedical Research, is the first author of the paper, which appears in Scientific Reports. The paper’s senior author is Hidde Ploegh, a former MIT biology professor and Whitehead Institute member who is now a senior investigator in the Program in Cellular and Molecular Medicine at Boston Children’s Hospital.

    Other authors of the paper are Tao Fang and Andrew Woodham, both former Whitehead Institute postdocs in the Ploegh lab; Jingjing Ling, an MIT graduate student; and Daniel Anderson, an associate professor in MIT’s Department of Chemical Engineering and a member of the Koch Institute and MIT’s Institute for Medical Engineering and Science (IMES).

    Programmable vaccines

    The MIT team first reported its new approach to programmable RNA vaccines last year. RNA vaccines are appealing because they induce host cells to produce many copies of the proteins encoded by the RNA. This provokes a stronger immune reaction than if the proteins were administered on their own. However, finding a safe and effective way to deliver these vaccines has proven challenging.

    The researchers devised an approach in which they package RNA sequences into a nanoparticle made from a branched molecule that is based on fractal-patterned dendrimers. This modified-dendrimer-RNA structure can be induced to fold over itself many times, producing a spherical particle about 150 nanometers in diameter. This is similar in size to a typical virus, allowing the particles to enter cells through the same viral entry mechanisms. In their 2016 paper, the researchers used this nanoparticle approach to generate experimental vaccines for Ebola, H1N1 influenza, and the parasite Toxoplasma gondii.

    In the new study, the researchers tackled Zika virus, which emerged as an epidemic centered in Brazil in 2015 and has since spread around the world, causing serious birth defects in babies born to infected mothers. Since the MIT method does not require working with the virus itself, the researchers believe they might be able to explore potential vaccines more rapidly than scientists pursuing a more traditional approach.

    Instead of using viral proteins or weakened forms of the virus as vaccines, which are the most common strategies, the researchers simply programmed their RNA nanoparticles with the sequences that encode Zika virus proteins. Once injected into the body, these molecules replicate themselves inside cells and instruct cells to produce the viral proteins.

    The entire process of designing, producing, and testing the vaccine in mice took less time than it took for the researchers to obtain permission to work with samples of the Zika virus, which they eventually did get.

    “That’s the beauty of it,” Chahal says. “Once we decided to do it, in two weeks we were ready to vaccinate mice. Access to virus itself was not necessary.”

    Measuring response

    When developing a vaccine, researchers usually aim to generate a response from both arms of the immune system — the adaptive arm, mediated by T cells and antibodies, and the innate arm, which is necessary to amplify the adaptive response. To measure whether an experimental vaccine has generated a strong T cell response, researchers can remove T cells from the body and then measure how they respond to fragments of the viral protein.

    Until now, researchers working on Zika vaccines have had to buy libraries of different protein fragments and then test T cells on them, which is an expensive and time-consuming process. Because the MIT researchers could generate so many T cells from their vaccinated mice, they were able to rapidly screen them against this library. They identified a sequence of eight amino acids that the activated T cells in the mouse respond to. Now that this sequence, also called an epitope, is known, other researchers can use it to test their own experimental Zika vaccines in the appropriate mouse models.
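
    To make the idea of an eight-amino-acid epitope concrete, the sketch below enumerates every eight-residue window in a protein sequence – the pool of candidates such a screen would sort through. The sequence shown is an arbitrary placeholder, not a Zika protein or the epitope reported in the paper.

    ```python
    # Enumerate all candidate 8-mer peptides (possible CD8 T cell epitopes) in a
    # protein sequence. The sequence below is an arbitrary placeholder, not an
    # actual Zika virus protein or the epitope identified in the study.

    def candidate_epitopes(sequence, length=8):
        """Yield (start_position, peptide) for every contiguous window of the given length."""
        for i in range(len(sequence) - length + 1):
            yield i, sequence[i:i + length]

    protein = "ACDEFGHIKLMNPQRSTVWYAGHK"   # placeholder amino-acid sequence
    for position, peptide in candidate_epitopes(protein):
        print(position, peptide)
    ```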

    “We can synthetically make these vaccines that are almost like infecting someone with the actual virus, and then generate an immune response and use the data from that response to help other people predict if their vaccines would work, if they bind to the same epitopes,” Khan says. The researchers hope to eventually move their Zika vaccine into tests in humans.

    “The identification and characterization of CD8 T cell epitopes in mice immunized with a Zika RNA vaccine is a very useful reference for all those working in the field of Zika vaccine development,” says Katja Fink, a principal investigator at the A*STAR Singapore Immunology Network. “RNA vaccines have received much attention in the last few years, and while the big breakthrough in humans has not been achieved yet, the technology holds promise to become a flexible platform that could provide rapid solutions for emerging viruses.”

    Fink, who was not involved in the research, added that the “initial data are promising but the Zika RNA vaccine approach described needs further testing to prove efficacy.”

    Another major area of focus for the researchers is cancer vaccines. Many scientists are working on vaccines that could program a patient’s immune system to attack tumor cells, but in order to do that, they need to know what the vaccine should target. The new MIT strategy could allow scientists to quickly generate personalized RNA vaccines based on the genetic sequence of an individual patient’s tumor cells.

    The research was funded by the National Institutes of Health, a Fujifilm/MediVector grant, the Lustgarten Foundation, a Koch Institute and Dana-Farber/Harvard Cancer Center Bridge Project award, the Department of Defense Office of Congressionally Directed Medical Research’s Joint Warfighter Medical Research Program, and the Cancer Center Support Grant from the National Cancer Institute.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 7:02 am on March 27, 2017 Permalink | Reply
    Tags: Medicine

    From UC Riverside: “Researchers Crack Structure of Key Protein in Zika Virus” 

    UC Riverside

    March 27, 2017
    Iqbal Pittalwala

    The image shows the crystal structure of ZIKV NS5 protein. The regions with different colors represent individual domains or motifs of ZIKV NS5. The black circle marks the location of the potential inhibitor-binding site. Image credit: Song lab, UC Riverside.

    Zika virus (ZIKV), which causes Zika virus disease, is spread to people primarily through the bite of an infected Aedes aegypti or Aedes albopictus mosquito. An infected pregnant woman can pass ZIKV to her fetus during pregnancy or around the time of birth. Sex is yet another way for infected persons to transmit ZIKV to others.

    The genomic replication of the virus is made possible by its “NS5” protein. This function of ZIKV NS5 is unique to the virus, making it an ideal target for anti-viral drug development. Currently, there is no vaccine or medicine to fight ZIKV infection.

    In a research paper just published in Nature Communications, University of California, Riverside scientists report that they have determined the crystal structure of the entire ZIKV NS5 protein and demonstrated that NS5 is functional when purified in vitro. Knowing the structure of ZIKV NS5 helps the researchers understand how ZIKV replicates itself.

    Furthermore, the researchers’ structural analysis of ZIKV NS5 reveals a potential binding site in the protein for an inhibitor, thereby providing a strong basis for developing potential inhibitors against ZIKV NS5 to suppress ZIKV infection. The identification of the inhibitor-binding site of NS5 can now enable scientists to design potentially potent drugs to fight ZIKV.

    “We started this work realizing that the full structure of ZIKV NS5 was missing,” said Jikui Song, an assistant professor of biochemistry, who co-led the research with Rong Hai, an assistant professor of plant pathology and microbiology. “The main challenge for us occurred during the protein’s purification process when ZIKV NS5 got degraded – chopped up – by bacterial enzymes.”

    Song, Hai and their colleagues overcame this challenge by developing an efficient protocol for protein purification, which in essence minimizes the purification time for NS5.

    “Our work provides a framework for future studies of ZIKV NS5 and opportunities for drug development against ZIKV based on its structural similarity to the NS5 protein of other flaviviruses, such as the dengue virus,” Hai said. “No doubt, ZIKV therapeutics can benefit from the wealth of knowledge that has already been generated in the dengue virus field.”

    Next, the researchers plan to test whether a chemical compound shown to effectively inhibit the NS5 protein of the dengue virus has similar antiviral potential against ZIKV NS5.

    Song and Hai were joined in the research by graduate students Boxiao Wang (first author), Xiao-Feng Tan, Stephanie Thurmond, Zhi-Min Zhang, and Asher Lin.

    The research was supported by grants to Song from the March of Dimes Foundation, the Sidney Kimmel Foundation for Cancer Research and the National Institutes of Health.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC Riverside Campus

    The University of California, Riverside is one of 10 universities within the prestigious University of California system, and the only UC located in Inland Southern California.

    Widely recognized as one of the most ethnically diverse research universities in the nation, UCR’s current enrollment is more than 21,000 students, with a goal of 25,000 students by 2020. The campus is in the midst of a tremendous growth spurt with new and remodeled facilities coming on-line on a regular basis.

    We are located approximately 50 miles east of downtown Los Angeles. UCR is also within easy driving distance of dozens of major cultural and recreational sites, as well as desert, mountain and coastal destinations.

     
  • richardmitnick 6:25 am on March 27, 2017 Permalink | Reply
    Tags: "Cancer Biology Reproducibility Project Sees Mixed Results" Read it and Weep, , , Cancer Biology Reproducibility Project Sees Mixed Results, Medicine,   

    From NOVA: “Cancer Biology Reproducibility Project Sees Mixed Results” Read it and Weep 

    NOVA

    18 Jan 2017 [Don’t know how I missed this, or maybe they never put it up in social media before?]
    Courtney Humphries

    How trustworthy are the findings from scientific studies?

    A growing chorus of researchers says there’s a “reproducibility crisis” in science, with too many discoveries published that may be flukes or exaggerations. Now, an ambitious project to test the reproducibility of top studies in cancer research by independent laboratories has published its first five studies in the open-access journal eLife.

    “These are the first public replication studies conducted in biomedical science, and that in itself is a huge achievement,” says Elizabeth Iorns, CEO of Science Exchange and one of the project’s leaders.

    Cancer biology is just one of many fields being scrutinized for the reproducibility of its studies.

    The Reproducibility Project: Cancer Biology is a collaboration between the non-profit Center for Open Science and the for-profit Science Exchange, which runs a network of laboratories for outsourcing biomedical research. It began in 2013 with the goal of repeating experiments from top-cited cancer papers; all of the work has been planned, executed, and published in the open, in consultation with the studies’ original authors. These papers are the first of many underway and slated to be published in the coming months.

    The outcome so far has been mixed, the project leaders say. While some results are similar, none of the studies looks exactly like the original, says Tim Errington, the project’s manager. “They’re all different in some way. They’re all different in different ways.” In some studies, the experimental system didn’t behave the same. In others, the result was slightly different, or it did not hold up under the statistical scrutiny project leaders used to analyze results. All in all, project leaders report, one study failed to reproduce the original finding, two supported key aspects of the original papers, and two were inconclusive because of technical issues.

    Errington says the goal is not to single out any individual study as replicable or not. “Our intent with this project is to perform these direct replications so that we can understand collectively how reproducible our research is,” he says.

    Indeed, there are no agreed-upon criteria for judging whether a replication is successful. At the project’s end, he says, the team will analyze the replication studies collectively by several different standards—including simply asking scientists what they think. “We’re not going to force an agreement—we’re trying to create a discussion,” he says.

    The project has been controversial; some cancer biologists say it’s designed to make them look bad at a time when federal research funding is under threat. Others have praised it for tackling a system that rewards shoddy research. If the first papers are any indication, those arguments won’t be easily settled. So far, the studies provide a window into the challenges of redoing complex laboratory studies. They also underscore that, if cancer biologists want to improve the reproducibility of their research, they will have to agree on a definition of success.

    An Epidemic?

    A recent survey in Nature of more than 1,500 researchers found that 70% have tried and failed to reproduce others’ experiments, and that half have failed to reproduce their own. But you wouldn’t know it by reading published studies. Academic scientists are under pressure to publish new findings, not replicate old research. There’s little funding earmarked toward repeating studies, and journals favor publishing novel discoveries. Science relies on a gradual accumulation of studies that test hypotheses in new ways. If one lab makes a discovery using cell lines, for instance, the same lab or another lab might investigate the phenomenon in mice. In this way, one study extends and builds on what came before.

    For many researchers, that approach—called conceptual replication, which gives supporting evidence for a previous study’s conclusion using another model—is enough. But a growing number of scientists have been advocating for repeating influential studies. Such direct replications, Errington says, “will allow us to understand how reliable each piece of evidence we have is.” Replications could improve the efficiency of future research by winnowing out false hypotheses early and help scientists recreate others’ work in order to build on it.

    In the field of cancer research, some of the pressure to improve reproducibility has come from the pharmaceutical industry, where investing in a spurious hypothesis or therapy can threaten profits. In a 2012 commentary in Nature, cancer scientists Glenn Begley and Lee Ellis wrote that they had tried to reproduce 53 high-profile cancer studies while working at the pharmaceutical company Amgen, and succeeded with just six. A year earlier, scientists at Bayer HealthCare announced that they could replicate only 20–25% of 47 cancer studies. But confidentiality rules prevented both teams from sharing data from those attempts, making it difficult for the larger scientific community to assess their results.

    ‘No Easy Task’

    Enter the Reproducibility Project: Cancer Biology. It was launched with a $1.3 million grant from the Laura and John Arnold Foundation to redo key experiments from 50 landmark cancer papers from 2010 to 2012. The work is carried out in the laboratory network of Science Exchange, a Palo Alto-based startup, and the results tracked and made available through a data-sharing platform developed by the Center for Open Science. Statisticians help design the experiments to yield rigorous results. The protocols of each experiment have been peer-reviewed and published separately as a registered report beforehand, which advocates say prevents scientists from manipulating the experiment or changing their hypothesis midstream.

    The group has made painstaking efforts to redo experiments with the same methods and materials, reaching out to original laboratories for advice, data, and resources. The labs that originally wrote the studies have had to assemble information from years-old research. Studies have been delayed because of legal agreements for transferring materials from one lab to another. Faced with financial and time constraints, the team has scaled back its project; so far 29 studies have been registered, and Errington says the plan is to do as much as they can over the next year and issue a final paper.

    “This is no easy task, and what they’ve done is just wonderful,” says Begley, who is now chief scientific officer at Akriveia Therapeutics and was originally on the advisory board for the project but resigned because of time constraints. His overall impression of the studies is that they largely flunked replication, even though some data from individual experiments matched. He says that for a study to be valuable, the major conclusion should be reproduced, not just one or two components of the study. This would demonstrate that the findings are a good foundation for future work. “It’s adding evidence that there’s a challenge in the scientific community we have to address,” he says.

    Begley has argued that early-stage cancer research in academic labs should follow methods that clinical trials use, like randomizing subjects and blinding investigators as to which ones are getting a treatment or not, using large numbers of test subjects, and testing positive and negative controls. He says that when he read the original papers under consideration for replication, he assumed they would fail because they didn’t follow these methods, even though they are top papers in the field. “This is a systemic problem; it’s not one or two labs that are behaving badly,” he says.

    Details Matter

    For the researchers whose work is being scrutinized, the details of each study matter. Although the project leaders insist they are not designing the project to judge individual findings—that would require devoting more resources to each study—cancer researchers have expressed concern that the project might unfairly cast doubt on their discoveries. The responses of some of those scientists so far raise issues about how replication studies should be carried out and analyzed.

    One study, for instance, replicated a 2010 paper led by Erkki Ruoslahti, a cancer researcher at Sanford Burnham Prebys Medical Discovery Institute in San Diego, which identified a peptide that could stick to and penetrate tumors. Ruoslahti points to a list of subsequent studies by his lab and others that support the finding and suggest that the peptide could help deliver cancer drugs to tumors. But the replication study found that the peptide did not make tumors more permeable to drugs in mice. Ruoslahti says there could be a technical reason for the problem, but the replication team didn’t try to troubleshoot it. He’s now working to finish preclinical studies and secure funding to move the treatment into human trials through a company called Drugcendr. He worries that replication studies that fail without fully exploring why could derail efforts to develop treatments. “This has real implications to what will happen to patients,” he says.

    Atul Butte, a computational biologist at the University of California San Francisco, who led one of the original studies that was reproduced, praises the diligence of the team. “I think what they did is unbelievably disciplined,” he says. But like some other scientists, he’s puzzled by the way the team analyzed results, which can make a finding that subjectively seems correct appear as if it failed. His original study used a data-crunching model to sort through open-access genetic information and identify potential new uses for existing drugs. Their model predicted that the antiulcer medication cimetidine would have an effect against lung cancer, and his team validated the model by testing the drug against lung cancer tumors in mice. The replication found very similar effects. “It’s unbelievable how well it reproduces our study,” Butte says. But the replication team used a statistical technique to analyze the results that found them not statistically significant. Butte says it’s odd that the project went to such trouble to reproduce experiments exactly, only to alter the way the results are interpreted.

    Errington and Iorns acknowledge that such a statistical analysis is not common in biological research, but they say it’s part of the group’s effort to be rigorous. “The way we analyzed the result is correct statistically, and that may be different from what the standards are in the field, but they’re what people should aspire to,” Iorns says.
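
    One simple, and itself debated, criterion is to ask whether the replication’s effect estimate falls inside the original study’s 95 percent confidence interval. The sketch below illustrates that check with invented numbers; it is not the specific meta-analytic procedure the Reproducibility Project used.

    ```python
    # Toy check of one replication criterion: does the replication's effect estimate
    # fall inside the original study's 95% confidence interval? All numbers are
    # invented for illustration.

    def ci95(effect, standard_error):
        """Approximate 95% CI assuming a normal sampling distribution."""
        half_width = 1.96 * standard_error
        return effect - half_width, effect + half_width

    original_effect, original_se = 0.80, 0.15   # hypothetical original result
    replication_effect = 0.35                   # hypothetical replication estimate

    low, high = ci95(original_effect, original_se)
    verdict = "consistent" if low <= replication_effect <= high else "not consistent"
    print(f"original 95% CI ({low:.2f}, {high:.2f}); replication {replication_effect:.2f}: {verdict}")
    ```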

    In some cases, results were complicated by inconsistent experimental systems. One study tested a type of experimental drug called a BET inhibitor against multiple myeloma in mice. The replication found that the drug improved the survival of diseased mice compared to controls, consistent with the original study. But the disease developed differently in the replication study, and statistical analysis of the tumor growth did not yield a significant finding. Constantine Mitsiades, the study’s lead author and a cancer researcher at the Dana-Farber Cancer Institute, says that despite the statistical analysis, the replication study’s data “are highly supportive of and consistent with our original study and with subsequent studies that also confirmed it.”

    A Fundamental Debate

    These papers will undoubtedly provoke debate about what the standards of replication should be. Mitsiades and other scientists say that complex biological systems like tumors are inherently variable, so it’s not surprising if replication studies don’t exactly match their originals. Inflexible study protocols and rigid statistics may not be appropriate for evaluating such systems—or needed.

    Some scientists doubt the need to perform copycat studies at all. “I think science is self-correcting,” Ruoslahti says. “Yes, there’s some loss of time and money, but that’s just part of the process.” He says that, on the positive side, this project might encourage scientists to be more careful, but he also worries that it might discourage them from publishing new discoveries.

    Though the researchers who led these studies are, not surprisingly, focused on the correctness of the findings, Errington says that the variability of experimental models and protocols is important to document. Advocates for replication say that current published research reflects an edited version of what happened in the lab. That’s why the Reproducibility Project has made a point to publish all of its raw data and include experiments that seemed to go awry, when most researchers would troubleshoot them and try again.

    “The reason to repeat experiments is to get a handle on the intrinsic variability that happens from experiment to experiment,” Begley says. With a better understanding of biology’s true messiness, replication advocates say, scientists might have a clearer sense of whether or not to put credence in a single study. And if more scientists published the full data from every experiment, those original results may look less flashy to begin with, leading fewer labs to chase over-hyped hypotheses and therapies that never pan out. An ultimate goal of the project is to identify factors that make it easier to produce replicable research, like publishing detailed protocols and validating that materials used in a study, such as antibodies, are working properly.


    Access mp4 video here .

    Beyond this project, the scientific community is already taking steps to address reproducibility. Many scientific journals are making stricter requirements for studies and publishing registered reports of studies before they’re carried out. The National Institutes of Health has launched training and funding initiatives to promote robust and reproducible research. F1000Research, an open-access, online publisher launched a Preclinical Reproducibility and Robustness Channel in 2016 for researchers to publish results from replication studies. Last week several scientists published a reproducibility manifesto in the journal Nature Human Behaviour that lays out a broad series of steps to improve the reliability of research findings, from the way studies are planned to the way scientists are trained and promoted.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 1:48 pm on March 24, 2017 Permalink | Reply
    Tags: Medicine

    From Help Stop TB at WCG: “Help Stop TB Team Selects Data Analysis Tools” 

    World Community Grid (WCG)

    By: The Help Stop TB Team
    University of Nottingham
    28 Feb 2017

    Summary
    The Help Stop TB team is hard at work analyzing the data they’ve received so far from World Community Grid. They recently chose two new data analysis tools, which will help them better understand the behavior of the bacterium which causes tuberculosis.

    Hello everyone, and thank you for contributing your computer time to Help Stop TB! We would have never completed so many simulations if it wasn’t for you!

    Background

    Help Stop TB was created to examine a particular aspect of the highly resistant and adaptable bacterium that causes tuberculosis. The bacterium has an unusual coat which protects it from many drugs and the patient’s immune system. Among the fats, sugars and proteins in this coat, the TB bacterium contains a type of fatty molecules called mycolic acids. Our project simulates the behavior of these molecules in their many configurations to better understand how they offer protection to the TB bacteria. With the resulting information, scientists may be able to design better ways to attack this protective layer and therefore develop better treatments for this deadly disease.

    Choosing Data Analysis Tools and Methods

    Since our previous mini update in November, Athina has been focusing on analysing our simulation data, and at the same time she is writing up her PhD thesis. As a team, we have now achieved our main goal, which was to come up with a robust and efficient analytical strategy. This will enable us to efficiently process the heaps of data we’re receiving from the simulations conducted by World Community Grid volunteers, and will answer our questions about mycolic acids’ conformational behaviour and its biological implications.

    The analysis protocol that we have decided on combines a variety of different analytical tools and methods. One of the tools we are using is a PCA (principal components analysis) clustering technique developed at the School of Pharmacy at the University of Nottingham. This tool has helped us categorise the shapes that mycolic acids adopt throughout the simulations. In turn, this gives us a clearer idea about which shapes are the most dominant ones.
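
    As a rough illustration of what PCA-based clustering of conformations involves, the sketch below reduces a set of simulation frames to two principal components and groups them with k-means using scikit-learn. The input array is random placeholder data standing in for per-frame structural descriptors; this is a generic example, not the Nottingham tool itself.

    ```python
    # Generic PCA + k-means clustering of simulation frames. Each row of `frames`
    # stands in for one snapshot described by structural features (e.g. internal
    # distances); here it is random placeholder data.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    frames = rng.normal(size=(500, 60))   # 500 snapshots x 60 descriptors (placeholder)

    # Project onto the two principal components that capture most of the variation,
    # then assign each frame to one of a handful of candidate conformational clusters.
    components = PCA(n_components=2).fit_transform(frames)
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(components)

    for cluster in range(4):
        share = 100 * np.mean(labels == cluster)
        print(f"cluster {cluster}: {share:.1f}% of frames")
    ```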

    Figures 1 and 2 below are examples of how we are looking at the shapes of mycolic acids. These structures matter because we are examining all the possible conformations that mycolic acids can assume, in order to understand how these molecules work and how their conformations shape their biological effects and the disease itself, in the hope of finding links that point toward new prevention methods.

    Because it has been shown that mycolic acids tend to demonstrate complex conformational behaviours with frequent folding and unfolding events, it is important to assess the frequency of those events. Understanding the frequency with which mycolic acids change from one folding conformation into another may help underpin important aspects of their biological behaviour.

    Figure 1. Summary of a mycolic acid’s predominant clusters and their respective representative structures. This figure outlines the clusters’ transitions and dependency as well as their relative percentages.

    Additionally, the length of time that the molecules choose to remain in a certain adopted conformational pattern may also elucidate further biological implications. Each molecule assumes different shapes throughout its folding pathway and these shapes can be very dependent on each other. From the PCA clustering tool data, we have extracted important information regarding the dependency (Figure 1) between the different shapes the molecules assume.

    Another analytical approach that we employed was the distance matrix analysis. We created and analysed matrices (Figure 2) of the distances of all the carbon atoms along the mycolic acid chain. This method can provide further insight into the frequency of the folding events and can also help us understand more about the flexibility of each structure.

    Figure 2. Distance matrices of two very different conformations of a mycolic acid. These matrices show the distances of carbon atoms along the mycolic acid chain in these two conformations, and provide a good visual idea of how different the various folding patterns are.
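
    For readers who want to see what such a matrix is numerically, the sketch below computes a carbon–carbon distance matrix from a set of 3D coordinates. The coordinates are random placeholders; the real input would be the carbon positions along a mycolic acid chain taken from each simulation frame.

    ```python
    # Build a pairwise carbon-carbon distance matrix for one conformation.
    # Coordinates are random placeholders for the carbons along a mycolic acid chain.
    import numpy as np

    rng = np.random.default_rng(1)
    carbons = rng.normal(size=(50, 3))   # 50 carbon atoms, x/y/z coordinates

    # Entry [i, j] is the Euclidean distance between carbons i and j.
    diff = carbons[:, None, :] - carbons[None, :, :]
    distance_matrix = np.sqrt((diff ** 2).sum(axis=-1))

    print(distance_matrix.shape)    # (50, 50)
    print(distance_matrix[0, :5])   # distances from carbon 0 to the first five carbons
    ```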

    We have also tested the suitability of a dihedral angle clustering tool which was developed at the Centre for Molecular Design (CMD) at the University of Portsmouth. This tool was computationally less intensive than the distance matrix analysis, but unfortunately it could not address the frequent refolding events that the mycolic acids demonstrate, thus making it challenging to extract meaningful data. However, the test cases that we analysed with this technique confirmed the predominant clusters that we had found with our PCA tools. We will now use the best choice of analysis options to build a picture for all the different mycolic acids, and will subsequently link the individual behaviour with experimental data on mycolic acid population in bacterial cell walls and their individual roles therein.

    That was all our news for now! Thank you again for your contributions, and let’s all wish good luck to Athina with her writing! Until the next time, happy crunching!

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.
    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.


    CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!


    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    FightAIDS@home Phase II

    OpenZika

    Help Stop TB

    Outsmart Ebola Together

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation

    IBM – Smarter Planet

     
  • richardmitnick 12:50 pm on March 24, 2017 Permalink | Reply
    Tags: Medicine, Producing Radioisotopes for Medical Imaging and Disease Treatment

    From BNL: “Producing Radioisotopes for Medical Imaging and Disease Treatment” 

    Brookhaven Lab

    March 21, 2017
    Karen McNulty Walsh

    Brookhaven’s high-energy proton accelerator and a group led by Cathy Cutler team up to meet the nation’s demand for medical isotopes.

    Cathy Cutler, Lisa Muench, Tatjana Klaric, Weimin Zhou, Vicky Litton, and Anna Goldberg in the hot cell area where BLIP targets are processed to extract desired isotope products.

    The before and after images are stunning: A prostate cancer patient riddled with metastatic tumors that disappear after just three, potent treatments.

    “Two patients underwent these treatments and they were cured,” said Cathy Cutler, director of the Medical Isotope Research and Production Program at the U.S. Department of Energy’s Brookhaven National Laboratory. “Their cancer was gone.

    “This is what we want to do—supply this material so that more patients can get this treatment,” she said.

    Medical applications of isotopes produced at BLIP. Top: BLIP produces Strontium-82, a relatively stable isotope that can be transported and used in hospitals to generate Rubidium-82, a radiotracer that reveals reduced blood flow in heart muscle under stress. This precision scanning points physicians to coronary arteries that need treatment. Credit: Washington University School of Medicine. Bottom: Before and after images show how a molecule labeled with Actinium-225 delivers cell-killing alpha particles directly to tumors, eradicating metastatic prostate cancer. The BLIP team aims to increase the production of Ac-225 so scientists can conduct large-scale trials and get this potentially lifesaving treatment to more patients. Credit: ©SNMMI: C. Kratochwil. J. Nucl. Med., 2016; 57 (12); 1941.

    The material is a molecule tagged with Actinium-225, a radioactive isotope. When designed to specifically bind with a protein on the surface of cancer cells, the radiolabeled molecule delivers a lethal, localized punch—alpha particles that kill the cancer with minimal damage to surrounding tissues.

    Actinium-225 can only be produced in the large quantities needed to support clinical applications at facilities that have high-energy particle accelerators.

    “This is why I came to Brookhaven,” Cutler said in a recent talk she gave to highlight her group’s work.* “We can make these alpha emitters and this is really giving doctors a chance to treat these patients!”

    Radiochemistry redux

    Brookhaven Lab and the Department of Energy Isotope Program have a long history of developing radioisotopes for uses in medicine and other applications. These radioactive forms of chemical elements can be used alone or attached to a variety of molecules to track and target disease.

    “If it wasn’t for the U.S. Department of Energy and its isotope development program, I’m not sure we’d have nuclear medicine,” Cutler said.

    Among the notable Brookhaven Lab successes are the development in the 1950s and 60s, respectively, of the Technetium-99m generator and a radioactively labeled form of glucose known as 18FDG—two radiotracers that went on to revolutionize medical imaging.

    As an example, 18FDG emits positrons (positively charged cousins of electrons) that can be picked up by a positron emission tomography (PET) scanner. Because rapidly growing cancer cells take up glucose faster than healthy tissue, doctors can use PET and 18FDG to detect and monitor the disease.

    “FDG turned around oncology,” Cutler said. Instead of taking a drug for months and suffering toxic side effects before knowing if a treatment is working, “patients can be scanned to look at the impact of treatment on tumors within 24 hours, and again over time, to see if the drug is effective—and also if it stops working.”

    Symbiotic operations

    While Tc-99m and 18FDG are now widely available in hospital settings and used in millions of scans a year, other isotopes are harder to make. They require the kind of high-energy particle accelerator you can find only at world-class physics labs.

    “Brookhaven is one of just a few facilities in the DOE Isotope Program that can produce certain critical medical isotopes,” Cutler said.

    Brookhaven’s linear accelerator (“linac”) was designed to feed beams of energetic protons into physics experiments at the Relativistic Heavy Ion Collider (RHIC), where physicists are exploring the properties of the fundamental building blocks of matter and the forces through which they interact.

    Brookhaven’s linear accelerator (“linac”)

    The Solenoidal Tracker at the Relativistic Heavy Ion Collider (RHIC) is a detector which specializes in tracking the thousands of particles produced by each ion collision at RHIC. Weighing 1,200 tons and as large as a house, STAR is a massive detector. It is used to search for signatures of the form of matter that RHIC was designed to create: the quark-gluon plasma. It is also used to investigate the behavior of matter at high energy densities by making measurements over a large area. | Photo courtesy of Brookhaven National Lab.

    But because the linac produces the protons in pulses, Cutler explained, it can deliver them pulse-by-pulse to different facilities. Operators in Brookhaven’s Collider-Accelerator Department deliver alternating pulses to RHIC and the Brookhaven Linac Isotope Producer (BLIP).

    “We operate these two programs symbiotically at the same time,” Cutler said. “We combine our resources to support the running of the linear accelerator; it’s cheaper for both programs to share this resource than it would cost if each of us had to use it alone.”


    Access mp4 video here .

    Tuning and targets

    BLIP operators aim the precisely controlled beams of energetic protons at small puck-shaped targets. The protons knock subatomic particles from the targets’ atoms, transforming them into the desired radioactive elements.

    “We stack different targets sequentially to make use of the beam’s reduced energy as it exits one target and enters the next in line, so we can produce multiple radionuclides at once,” Cutler said.

    Transformed targets undergo further chemical processing to yield a pure product that can be injected into patients, or a precursor chemical that can easily be transformed into the desired isotope or tracer on site at a hospital.

    “A lot of our work goes into producing these targets,” Cutler said. “You would be shocked at all the chemistry, engineering, and physics that goes into designing one of these pucks—to make sure it survives the energy and high current of the beam, gives you the isotope you are interested in with minimal impurities, and allows you to do the chemistry to extract that isotope efficiently.”

    Cutler recently oversaw installation of a new “beam raster” system designed to maximize the use of target materials and increase radioisotope production. With this upgrade, a series of magnets steers BLIP’s energetic particle beam to “paint” the target, rather than depositing all the energy in one spot. This cuts down on the buildup of target-damaging heat, allowing operators to increase beam current and transform more target material into the desired product.

    Meeting increasing demand

    The new raster system and ramped up current helped increase production of one of BLIP’s main products, Strontium-82, by more than 50 percent in 2016. Sr-82 has a relatively long half-life, allowing it to be transported to hospitals in a form that can generate a short-lived radiotracer, Rubidium-82, which has greatly improved the precision of cardiac imaging.

    Weimin Zhou, Anna Goldberg, and Lisa Muench in the isotope-processing area.

    “Rb-82 mimics potassium, which is taken up by muscles, including the heart,” Cutler explained. “You can inject Rubidium into a patient in a PET scanner and measure the uptake of Rb-82 in heart muscle to precisely pinpoint areas of decreased blood flow when the heart is under stress. Then surgeons can go in and unblock that coronary artery to increase blood flow before the patient has a heart attack. Hundreds of thousands of patients receive this life-saving test because of what we’re doing here at Brookhaven.”
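    The Sr-82/Rb-82 generator scheme works because of the large gap between the two half-lives. The sketch below uses approximate textbook values (about 25 days for Sr-82 and roughly 75 seconds for Rb-82, neither quoted in the article) to show why the generator survives shipping while the Rb-82 tracer must be produced right at the scanner.

```python
# Approximate half-lives (assumed values, not from the article):
# Sr-82 ~25.3 days, Rb-82 ~75 seconds.
SR82_HALF_LIFE_DAYS = 25.3
RB82_HALF_LIFE_SECONDS = 75.0

def fraction_remaining(elapsed, half_life):
    """Fraction of a radioactive sample remaining after `elapsed` time,
    where `elapsed` and `half_life` are in the same units."""
    return 0.5 ** (elapsed / half_life)

# A Sr-82/Rb-82 generator shipped to a hospital still holds most of its strontium
# after a few days in transit...
print(f"Sr-82 left after 3 days in transit: {fraction_remaining(3, SR82_HALF_LIFE_DAYS):.0%}")

# ...while the Rb-82 it produces decays within minutes of being eluted,
# which is why it is generated on site, right before the PET scan.
print(f"Rb-82 left 10 minutes after elution: {fraction_remaining(10 * 60, RB82_HALF_LIFE_SECONDS):.1%}")
```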

    BLIP also produces several isotopes with improved capabilities for detecting cancer, including metastatic tumors, and monitoring response to treatment.

    But rising to meet the demand for isotopes that have the potential to cure cancer may be BLIP’s highest calling—and has been a key driver of Cutler’s career.

    Jason Nalepa, a BLIP operator, prepares targets to be installed in the BLIP beamline for irradiation.

    “This is where I started as a chemist at the University of Missouri—designing molecules that have the right charges, the right size, and the right characteristics that determine where they go in the body so we can use them for imaging and therapy,” she said. “If we can target receptors that are overexpressed on tumor cells, we can selectively image these cells. And if there are enough of these receptors expressed, we can deliver radionuclides to those tumor cells very selectively and destroy them.”

    Radionuclides that emit alpha particles are among the most promising isotopes because alpha particles deliver a lot of energy and traverse very small distances. Targeted delivery of alphas would deposit very high doses—“like driving an 80-ton semi truck into a tumor”—while minimizing damage to surrounding healthy cells, Cutler said.

    “Our problem isn’t that we can’t cure cancer; we can ablate the cancer. Our problem is saving the patient. The toxicity of the treatments in many cases is so significant that we can’t get the levels in to kill the cancer without actually harming the patient. With alpha particles, because of the short distance and high impact, they are enabling us to treat these patients with minimal side effects and giving doctors the opportunity to really cure cancer.”

    Making the case for a cure

    One experimental treatment Cutler developed using Lutetium-177 while still at the University of Missouri was effective against neuroendocrine tumors, but did not achieve a cure. Actinium-225, one of the isotopes that is trickier to make, has shown more promise—as demonstrated by the prostate cancer results published in 2016 by researchers at University Hospital Heidelberg.

    Right now, according to Cutler, DOE’s Oak Ridge National Laboratory (ORNL) makes enough Ac-225 to treat about 50 patients each year. But almost 30 times that much (roughly enough for 1,500 patients a year) is needed to conduct the clinical trials required to prove that such a strategy works before it can move from the laboratory to medical practice.

    “With the accelerator we have here at Brookhaven, the expertise in radiochemistry, and experience producing isotopes for medical applications, we—together with partners at ORNL and DOE’s Los Alamos National Laboratory—are looking to meet this unmet need to get this material out to patients,” Cutler said.

    The work at BLIP is funded by the DOE Isotope Program, managed by the Office of Science’s Nuclear Physics program. RHIC is a DOE Office of Science User Facility.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 1:27 pm on March 21, 2017 Permalink | Reply
    Tags: Alex Perryman, Medicine, , ,   

    From OpenZika at WCG: “OpenZika Researchers Continue Calculations and Prepare for Next Stage” 

    New WCG Logo


    World Community Grid (WCG)


    By: The OpenZika research team
    21 Mar 2017

    Summary
    The OpenZika researchers are continuing to screen millions of chemical compounds as they look for potential treatments for the Zika virus. In this update, they report on the status of their calculations and their continuing work to spread the word about the project.

    Project Background

    While the Zika virus may not be getting the continuous press coverage that it received in 2015 and 2016, it is still a threat to the health of people across the globe. New infections continue to be reported in both South America and North America, and medical workers are just beginning to assess the effects of the virus on young children whose mothers were infected while pregnant.

    The search for effective treatments is crucial to stemming the tide of the virus. In addition to the OpenZika project, several other labs are doing cell-based screens with drugs already approved by the US Food and Drug Administration (FDA), but few to none of the “hit” compounds identified thus far are both potent enough against the Zika virus and safe for pregnant women.

    Also, there are a number of efforts underway to develop a vaccine against the Zika virus. However, vaccines do not help people who already have the infection. It will be several years before they are proven effective and safe, and before enough doses can be mass produced and distributed. And even after approved vaccines are available and distributed to the public, not all people will be vaccinated. Consequently, in the meantime and in the future, cures for Zika infections are needed.


    ZIKV NS3 helicase bound to RNA with the predicted binding modes of five approved drugs (from our second set of candidates) selected by virtual screening. These candidates are shown as surfaces with different shades of green. The identification of these candidates and the video were made by Dr. Alexander L. Perryman at RWJ Rutgers University.

    Alex Perryman

    We began the analysis phase of the project by focusing on the results against the apo NS3 helicase crystal structure (apo means that the protein was not bound to anything else, such as a cofactor, inhibitor, or nucleic acid) to select our first set of candidates, which are currently being assayed by our collaborator at University of California San Diego, Dr. Jair L. Siqueira-Neto, using cell-based assays. The NS3 helicase is a component of the Zika virus that is required for it to replicate itself.

    In the second set of screening results that we recently examined, we used the new crystal structure of NS3 helicase bound to RNA as the target (see the images/animation above). As with the first set of candidates, we docked approximately 7,600 compounds against the new RNA-bound structure of the helicase, drawn from a composite library of US Food and Drug Administration-approved drugs, drugs approved in the European Union, and the US National Institutes of Health clinical collection. Below are the results of this second screening:

    232 compounds passed the larger collection of different energetic and interaction-based docking filters, and their predicted binding modes were inspected and measured in detail.
    Of the compounds that were inspected in detail, 19 unique compounds passed this visual inspection stage of their docked modes.
    From the compounds that passed the visual inspection, 9 passed subsequent medicinal chemistry-based inspection and will be ordered soon.

    Status of the calculations

    In total, we have submitted 2.56 billion docking jobs, which involved the virtual screening of 6 million compounds versus 427 different target sites. We have already received approximately 1.9 billion of these results on our server. (There is some lag time between when the calculations are performed on your volunteered machines and when we get the results, since all of the results per “package” of approximately 10,000 different docking jobs need to be returned to World Community Grid, re-organized, and then compressed before sending them to our server.)
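    The headline figures are consistent with straightforward multiplication, and the sketch below simply reproduces that bookkeeping; the ~10,000-jobs-per-package value is the approximate figure given above.

```python
compounds = 6_000_000          # compounds in the virtual screening library
target_sites = 427             # different target sites screened against
jobs_submitted = compounds * target_sites
print(f"Docking jobs: {jobs_submitted:,}")          # 2,562,000,000 ~ 2.56 billion

jobs_received = 1_900_000_000  # approximate results already back on the team's server
print(f"Fraction received so far: {jobs_received / jobs_submitted:.0%}")     # ~74%

jobs_per_package = 10_000      # approximate package size returned via World Community Grid
print(f"Result packages involved: {jobs_submitted // jobs_per_package:,}")   # ~256,200
```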

    Except for a few stragglers, we have received all of the results for our experiments that involve docking 6 million compounds versus the proteins NS1, NS3 helicase (both the RNA binding site and the ATP site), and NS5 (both the RNA polymerase and the methyltransferase domains). We are currently receiving the results from our most recent experiments against the NS2B / NS3 protease.

    A new stage of the project

    We just finished preparing and testing the docking input files that will be used for the second stage of this project. Instead of docking 6 million compounds, we will soon be able to start screening 30.2 million compounds against these targets. This new, massive library was originally obtained in a different type of format from the ZINC15 server. It represents almost all of “commercially available chemical space” (that is, almost all of the “small molecule” drug-like and hit-like compounds that can be purchased from reputable chemical vendors).

    The ZINC15 server provided these files as “multi-molecule mol2” files (that is, many different compounds were contained in each “mol2” formatted file). These files had to be re-formatted (we used the Raccoon program from Dr. Stefano Forli, who is part of the FightAIDS@Home team) by splitting them into individual mol2 files (1 compound per file) and then converting them into the “pdbqt” docking input format.
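    The project used Dr. Forli’s Raccoon program for this step. Purely as an illustration of what the splitting stage involves (not the team’s actual pipeline), here is a minimal sketch that breaks a multi-molecule mol2 file into one file per compound; the file names and directory layout are hypothetical, and the subsequent mol2-to-pdbqt conversion is not shown.

```python
from pathlib import Path

def split_multi_mol2(multi_mol2_path, output_dir):
    """Split a multi-molecule Tripos mol2 file into one file per compound.

    Each molecule record in a mol2 file begins with an '@<TRIPOS>MOLECULE' line,
    so a new output buffer is started whenever that tag is seen.
    """
    output_dir = Path(output_dir)
    output_dir.mkdir(parents=True, exist_ok=True)

    buffer = []
    count = 0

    def flush():
        nonlocal count
        if not buffer:
            return
        # The compound name follows the '@<TRIPOS>MOLECULE' tag line.
        name = buffer[1].strip() if len(buffer) > 1 else "compound"
        safe = "".join(c if c.isalnum() or c in "-_" else "_" for c in name) or "compound"
        (output_dir / f"{count:07d}_{safe}.mol2").write_text("".join(buffer))
        count += 1
        buffer.clear()

    with open(multi_mol2_path) as handle:
        for line in handle:
            if line.startswith("@<TRIPOS>MOLECULE"):
                flush()
            buffer.append(line)
    flush()
    return count

if __name__ == "__main__":
    # Hypothetical input/output paths, for illustration only.
    n = split_multi_mol2("zinc15_subset.mol2", "split_mol2")
    print(f"Wrote {n} single-molecule mol2 files")
```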

    We then ran a quick quality control test to make sure that the software used for the project, called AutoDock Vina, could properly use each pdbqt file as an input. Many compounds had to be rejected, because they had types of atoms that cause Vina to crash (such as silicon or boron), and we obviously don’t want to waste the computer time that you donate by submitting calculations that will crash.
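    A rough sketch of that kind of quality-control pass is shown below (not the team’s actual script). It scans pdbqt files and flags any whose AutoDock atom types fall outside a whitelist; the whitelist here is an assumption based on commonly supported AutoDock/Vina atom types, with silicon and boron deliberately absent.

```python
from pathlib import Path

# Assumed whitelist of AutoDock atom types that Vina handles without trouble.
# Silicon (Si) and boron (B) are intentionally absent, since compounds containing
# them were rejected in the project's quality-control step.
SUPPORTED_TYPES = {
    "H", "HD", "C", "A", "N", "NA", "OA", "F", "P",
    "S", "SA", "Cl", "Br", "I", "Mg", "Zn", "Fe", "Mn", "Ca",
}

def unsupported_atom_types(pdbqt_path):
    """Return the set of atom types in a pdbqt file that are not on the whitelist.

    In pdbqt files, the AutoDock atom type is the final column of each
    ATOM/HETATM record.
    """
    bad = set()
    for line in Path(pdbqt_path).read_text().splitlines():
        if line.startswith(("ATOM", "HETATM")):
            atom_type = line.split()[-1]
            if atom_type not in SUPPORTED_TYPES:
                bad.add(atom_type)
    return bad

if __name__ == "__main__":
    # Hypothetical directory of converted ligands, for illustration only.
    for ligand in sorted(Path("split_pdbqt").glob("*.pdbqt")):
        bad = unsupported_atom_types(ligand)
        if bad:
            print(f"Rejecting {ligand.name}: unsupported atom types {sorted(bad)}")
```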

    After approximately six months of splitting, reformatting, and testing hundreds of thousands of compounds per day, this massive new library of compounds is ready to be used in our OpenZika calculations. Without the tremendous resources that World Community Grid volunteers provide for this project, we would not even dream of trying to dock over 30 million compounds against many different targets from the Zika virus. Thank you all very much!!!

    For more information about these experiments, please visit our website.

    Publications and Collaborations

    Our PLoS Neglected Tropical Diseases paper, OpenZika: An IBM World Community Grid Project to Accelerate Zika Virus Drug Discovery, was published on October 20, and it has already been viewed over 4,000 times. Anyone can access and read this paper for free. Another research paper, Illustrating and homology modeling the proteins of the Zika virus, has been accepted by F1000Research and viewed more than 3,800 times.

    A group from Brazil, coordinated by Prof. Glaucius Oliva, has contacted us because of our PLoS Neglected Tropical Diseases paper to discuss a new collaboration to test the selected candidate compounds directly in enzymatic assays with the NS5 protein of the Zika virus. They have solved two high-resolution crystal structures of ZIKV NS5, which were recently released in the Protein Data Bank (PDB IDs: 5TIT and 5U04).

    Our paper entitled “Molecular Dynamics simulations of Zika Virus NS3 helicase: Insights into RNA binding site activity” was just accepted for publication in a special issue on Flaviviruses for the journal Biochemical and Biophysical Research Communications. This study of the NS3 helicase system helped us learn more about this promising target for blocking Zika replication. The results will help guide how we analyze the virtual screens that we already performed against NS3 helicase, and the molecular dynamics simulations generated new conformations of this protein that we will use as input targets in new virtual screens that we perform as part of OpenZika.

    These articles are helping to bring additional attention to the project and to encourage the formation of new collaborations.

    Additional News

    We have applied and been accepted to present “OpenZika: Opening the Discovery of New Antiviral candidates against Zika Virus and Insights into Dynamic behavior of NS3 Helicase” at the 46th World Chemistry Congress. The conference will be held in Sao Paulo, Brazil, on July 7-14.

    Dr. Sean Ekins has hired a postdoc and a master’s-level scientist who will get involved with the OpenZika project. We have also started to collate inhibitors reported in the Zika literature.

    Also, Drs. Sean Ekins and Carolina Andrade have offered to buy some of the candidate compounds that we identified in the virtual screens from OpenZika, so that they can be assayed in the next round of tests.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.
    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the fields of Distributed Computing, Grid Computing, and Citizen Cyberscience. BOINC stands for the Berkeley Open Infrastructure for Network Computing.


    CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!


    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    FightAIDS@home Phase II
    OpenZika
    Help Stop TB
    Outsmart Ebola Together
    Mapping Cancer Markers
    Uncovering Genome Mysteries
    Say No to Schistosoma
    GO Fight Against Malaria
    Drug Search for Leishmaniasis
    Computing for Clean Water
    The Clean Energy Project
    Discovering Dengue Drugs – Together
    Help Cure Muscular Dystrophy
    Help Fight Childhood Cancer
    Help Conquer Cancer
    Human Proteome Folding
    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation.
    IBM – Smarter Planet

     
  • richardmitnick 10:18 am on March 21, 2017 Permalink | Reply
    Tags: , Cell phone technology, DNA biomarkers, Intercalator dyes, Medicine, , UCLA researchers make DNA detection portable affordable using cellphones   

    From UCLA: “UCLA researchers make DNA detection portable, affordable using cellphones” 

    UCLA bloc

    UCLA

    March 20, 2017
    Matthew Chin

    System achieved comparable results to equipment costing tens of thousands of dollars more.

    The combined dye/cellphone reader system achieved comparable results to equipment costing tens of thousands of dollars more. Dino Di Carlo/UCLA

    Researchers at UCLA have developed an improved method to detect the presence of DNA biomarkers of disease that is compatible with use outside of a hospital or lab setting. The new technique leverages the sensors and optics of cellphones to read light produced by a new detector dye mixture that reports the presence of DNA molecules with a signal that is more than 10 times brighter.

    Nucleic acids, such as DNA or RNA, are used in tests for infectious diseases, genetic disorders, cancer mutations that can be targeted by specific drugs, and fetal abnormalities. The samples used in standard diagnostic tests typically contain only tiny amounts of a disease’s related nucleic acids. To assist optical detection, clinicians amplify the number of nucleic acid molecules, making them easier to find with fluorescent dyes.

    Both the amplification and the optical detection steps have in the past required costly and bulky equipment, largely limiting their use to laboratories.

    In a study published online in the journal ACS Nano, researchers from three UCLA entities — the Henry Samueli School of Engineering and Applied Science, the California NanoSystems Institute, and the David Geffen School of Medicine — showed how to take detection out of the lab at a fraction of the cost.

    The collaborative team of researchers included lead author Janay Kong, a UCLA Ph.D. student in bioengineering; Qingshan Wei, a post-doctoral researcher in electrical engineering; Aydogan Ozcan, Chancellor’s Professor of Electrical Engineering and Bioengineering; Dino Di Carlo, professor of bioengineering and mechanical and aerospace engineering; and Omai Garner, assistant professor of pathology and medicine at the David Geffen School of Medicine at UCLA.

    The UCLA researchers focused on the challenges with low-cost optical detection. Small changes in light emitted from molecules that associate with DNA, called intercalator dyes, are used to identify DNA amplification, but these dyes are unstable and their changes are too dim for standard cellphone camera sensors.

    But the team discovered an additive that stabilized the intercalator dyes and generated a large increase in fluorescent signal above the background light level, enabling the test to be integrated with inexpensive cellphone based detection methods. The combined novel dye/cellphone reader system achieved comparable results to equipment costing tens of thousands of dollars more.

    To adapt a cellphone to detect the light produced from dyes associated with amplified DNA while those samples are in standard laboratory containers, such as well plates, the team developed a cost-effective, field-portable fiber optic bundle. The fibers in the bundle routed the signal from each well in the plate to a unique location of the camera sensor area. This handheld reader is able to provide comparable results to standard benchtop readers, but at a fraction of the cost, which the authors suggest is a promising sign that the reader could be applied to other fluorescence-based diagnostic tests.
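    The key idea is that each well’s fiber delivers its light to a fixed, known patch of the camera sensor, so readout reduces to averaging pixel intensities over those patches. The sketch below is a generic illustration with made-up spot coordinates and a synthetic test frame, not the UCLA group’s analysis code.

```python
import numpy as np

def well_intensities(image, well_centers, radius):
    """Mean pixel intensity in a circular patch around each well's mapped
    location on the camera sensor.

    image        : 2D numpy array of pixel values from the cellphone camera
    well_centers : dict mapping well name -> (row, col) of its fiber's spot
    radius       : radius in pixels of the patch to average over
    """
    rows, cols = np.indices(image.shape)
    readings = {}
    for well, (r0, c0) in well_centers.items():
        mask = (rows - r0) ** 2 + (cols - c0) ** 2 <= radius ** 2
        readings[well] = float(image[mask].mean())
    return readings

if __name__ == "__main__":
    # Hypothetical sensor frame and fiber-spot locations, for illustration only.
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 40, size=(480, 640)).astype(float)   # dim background
    centers = {"A1": (100, 120), "A2": (100, 220), "B1": (200, 120), "B2": (200, 220)}
    frame[95:105, 215:225] += 180    # pretend well A2 lit up (amplified DNA present)

    for well, value in well_intensities(frame, centers, radius=8).items():
        print(f"{well}: mean intensity {value:.1f}")
```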

    “Currently nucleic acid amplification tests have issues generating a stable and high signal, which often necessitates the use of calibration dyes and samples which can be limiting for point-of-care use,” Di Carlo said. “The unique dye combination overcomes these issues and is able to generate a thermally stable signal, with a much higher signal to noise ratio. The DNA amplification curves we see look beautiful — without any of the normalization and calibration, which is usually performed, to get to the point that we start at.”

    Additionally, the authors emphasized that the dye combinations discovered should be able to be used universally to detect any nucleic acid amplification, allowing for their use in a multitude of other amplification approaches and tests.

    The team demonstrated the approach as a proof of concept using a process called loop-mediated isothermal amplification, or LAMP, with DNA from lambda phage as the target molecule, and now plans to adapt the assay to complex clinical samples and nucleic acids associated with pathogens such as influenza.

    The newest demonstration is part of a suite of technologies developed by the UCLA team aimed at democratizing disease diagnosis, including low-cost optical readout and diagnostics based on consumer-electronic devices, microfluidic-based automation, and molecular assays leveraging DNA nanotechnology.

    This interdisciplinary work was supported through a team science grant from the National Science Foundation Emerging Frontiers in Research and Innovation program.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC LA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

     
  • richardmitnick 7:58 am on March 21, 2017 Permalink | Reply
    Tags: , Incomplete descriptions are a waste of healthcare research, Medicine,   

    From Nature Index: “Incomplete descriptions are a waste of healthcare research” 

    Nature Mag
    Nature

    10 March 2017
    Tammy Hoffmann


    Most of us have probably tried to recreate a meal we’ve enjoyed in a restaurant. But would you attempt it without a recipe? And if you had to guess most of the ingredients, how confident would you be about the end result?

    Worryingly, a similar situation frequently occurs in healthcare. Health professionals often have to guess the details of the intervention that will help their patients. The intervention might be a drug, or a non-drug treatment such as exercise, psychosocial support, or dietary advice.

    Part of this problem lies with researchers who inadequately describe interventions in research reports — one of the contributors to waste in research. Estimates suggest up to 85% of health research is wasted.

    For an intervention to be useful in practice, clinicians need to know details such as: when and how much of the intervention is delivered (e.g. intensity, number and schedule of sessions), who provides the intervention (including any training they receive), the actual steps involved in providing the intervention, and any materials (informational or physical) needed as part of it.
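    To make that list concrete, here is one hedged illustration of what a reasonably complete description might capture for a hypothetical exercise intervention, written as a simple Python data structure; the fields loosely mirror the elements above, and none of the specifics come from an actual trial.

```python
# Hypothetical example of the details a complete intervention description should capture.
intervention_description = {
    "name": "Supervised home exercise programme",          # what the intervention is called
    "materials": ["illustrated exercise booklet", "resistance band"],
    "provider": {
        "profession": "physiotherapist",
        "training": "half-day workshop on delivering the programme",
    },
    "procedures": [
        "baseline assessment of strength and balance",
        "individually tailored exercise plan",
        "telephone follow-up to review progress",
    ],
    "schedule": {
        "sessions": 8,
        "frequency": "weekly",
        "duration_minutes": 45,
        "intensity": "moderate, progressed as tolerated",
    },
    "delivery_mode": "one-to-one, at the patient's home",
}

# With every element recorded, a clinician (or another researcher) could
# reproduce the intervention without guessing at the 'recipe'.
for key, value in intervention_description.items():
    print(f"{key}: {value}")
```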

    But these crucial details are missing in up to 60% of trials of non-drug interventions. While the problem is more common in studies of non-drug interventions, it also occurs in drug studies. In a 2015 study of cancer chemotherapy trials, only 11% reported all the essential elements of the interventions.

    Why does this problem occur?

    Authors of research reports are often unaware of what a complete description of an intervention means. Upon request, they will often provide missing details.

    But deficiencies in intervention reporting are often not detected by peer reviewers or editors. Some authors do not provide intervention details as they may not have access to all the details, or have concerns about word limits or the copyright implications of sharing intervention materials.

    For such a prevalent problem, the issue of inadequate intervention reporting has received little attention until recently.

    In 2014, my colleagues and I published in BMJ a reporting guide to increase authors’ awareness of the problem and provide them with a checklist for the essential elements of an intervention description.

    The guide makes it easier for authors to describe their interventions, for reviewers and editors to assess the descriptions, and for readers to use the information.

    When including all intervention details in the main paper itself isn’t possible, the guide encourages authors to use other means to provide this information and to state where it can be located. This includes online supplementary materials, permitted in about 75% of journals, additional journal articles, study or university websites, or online repositories such as Figshare.

    While the guide implores researchers to include intervention details in new papers, the information remains missing in many studies that have already been published. For drug interventions, doctors at least have access to drug formularies, where they can look up basic information, such as the active ingredient, dose, and route of administration.

    Equivalent resources for non-drug interventions typically do not exist. One exception is a new handbook recently developed by the Royal Australian College of General Practitioners, which gives general practitioners the details they need to provide evidence-based non-drug interventions.

    Authors, reviewers, and editors have a responsibility to improve the completeness of intervention reporting. Without adequate details, clinicians, patients, and policymakers cannot reliably use effective interventions, and researchers are unable to replicate or build upon the research. Given the cost of conducting trials, this is an enormous waste.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     