Tagged: Applied Research

  • richardmitnick 7:38 am on March 30, 2017
    Tags: Applied Research

    From U Washington: “Tackling resilience: Finding order in chaos to help buffer against climate change” 

    University of Washington

    March 29, 2017
    Michelle Ma

    Lotus flowers on a delta island on the outer reaches of the Mississippi delta, which is in danger of drastically shrinking or disappearing. The islands are actually quite resilient, as seen in part by the vegetation growth. Britta Timpane-Padgham/NWFSC

    “Resilience” is a buzzword often used in scientific literature to describe how animals, plants and landscapes can persist under climate change. It’s typically considered a good quality, suggesting that those with resilience can withstand or adapt as the climate continues to change.

    But when it comes to actually figuring out what makes a species or an entire ecosystem resilient ― and how to promote that through restoration or management ― there is a lack of consensus in the scientific community.

    A new paper by the University of Washington and NOAA’s Northwest Fisheries Science Center aims to provide clarity among scientists, resource managers and planners on what ecological resilience means and how it can be achieved. The study, published this month in the journal PLOS ONE, is the first to examine the topic in the context of ecological restoration and identify ways that resilience can be measured and achieved at different scales.

    “I was really interested in translating a broad concept like resilience into management or restoration actions,” said lead author Britta Timpane-Padgham, a fisheries biologist at Northwest Fisheries Science Center who completed the study as part of her graduate degree in marine and environmental affairs at the UW.

    “I wanted to do something that addressed impacts of climate change and connected the science with management and restoration efforts.”

    Timpane-Padgham scoured the scientific literature for all mentions of ecological resilience, then pared down the list of relevant articles to 170 examined for this study. She then identified in each paper the common attributes, or metrics, that contribute to resilience among species, populations or ecosystems. For example, genetic diversity and population density were commonly mentioned in the literature as attributes that help populations either recover from or resist disturbance.

    Timpane-Padgham along with co-authors Terrie Klinger, professor and director of the UW’s School of Marine and Environmental Affairs, and Tim Beechie, research biologist at Northwest Fisheries Science Center, grouped the various resilience attributes into five large categories, based on whether they affected individual plants or animals; whole populations; entire communities of plants and animals; ecosystems; or ecological processes. They then listed how many times each attribute was cited, which is one indicator of how well-suited a particular attribute is for measuring resilience.
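
    The tallying step described above amounts to a simple tabular analysis: tag each attribute with the ecological scale it applies to, then rank attributes by how often the literature cites them. The sketch below is a minimal illustration of that workflow, not the authors' actual code; the attribute names, scales and counts are hypothetical stand-ins for the published table.

        import pandas as pd

        # Hypothetical records: each resilience attribute is tagged with the ecological
        # scale it applies to and the number of reviewed papers that cited it.
        records = [
            {"attribute": "genetic diversity",    "scale": "population", "citations": 24},
            {"attribute": "population density",   "scale": "population", "citations": 18},
            {"attribute": "species richness",     "scale": "community",  "citations": 15},
            {"attribute": "habitat connectivity", "scale": "ecosystem",  "citations": 12},
        ]
        df = pd.DataFrame(records)

        # Group attributes by scale and rank them by citation count, mirroring the paper's
        # use of citation frequency as one indicator of how well-suited an attribute is.
        ranked = df.sort_values("citations", ascending=False)
        for scale, group in ranked.groupby("scale", sort=False):
            print(scale, "->", list(group["attribute"]))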

    The Kissimmee River in central Florida. This ecosystem-scale restoration project began two decades ago and is used as an example in the study. South Florida Water Management District

    “It’s a very nice way of organizing what was sort of a confused body of literature,” Beechie said. “It will at least allow people to get their heads around resilience and understand what it really is and what things you can actually measure.”

    The researchers say this work could be useful for people who manage ecosystem restoration projects and want to improve the chances of success under climate change. They could pick from the ordered list of attributes that relate specifically to their project and begin incorporating tactics that promote resilience from the start.

    “Specifying resilience attributes that are appropriate for the system and that can be measured repeatably will help move resilience from concept to practice,” Klinger said.

    For example, with Puget Sound salmon recovery, managers are asking how climate change will alter various rivers’ temperatures, flow levels and nutrient content. Because salmon recovery includes individual species, entire populations and the surrounding ecosystem, many resilience attributes are being used to monitor the status of the fish and recovery of the river ecosystems that support them.

    The list of attributes that track resilience can be downloaded and sorted by managers to find the most relevant measures for the type of restoration project they are tackling. It is increasingly common to account for climate change in project plans, the researchers said, but more foresight and planning at the start of a project is crucial.

    “The threat of climate change and its impacts is a considerable issue that should be looked at from the beginning of a restoration project. It needs to be its own planning objective,” Timpane-Padgham said. “With this paper, I don’t want to have something that will be published and collect dust. It’s about providing something that will be useful for people.”

    No external funding was used for this study.

    Download the spreadsheet to find the best resilience measures for your project (click on the second file in the carousel titled Interactive decision support table).

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, the university brings students and faculty together to turn ideas into impact and, in the process, transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 7:22 am on March 29, 2017
    Tags: Applied Research, Stanford Extreme Environment Microsystems Laboratory

    From Stanford: “New nano devices could withstand extreme environments in space and on earth” 

    Stanford University

    March 28, 2017
    Ula Chrobak

    Professor Debbie Senesky, left, works with graduate student Caitlin Chapin on electronics that can resist extreme environments. (Image credit: L.A. Cicero)

    Behind its thick swirling clouds, Venus is hiding a hot surface pelted with sulfuric acid rains. At 480 degrees C, the planet’s atmosphere would fry any of today’s electronics, posing a challenge to scientists hoping to study this extreme environment.

    Researchers at the Stanford Extreme Environment Microsystems Laboratory, or the XLab, are on a mission to conquer these conditions. By developing heat-, corrosion- and radiation-resistant electronics, they hope to move research into extreme places in the universe – including here on Earth. And it all starts with tiny, nano-scale slices of material.

    “I think it’s important to understand and gain new insight through probing these unique environments,” said Debbie Senesky, assistant professor of aeronautics and astronautics and principal investigator at the XLab.

    Senesky hopes that by studying Venus we can better understand our own world. While it’s hard to imagine that hot and corrosive Venus ever looked like Earth, scientists think that it used to be much cooler. Billions of years ago, a runaway greenhouse effect may have caused the planet to absorb far more heat than it could reflect, creating today’s scorching conditions. Understanding how Venus got so hot can help us learn about our atmosphere.

    “If we can understand the history of Venus, maybe we can understand and positively impact the future evolution of our own habitat,” said Senesky.

    What’s more, devices that can withstand the rigors of space travel could also monitor equally challenging conditions here on Earth, such as in our cars.

    Scorching heat

    One hurdle to studying extreme environments is the heat. Silicon-based semiconductors, which power our smartphones and laptops, stop working at about 300 degrees C. As they heat up, the metal parts begin to melt into the neighboring semiconductor and don’t move electricity as efficiently.

    Ateeq Suria, graduate student in mechanical engineering, is one of the people at the XLab working to overcome this temperature barrier. To do that, he hopped into his bunny suit — overall lab apparel that prevents contamination — and made use of ultra-clean work spaces to create an atoms-thick, heat-resistant layer that can coat devices and allow them to work at up to 600 degrees C in air [sorry, no image].

    “The diameter of human hair is about 70 micrometers,” said Suria. “These coatings are about a hundredth of that width.”

    Suria and others at the XLab are working to improve these nano-devices, testing materials at temperatures of up to 900 degrees C. For space electronics, it’s a key step in understanding how they survive for long periods of time. Although a device might not be exposed to such temperature extremes in space, the test conditions rapidly age materials, indicating how long they could last.
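
    The logic of stress-testing well above the intended operating temperature is usually captured with an accelerated-aging model such as the Arrhenius relation. The snippet below is a generic illustration of that idea, not the XLab's own analysis; the activation energy is an assumed value, while the 600 and 900 degrees C figures come from the article.

        import math

        K_B = 8.617e-5  # Boltzmann constant, eV per kelvin

        def acceleration_factor(e_a_ev, t_use_c, t_stress_c):
            """Arrhenius acceleration factor between a use temperature and a stress temperature."""
            t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
            return math.exp((e_a_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

        # Assumed activation energy of 1.0 eV; 600 C intended operation, 900 C stress test.
        af = acceleration_factor(1.0, 600.0, 900.0)
        print(f"One hour at 900 C ages the material roughly like {af:.0f} hours at 600 C")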

    The team at XLab tests materials and nano-devices they create either in-house in high-temperature probe stations or in a Venus simulator at the NASA Glenn Research Center in Cleveland, Ohio. That simulator mimics the pressure, chemistry and temperature of Venus. To mirror the effects of space radiation, they also test materials at Los Alamos National Laboratory and at NASA Ames Research Center.

    Radiation damage

    More than just surviving on Venus, getting there is important, too. Objects in space are pounded by a flurry of gamma and proton radiation that knock atoms around and degrade materials. Preliminary work at the XLab demonstrates that sensors they’ve developed could survive up to 50 years of radiation bombardment while in Earth’s orbit.

    Senesky said that if their fabrication process for nano-scale materials proves effective it could get incorporated into technologies being launched into space.

    “I’m super excited about the possibility of NASA adopting our technology in the design of their probes and landers,” said Senesky.

    Hot electronics at home

    While space is an exciting frontier, Suria said that interest in understanding car engines initially fueled this research. Inside an engine, temperatures reach up to 1000 degrees C, and the outer surface of a piston is 600 degrees C. Current technology to monitor and optimize engine performance can’t handle this heat, introducing error because measuring devices have to be placed far away from the pistons.

    Electronics designed to survive the intense conditions of space could be placed next to the engine’s pistons to directly monitor performance and improve efficiency.

    “You just put the sensor right in the engine and get much better information out,” said Suria.

    Other fiery, high pressure earth-bound environments that would benefit from these robust electronics include oil and gas wellbores, geothermal vents, aircraft engines, gas turbines and hypersonic structures.

    Media Contacts

    Amy Adams, Stanford News Service; (650) 796-3695, amyadams@stanford.edu

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

     
  • richardmitnick 11:44 am on March 28, 2017
    Tags: Applied Research, Data Science

    From Harvard: “Harvard launches data science initiative” 

    Harvard University

    March 28, 2017

    Francesca Dominici and David Parkes named co-directors

    The new Harvard Data Science Initiative, led by co-directors David C. Parkes (left), George F. Colony Professor of Computer Science, and Francesca Dominici, senior associate dean for research at the Harvard Chan School, will unite efforts across the University to enable the development of cross-disciplinary methodologies and discovery of new applications. Kris Snibbe/Harvard Staff Photographer

    A statistician and a computer scientist have been named co-leaders of Harvard’s new Data Science Initiative, the Harvard University Office of the Vice Provost for Research announced today.

    A University-wide program that will aid cross-disciplinary collaboration, the initiative will be led by Francesca Dominici, professor of biostatistics at the Harvard T.H. Chan School of Public Health, and David C. Parkes, George F. Colony Professor and area dean for computer science at the Harvard John A. Paulson School of Engineering and Applied Sciences.

    “With its diversity of disciplines, Harvard has access to large data sets that record a staggering array of phenomena,” said Provost Alan Garber. “Researchers in our Schools of medicine, public health, business, law, arts and sciences, government, education, and engineering are already gaining deep insights from their analyses of large data sets. This initiative will connect research efforts across the University in this emerging field. It will facilitate cross-fertilization in both teaching and research, paving the way to methodological innovations and to applications of these new tools to a wide range of societal and scientific challenges.”

    As massive amounts of data are generated from science, engineering, social sciences, and medicine — and even from digitally augmented daily lives — researchers are grappling with how to make sense of all this information, and how to use it to benefit people. Data science applies the theory and practice of statistics and computer science to extract useful knowledge from complex and often messy information sources. Applications span health care, the environment, commerce, government services, urban planning, and finance. The initiative will make it possible to take methodology and tools from one domain to another and discover new applications.

    Until now, Harvard’s growth in data science has been organic, occurring in distinct domains and an increasing array of applications. The initiative will unite efforts. A steering committee led the planning, involving 55 faculty members and many of Harvard’s data science leaders.

    The initiative has already launched the Harvard Data Science Postdoctoral Fellowship program, which will support up to seven scholars over two years. The fellows’ interests span data science, broadly construed, and include both methodological and applications-focused research.

    The first cohort of fellows will arrive in the fall; they will direct their own research while forging collaborations around the University. The program will offer numerous opportunities to engage with the broader data science community through events such as seminar series, informal lunches, mentoring, and fellow-led and other networking opportunities.

    The initiative has also launched the Harvard Data Science Initiative Competitive Research Fund, which invites innovative ideas from those with interests that span data science, including methodological foundations and the development of quantitative methods and tools motivated by application challenges.

    In addition, three master’s degree programs have been approved. The Medical School offers a master’s degree in biomedical informatics, and the Harvard Chan School has a master’s of science in health data science. A master’s in data science, based in the Faculty of Arts and Sciences and offered jointly by Computer Science and Statistics, is planned for the fall of 2018.

    “The ability to apply the power of new analytics and new methodologies in revolutionary ways makes this the era of data science, and Harvard faculty have been at the forefront of this emerging field,” said Vice Provost for Research Rick McCullough. “Our researchers not only develop new methodologies, but also apply those methodologies to incredible effect. I am delighted that Francesca Dominici and David Parkes will be co-directing this new effort. They are both extraordinary scientists and exemplary colleagues.”

    Dominici specializes in developing statistical methods to analyze large and complex data sets. She leads multiple interdisciplinary groups of scientists addressing questions in environmental health science, climate change, and health policy.

    “Harvard’s Data Science Initiative will build on the collaborations that already exist across the University to foster a rich and cohesive data science community that brings together scholars from across disciplines and schools,” Dominici said. “I am delighted to be a part of an effort that pushes the frontiers of this important discipline and extends our ability to use data science for the good of people everywhere.”

    Parkes leads research at the interface between economics and computer science, with a focus on multi-agent systems, artificial intelligence, and machine learning.

    “The Data Science Initiative will strengthen the fabric of connections among departments to create an integrated data science community,” Parkes said. “Through these efforts, we seek to empower research progress and education across the University, and work toward solutions for the world’s most important challenges. I look forward to being a part of this exciting work.”

    The Data Science Steering Committee, in addition to Dominici and Parkes, includes:

    Alyssa Goodman, professor of applied astronomy, Faculty of Arts and Sciences;
    Gary King, director, Harvard Institute for Quantitative Social Science;
    Zak Kohane, chair of the Department of Biomedical Informatics, Harvard Medical School;
    Xihong Lin, chair of the Department of Biostatistics, Harvard Chan School;
    Anne Margulies, University chief information officer;
    Hanspeter Pfister, professor of computer science, Harvard Paulson School;
    Neil Shephard, chair of the Department of Economics and of Statistics, Faculty of Arts and Sciences.

    For more information about the initiative, visit datascience.harvard.edu.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 7:19 am on March 28, 2017
    Tags: Applied Research

    From NYT: “A Dream of Clean Energy at a Very High Price”, a Now Too Old Subject 

    The New York Times

    MARCH 27, 2017
    HENRY FOUNTAIN

    Source: ITER Organization Mika Gröndahl/The New York Times

    SAINT-PAUL-LEZ-DURANCE, France — At a dusty construction site here amid the limestone ridges of Provence, workers scurry around immense slabs of concrete arranged in a ring like a modern-day Stonehenge.

    It looks like the beginnings of a large commercial power plant, but it is not. The project, called ITER, is an enormous, and enormously complex and costly, physics experiment. But if it succeeds, it could determine the power plants of the future and make an invaluable contribution to reducing planet-warming emissions.

    ITER, short for International Thermonuclear Experimental Reactor (and pronounced EAT-er), is being built to test a long-held dream: that nuclear fusion, the atomic reaction that takes place in the sun and in hydrogen bombs, can be controlled to generate power.

    First discussed in 1985 at a United States-Soviet Union summit, the multinational effort, in which the European Union has a 45 percent stake and the United States, Russia, China and three other partners 9 percent each, has long been cited as a crucial step toward a future of near-limitless electric power.

    ITER will produce heat, not electricity. But if it works — if it produces more energy than it consumes, which smaller fusion experiments so far have not been able to do — it could lead to plants that generate electricity without the climate-affecting carbon emissions of fossil-fuel plants or most of the hazards of existing nuclear reactors that split atoms rather than join them.

    Success, however, has always seemed just a few decades away for ITER. The project has progressed in fits and starts for years, plagued by design and management problems that have led to long delays and ballooning costs.

    ITER is moving ahead now, with a director-general, Bernard Bigot, who took over two years ago after an independent analysis that was highly critical of the project. Dr. Bigot, who previously ran France’s atomic energy agency, has earned high marks for resolving management problems and developing a realistic schedule based more on physics and engineering and less on politics.

    “I do believe we are moving at full speed and maybe accelerating,” Dr. Bigot said in an interview.

    The site here is now studded with tower cranes as crews work on the concrete structures that will support and surround the heart of the experiment, a doughnut-shaped chamber called a tokamak. This is where the fusion reactions will take place, within a plasma, a roiling cloud of ionized atoms so hot that it can be contained only by extremely strong magnetic fields.

    Pieces of the tokamak and other components, including giant superconducting electromagnets and a structure approximately 100 feet in diameter and 100 feet tall that will be the largest stainless-steel vacuum vessel ever made, are being fabricated in the participating countries. Assembly is set to begin next year in a giant hall erected next to the tokamak site.

    At the ITER construction site, immense slabs of concrete lie in a ring like a modern-day Stonehenge. Credit ITER Organization

    There are major technical hurdles in a project where the manufacturing and construction are on the scale of shipbuilding but the parts need to fit with the precision of a fine watch.

    “It’s a challenge,” said Dr. Bigot, who devotes much of his time to issues related to integrating parts from various countries. “We need to be very sensitive about quality.”

    Even if the project proceeds smoothly, the goal of “first plasma,” using pure hydrogen that does not undergo fusion, would not be reached for another eight years. A so-called burning plasma, which contains a fraction of an ounce of fusible fuel in the form of two hydrogen isotopes, deuterium and tritium, and can be sustained for perhaps six or seven minutes and release large amounts of energy, would not be achieved until 2035 at the earliest.

    That is a half century after the subject of cooperating on a fusion project came up at a meeting in Geneva between President Ronald Reagan and the Soviet leader Mikhail S. Gorbachev. A functional commercial fusion power plant would be even further down the road.

    “Fusion is very hard,” said Riccardo Betti, a researcher at the University of Rochester who has followed the ITER project for years. “Plasma is not your friend. It tries to do everything it can to really displease you.”

    Fusion is also very expensive. ITER estimates the cost of design and construction at about 20 billion euros (currently about $22 billion). But the actual cost of components may be higher in some of the participating countries, like the United States, because of high labor costs. The eventual total United States contribution, which includes an enormous central electromagnet capable, it is said, of lifting an aircraft carrier, has been estimated at about $4 billion.

    Despite the recent progress there are still plenty of doubts about ITER, especially in the United States, which left the project for five years at the turn of the century and where funding through the Energy Department has long been a political football.

    The department confirmed its support for ITER in a report last year and Congress approved $115 million for it. It is unclear, though, how the project will fare in the Trump administration, which has proposed a cut of roughly 20 percent to the department’s Office of Science, which funds basic research including ITER. (The department also funds another long-troubled fusion project, which uses lasers, at Lawrence Livermore National Laboratory in California.)

    Dr. Bigot met with the new energy secretary, Rick Perry, last week in Washington, and said he found Mr. Perry “very open to listening” about ITER and its long-term goals. “But he has to make some short-term choices” with his budget, Dr. Bigot said.

    Energy Department press aides did not respond to requests for comment.

    Some in Congress, including Senator Lamar Alexander, Republican of Tennessee, while lauding Dr. Bigot’s efforts, argue that the project already consumes too much of the Energy Department’s basic research budget of about $5 billion.

    “I remain concerned that continuing to support the ITER project would come at the expense of other Office of Science priorities that the Department of Energy has said are more important — and that I consider more important,” Mr. Alexander said in a statement.

    While it is not clear what would happen to the project if the United States withdrew, Dr. Bigot argues that it is in every participating country’s interest to see it through. “You have a chance to know if fusion works or not,” he said. “If you miss this chance, maybe it will never come again.”

    But even scientists who support ITER are concerned about the impact it has on other research.

    “People around the country who work on projects that are the scientific basis for fusion are worried that they’re in a no-win situation,” said William Dorland, a physicist at the University of Maryland who is chairman of the plasma science committee of the National Academy of Sciences. “If ITER goes forward, it might eat up all the money. If it doesn’t expand and the U.S. pulls out, it may pull down a lot of good science in the downdraft.”

    In the ITER tokamak, deuterium and tritium nuclei will fuse to form helium, losing a small amount of mass that is converted into a huge amount of energy. Most of the energy will be carried away by neutrons, which will escape the plasma and strike the walls of the tokamak, producing heat.
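
    Written out, this is the standard deuterium-tritium reaction, with its well-established energy split between the helium nucleus and the neutron:

        \mathrm{D} + \mathrm{T} \;\rightarrow\; {}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + n\,(14.1\ \mathrm{MeV})

    The neutron carries roughly 80 percent of the released energy, which is why most of the heat appears in the tokamak walls rather than in the plasma itself.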

    In a fusion power plant, that heat would be used to make steam to turn a turbine to generate electricity, much as existing power plants do using other sources of heat, like burning coal. ITER’s heat will be dissipated through cooling towers.

    There is no risk of a runaway reaction and meltdown as with nuclear fission and, while radioactive waste is produced, it is not nearly as long-lived as the spent fuel rods and irradiated components of a fission reactor.

    To fuse, atomic nuclei must move very fast — they must be extremely hot — to overcome natural repulsive forces and collide. In the sun, the extreme gravitational field does much of the work. Nuclei need to be at a temperature of about 15 million degrees Celsius.

    In a tokamak, without such a strong gravitational pull, the atoms need to be about 10 times hotter, roughly 150 million degrees Celsius. So enormous amounts of energy are required to heat the plasma, using pulsating magnetic fields and other sources like microwaves. Just a few feet away, on the other hand, the windings of the superconducting electromagnets need to be cooled to a few degrees above absolute zero. Needless to say, the material and technical challenges are extreme.

    Although all fusion reactors to date have produced less energy than they use, physicists are expecting that ITER will benefit from its larger size, and will produce about 10 times more power than it consumes. But they will face many challenges, chief among them developing the ability to prevent instabilities in the edges of the plasma that can damage the experiment.
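
    In the shorthand used for fusion experiments, "10 times more power than it consumes" corresponds to a target fusion gain of

        Q = \frac{P_{\text{fusion}}}{P_{\text{heating}}} \approx 10,

    which for ITER is usually quoted as roughly 500 megawatts of fusion power from about 50 megawatts of plasma heating. That is a design target, not yet a demonstrated result.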

    Even in its early stages of construction, the project seems overwhelmingly complex. Embedded in the concrete surfaces are thousands of steel plates. They seem to be scattered at random throughout the structure, but actually are precisely located. ITER is being built to French nuclear plant standards, which prohibit drilling into concrete. So the plates — eventually about 80,000 of them — are where other components of the structure will be attached as construction progresses.

    A mistake or two now could wreak havoc a few years down the road, but Dr. Bigot said that in this and other work on ITER, the key to avoiding errors was taking time.

    “People consider that it’s long,” he said, referring to critics of the project timetable. “But if you want full control of quality, you need time.”

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 7:02 am on March 27, 2017
    Tags: Applied Research

    From UC Riverside: “Researchers Crack Structure of Key Protein in Zika Virus” 

    UC Riverside

    March 27, 2017
    Iqbal Pittalwala

    The image shows the crystal structure of ZIKV NS5 protein. The regions with different colors represent individual domains or motifs of ZIKV NS5. The black circle marks the location of the potential inhibitor-binding site. Image credit: Song lab, UC Riverside.

    Zika virus (ZIKV), which causes Zika virus disease, is spread to people primarily through the bite of an infected Aedes aegypti or Aedes albopictus mosquito. An infected pregnant woman can pass ZIKV to her fetus during pregnancy or around the time of birth. Sex is yet another way for infected persons to transmit ZIKV to others.

    The genomic replication of the virus is made possible by its “NS5” protein. This function of ZIKV NS5 is unique to the virus, making it an ideal target for anti-viral drug development. Currently, there is no vaccine or medicine to fight ZIKV infection.

    In a research paper just published in Nature Communications, University of California, Riverside scientists report that they have determined the crystal structure of the entire ZIKV NS5 protein and demonstrated that NS5 is functional when purified in vitro. Knowing the structure of ZIKV NS5 helps the researchers understand how ZIKV replicates itself.

    Furthermore, the researchers’ structural analysis of ZIKV NS5 reveals a potential binding site in the protein for an inhibitor, thereby providing a strong basis for developing potential inhibitors against ZIKV NS5 to suppress ZIKV infection. The identification of the inhibitor-binding site of NS5 can now enable scientists to design potentially potent drugs to fight ZIKV.

    “We started this work realizing that the full structure of ZIKV NS5 was missing,” said Jikui Song, an assistant professor of biochemistry, who co-led the research with Rong Hai, an assistant professor of plant pathology and microbiology. “The main challenge for us occurred during the protein’s purification process when ZIKV NS5 got degraded – chopped up – by bacterial enzymes.”

    Song, Hai and their colleagues overcame this challenge by developing an efficient protocol for protein purification, which in essence minimizes the purification time for NS5.

    “Our work provides a framework for future studies of ZIKV NS5 and opportunities for drug development against ZIKV based on its structural similarity to the NS5 protein of other flaviviruses, such as the dengue virus,” Hai said. “No doubt, ZIKV therapeutics can benefit from the wealth of knowledge that has already been generated in the dengue virus field.”

    Next, the researchers plan to investigate the antiviral potential on ZIKV NS5 of a chemical compound that has been shown to work effectively in inhibiting the NS5 protein in the dengue virus.

    Song and Hai were joined in the research by graduate students Boxiao Wang (first author), Xiao-Feng Tan, Stephanie Thurmond, Zhi-Min Zhang, and Asher Lin.

    The research was supported by grants to Song from the March of Dimes Foundation, the Sidney Kimmel Foundation for Cancer Research and the National Institutes of Health.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of California, Riverside is one of 10 universities within the prestigious University of California system, and the only UC located in Inland Southern California.

    Widely recognized as one of the most ethnically diverse research universities in the nation, UCR’s current enrollment is more than 21,000 students, with a goal of 25,000 students by 2020. The campus is in the midst of a tremendous growth spurt with new and remodeled facilities coming on-line on a regular basis.

    We are located approximately 50 miles east of downtown Los Angeles. UCR is also within easy driving distance of dozens of major cultural and recreational sites, as well as desert, mountain and coastal destinations.

     
  • richardmitnick 6:25 am on March 27, 2017
    Tags: "Cancer Biology Reproducibility Project Sees Mixed Results" Read it and Weep, Applied Research, , Cancer Biology Reproducibility Project Sees Mixed Results, ,   

    From NOVA: “Cancer Biology Reproducibility Project Sees Mixed Results” Read it and Weep 

    NOVA

    18 Jan 2017 [Don’t know how I missed this, or maybe they never put it up in social media before?]
    Courtney Humphries

    How trustworthy are the findings from scientific studies?

    A growing chorus of researchers says there’s a “reproducibility crisis” in science, with too many discoveries published that may be flukes or exaggerations. Now, an ambitious project to test the reproducibility of top studies in cancer research by independent laboratories has published its first five studies in the open-access journal eLife.

    “These are the first public replication studies conducted in biomedical science, and that in itself is a huge achievement,” says Elizabeth Iorns, CEO of Science Exchange and one of the project’s leaders.

    Cancer biology is just one of many fields being scrutinized for the reproducibility of its studies.

    The Reproducibility Project: Cancer Biology is a collaboration between the non-profit Center for Open Science and the for-profit Science Exchange, which runs a network of laboratories for outsourcing biomedical research. It began in 2013 with the goal of repeating experiments from top-cited cancer papers; all of the work has been planned, executed, and published in the open, in consultation with the studies’ original authors. These papers are the first of many underway and slated to be published in the coming months.

    The outcome so far has been mixed, the project leaders say. While some results are similar, none of the studies looks exactly like the original, says Tim Errington, the project’s manager. “They’re all different in some way. They’re all different in different ways.” In some studies, the experimental system didn’t behave the same. In others, the result was slightly different, or it did not hold up under the statistical scrutiny project leaders used to analyze results. All in all, project leaders report, one study failed to reproduce the original finding, two supported key aspects of the original papers, and two were inconclusive because of technical issues.

    Errington says the goal is not to single out any individual study as replicable or not. “Our intent with this project is to perform these direct replications so that we can understand collectively how reproducible our research is,” he says.

    Indeed, there are no agreed-upon criteria for judging whether a replication is successful. At the project’s end, he says, the team will analyze the replication studies collectively by several different standards—including simply asking scientists what they think. “We’re not going to force an agreement—we’re trying to create a discussion,” he says.

    The project has been controversial; some cancer biologists say it’s designed to make them look bad at a time when federal research funding is under threat. Others have praised it for tackling a system that rewards shoddy research. If the first papers are any indication, those arguments won’t be easily settled. So far, the studies provide a window into the challenges of redoing complex laboratory studies. They also underscore that, if cancer biologists want to improve the reproducibility of their research, they will have to agree on a definition of success.

    An Epidemic?

    A recent survey in Nature of more than 1,500 researchers found that 70% have tried and failed to reproduce others’ experiments, and that half have failed to reproduce their own. But you wouldn’t know it by reading published studies. Academic scientists are under pressure to publish new findings, not replicate old research. There’s little funding earmarked toward repeating studies, and journals favor publishing novel discoveries. Science relies on a gradual accumulation of studies that test hypotheses in new ways. If one lab makes a discovery using cell lines, for instance, the same lab or another lab might investigate the phenomenon in mice. In this way, one study extends and builds on what came before.

    For many researchers, that approach—called conceptual replication, which gives supporting evidence for a previous study’s conclusion using another model—is enough. But a growing number of scientists have been advocating for repeating influential studies. Such direct replications, Errington says, “will allow us to understand how reliable each piece of evidence we have is.” Replications could improve the efficiency of future research by winnowing out false hypotheses early and help scientists recreate others’ work in order to build on it.

    In the field of cancer research, some of the pressure to improve reproducibility has come from the pharmaceutical industry, where investing in a spurious hypothesis or therapy can threaten profits. In a 2012 commentary in Nature, cancer scientists Glenn Begley and Lee Ellis wrote that they had tried to reproduce 53 high-profile cancer studies while working at the pharmaceutical company Amgen, and succeeded with just six. A year earlier, scientists at Bayer HealthCare announced that they could replicate only 20–25% of 47 cancer studies. But confidentiality rules prevented both teams from sharing data from those attempts, making it difficult for the larger scientific community to assess their results.
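
    Taken at face value, the two industry efforts quoted above imply strikingly low replication rates. A quick back-of-the-envelope check, using only the numbers in this article:

        # Figures quoted above: Amgen reproduced 6 of 53 studies; Bayer reported 20-25% of 47.
        amgen_rate = 6 / 53
        bayer_low, bayer_high = 0.20 * 47, 0.25 * 47
        print(f"Amgen: {amgen_rate:.0%} of studies reproduced")                              # about 11%
        print(f"Bayer: roughly {bayer_low:.0f}-{bayer_high:.0f} of 47 studies reproduced")   # about 9-12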

    ‘No Easy Task’

    Enter the Reproducibility Project: Cancer Biology. It was launched with a $1.3 million grant from the Laura and John Arnold Foundation to redo key experiments from 50 landmark cancer papers from 2010 to 2012. The work is carried out in the laboratory network of Science Exchange, a Palo Alto-based startup, and the results tracked and made available through a data-sharing platform developed by the Center for Open Science. Statisticians help design the experiments to yield rigorous results. The protocols of each experiment have been peer-reviewed and published separately as a registered report beforehand, which advocates say prevents scientists from manipulating the experiment or changing their hypothesis midstream.

    The group has made painstaking efforts to redo experiments with the same methods and materials, reaching out to original laboratories for advice, data, and resources. The labs that originally wrote the studies have had to assemble information from years-old research. Studies have been delayed because of legal agreements for transferring materials from one lab to another. Faced with financial and time constraints, the team has scaled back its project; so far 29 studies have been registered, and Errington says the plan is to do as much as they can over the next year and issue a final paper.

    “This is no easy task, and what they’ve done is just wonderful,” says Begley, who is now chief scientific officer at Akriveia Therapeutics and was originally on the advisory board for the project but resigned because of time constraints. His overall impression of the studies is that they largely flunked replication, even though some data from individual experiments matched. He says that for a study to be valuable, the major conclusion should be reproduced, not just one or two components of the study. This would demonstrate that the findings are a good foundation for future work. “It’s adding evidence that there’s a challenge in the scientific community we have to address,” he says.

    Begley has argued that early-stage cancer research in academic labs should follow methods that clinical trials use, like randomizing subjects and blinding investigators as to which ones are getting a treatment or not, using large numbers of test subjects, and testing positive and negative controls. He says that when he read the original papers under consideration for replication, he assumed they would fail because they didn’t follow these methods, even though they are top papers in the field. “This is a systemic problem; it’s not one or two labs that are behaving badly,” he says.

    Details Matter

    For the researchers whose work is being scrutinized, the details of each study matter. Although the project leaders insist they are not designing the project to judge individual findings—that would require devoting more resources to each study—cancer researchers have expressed concern that the project might unfairly cast doubt on their discoveries. The responses of some of those scientists so far raise issues about how replication studies should be carried out and analyzed.

    One study, for instance, replicated a 2010 paper led by Erkki Ruoslahti, a cancer researcher at Sanford Burnham Prebys Medical Discovery Institute in San Diego, which identified a peptide that could stick to and penetrate tumors. Ruoslahti points to a list of subsequent studies by his lab and others that support the finding and suggest that the peptide could help deliver cancer drugs to tumors. But the replication study found that the peptide did not make tumors more permeable to drugs in mice. Ruoslahti says there could be a technical reason for the problem, but the replication team didn’t try to troubleshoot it. He’s now working to finish preclinical studies and secure funding to move the treatment into human trials through a company called Drugcendr. He worries that replication studies that fail without fully exploring why could derail efforts to develop treatments. “This has real implications to what will happen to patients,” he says.

    Atul Butte, a computational biologist at the University of California San Francisco, who led one of the original studies that was reproduced, praises the diligence of the team. “I think what they did is unbelievably disciplined,” he says. But like some other scientists, he’s puzzled by the way the team analyzed results, which can make a finding that subjectively seems correct appear as if it failed. His original study used a data-crunching model to sort through open-access genetic information and identify potential new uses for existing drugs. Their model predicted that the antiulcer medication cimetidine would have an effect against lung cancer, and his team validated the model by testing the drug against lung cancer tumors in mice. The replication found very similar effects. “It’s unbelievable how well it reproduces our study,” Butte says. But the replication team used a statistical technique to analyze the results that found them not statistically significant. Butte says it’s odd that the project went to such trouble to reproduce experiments exactly, only to alter the way the results are interpreted.

    Errington and Iorns acknowledge that such a statistical analysis is not common in biological research, but they say it’s part of the group’s effort to be rigorous. “The way we analyzed the result is correct statistically, and that may be different from what the standards are in the field, but they’re what people should aspire to,” Iorns says.
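
    The article does not spell out which statistical test the replication team applied. One common way to formalize the question "did the replication find the same effect?" is to compare the original and replication effect estimates directly, as in the generic sketch below; both the test and the numbers are illustrative assumptions, not the project's published analysis.

        import math

        def z_difference(effect_orig, se_orig, effect_rep, se_rep):
            """z statistic for the difference between two independent effect estimates."""
            return (effect_orig - effect_rep) / math.sqrt(se_orig**2 + se_rep**2)

        # Hypothetical effect sizes (for example, log tumor-volume ratios) and standard errors.
        z = z_difference(effect_orig=-0.80, se_orig=0.25, effect_rep=-0.55, se_rep=0.30)
        p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p-value, normal approximation
        print(f"z = {z:.2f}, p = {p:.2f}")  # a large p means the two estimates are statistically compatible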

    In some cases, results were complicated by inconsistent experimental systems. One study tested a type of experimental drug called a BET inhibitor against multiple myeloma in mice. The replication found that the drug improved the survival of diseased mice compared to controls, consistent with the original study. But the disease developed differently in the replication study, and statistical analysis of the tumor growth did not yield a significant finding. Constantine Mitsiades, the study’s lead author and a cancer researcher at the Dana-Farber Cancer Institute, says that despite the statistical analysis, the replication study’s data “are highly supportive of and consistent with our original study and with subsequent studies that also confirmed it.”

    A Fundamental Debate

    These papers will undoubtedly provoke debate about what the standards of replication should be. Mitsiades and other scientists say that complex biological systems like tumors are inherently variable, so it’s not surprising if replication studies don’t exactly match their originals. Inflexible study protocols and rigid statistics may not be appropriate for evaluating such systems—or needed.

    Some scientists doubt the need to perform copycat studies at all. “I think science is self-correcting,” Ruoslahti says. “Yes, there’s some loss of time and money, but that’s just part of the process.” He says that, on the positive side, this project might encourage scientists to be more careful, but he also worries that it might discourage them from publishing new discoveries.

    Though the researchers who led these studies are, not surprisingly, focused on the correctness of the findings, Errington says that the variability of experimental models and protocols is important to document. Advocates for replication say that current published research reflects an edited version of what happened in the lab. That’s why the Reproducibility Project has made a point to publish all of its raw data and include experiments that seemed to go awry, when most researchers would troubleshoot them and try again.

    “The reason to repeat experiments is to get a handle on the intrinsic variability that happens from experiment to experiment,” Begley says. With a better understanding of biology’s true messiness, replication advocates say, scientists might have a clearer sense of whether or not to put credence in a single study. And if more scientists published the full data from every experiment, those original results may look less flashy to begin with, leading fewer labs to chase over-hyped hypotheses and therapies that never pan out. An ultimate goal of the project is to identify factors that make it easier to produce replicable research, like publishing detailed protocols and validating that materials used in a study, such as antibodies, are working properly.


    Access mp4 video here .

    Beyond this project, the scientific community is already taking steps to address reproducibility. Many scientific journals are making stricter requirements for studies and publishing registered reports of studies before they’re carried out. The National Institutes of Health has launched training and funding initiatives to promote robust and reproducible research. F1000Research, an open-access, online publisher, launched a Preclinical Reproducibility and Robustness Channel in 2016 for researchers to publish results from replication studies. Last week several scientists published a reproducibility manifesto in the journal Nature Human Behaviour that lays out a broad series of steps to improve the reliability of research findings, from the way studies are planned to the way scientists are trained and promoted.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 3:08 pm on March 24, 2017
    Tags: Applied Research, Nov. 2016 Kaikoura earthquake

    From JPL-Caltech: “Study of Complex 2016 Quake May Alter Hazard Models” 

    JPL-Caltech

    March 23, 2017
    Alan Buis
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-0474
    alan.buis@jpl.nasa.gov

    Ian Hamling
    GNS Science, Avalon, New Zealand
    011-04-570-4568

    Two ALOS-2 satellite images show ground displacements from the Nov. 2016 Kaikoura earthquake as colors proportional to the surface motion in two directions. The purple areas in the left image moved up and east 13 feet (4 meters); purple areas in the right image moved north up to 30 feet (9 meters). Credit: NASA/JPL-Caltech/JAXA

    Last November’s magnitude 7.8 Kaikoura earthquake in New Zealand was so complex and unusual, it is likely to change how scientists think about earthquake hazards in plate boundary zones around the world, finds a new international study.

    The study, led by GNS Science, Avalon, New Zealand, with NASA participation, is published this week in the journal Science. The team found that the Nov. 14, 2016, earthquake was the most complex earthquake in modern history. The quake ruptured at least 12 major crustal faults, and there was also evidence of slip along the southern end of the Hikurangi subduction zone plate boundary, which lies about 12 miles (20 kilometers) below the North Canterbury and Marlborough coastlines.

    Lead author and geodesy specialist Ian Hamling of GNS Science says the quake has underlined the importance of re-evaluating how rupture scenarios are defined for seismic hazard models in plate boundary zones worldwide.

    “This complex earthquake defies many conventional assumptions about the degree to which earthquake ruptures are controlled by individual faults, and provides additional motivation to re-think these issues in seismic hazard models,” Hamling says.

    The research team included 29 co-authors from 11 national and international institutes. To conduct the study, they combined multiple datasets, including satellite radar interferometry and GPS data that measure the amount of ground movement associated with the earthquake, along with field observations and coastal uplift data. The team found that parts of New Zealand’s South Island moved more than 16 feet (5 meters) closer to New Zealand’s North Island and were uplifted by as much as 26 feet (8 meters).

    The Kaikoura earthquake rupture began in North Canterbury and propagated northward for more than 106 miles (170 kilometers) along both well-known and previously unknown faults. It straddled two distinct active fault domains, rupturing faults in both the North Canterbury Fault zone and the Marlborough Fault system.

    The largest movement during the earthquake occurred on the Kekerengu fault, where pieces of Earth’s crust were displaced relative to each other by up to 82 feet (25 meters), at a depth of about 9 miles (15 kilometers). Maximum rupture at the surface was measured at 39 feet (12 meters) of horizontal displacement.

    Hamling says there is growing evidence internationally that conventional seismic hazard models are too simple and restrictive. “Even in the New Zealand modeling context, the Kaikoura event would not have been included because so many faults linked up unexpectedly,” he said. “The message from Kaikoura is that earthquake science should be more open to a wider range of possibilities when rupture propagation models are being developed.”

    The scientists analyzed interferometric synthetic aperture radar (InSAR) data from the Copernicus Sentinel-1A and -1B satellites, which are operated by the European Space Agency, along with InSAR data from the Japan Aerospace Exploration Agency’s ALOS-2 satellite. They compared pre- and post-earthquake images of Earth’s surface to measure land movement across large areas and infer movement on faults at depth. The Sentinel and ALOS-2 satellites orbit Earth in near-polar orbits at altitudes of 373 and 434 miles (600 and 700 kilometers), respectively, and image the same point on Earth at repeat intervals ranging from six to 30 days. The Sentinel and ALOS-2 satellites use different wavelengths, which means they pick up different aspects of surface deformation, adding to the precision and completeness of the investigation.
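
    At its core, InSAR converts the phase difference between two radar passes into ground motion along the satellite's line of sight. The snippet below shows only that textbook relation, as a simplified illustration rather than the processing chain used by JPL and GNS Science; the phase value is hypothetical, and sign conventions differ between processors.

        import math

        def los_displacement_m(unwrapped_phase_rad, wavelength_m):
            """Line-of-sight displacement from unwrapped interferometric phase (two-way radar path)."""
            return unwrapped_phase_rad * wavelength_m / (4.0 * math.pi)

        # ALOS-2 is an L-band radar (wavelength roughly 0.24 m); Sentinel-1 is C-band (roughly 0.055 m).
        # A hypothetical unwrapped phase of 100 radians in an L-band interferogram:
        print(f"{los_displacement_m(100.0, 0.24):.2f} m of line-of-sight motion")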

    In the spirit of international cooperation, both space agencies had re-prioritized their satellites immediately after the quake to collect more images of New Zealand to help with research and support the emergency response activities.

    Before the earthquake, coauthors Cunren Liang and Eric Fielding of NASA’s Jet Propulsion Laboratory, Pasadena, California, developed new InSAR data processing techniques to measure the ground deformation in the satellite flight direction using wide-swath images acquired by the ALOS-2 satellite. This is the first time this new approach has been successfully used in earthquake research.

    “We were surprised by the amazing complexity of the faults that ruptured in the Kaikoura earthquake when we processed the satellite radar images,” said Fielding. “Understanding how all these faults moved in one event will improve seismic hazard models.”

    The authors say the Kaikoura earthquake was one of the most recorded large earthquakes anywhere in the world, enabling scientists to undertake analysis in an unprecedented level of detail. This paper is the first in a series of studies to be published on the rich array of data collected from this earthquake.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 1:05 pm on March 24, 2017 Permalink | Reply
    Tags: "Tree on a chip", Applied Research, may be used to make small robots move., Microfluidic device generates passive hydraulic power, ,   

    From MIT: “Engineers design ‘tree-on-a-chip’” 

    MIT News

    MIT Widget

    MIT News

    March 20, 2017
    Jennifer Chu

    1
    Engineers have designed a microfluidic device they call a “tree-on-a-chip,” which mimics the pumping mechanism of trees and other plants.

    2
    Like its natural counterparts, the chip operates passively, requiring no moving parts or external pumps. It is able to pump water and sugars through the chip at a steady flow rate for several days.
    Courtesy of the researchers

    Microfluidic device generates passive hydraulic power, may be used to make small robots move.

    Trees and other plants, from towering redwoods to diminutive daisies, are nature’s hydraulic pumps. They are constantly pulling water up from their roots to the topmost leaves, and pumping sugars produced by their leaves back down to the roots. This constant stream of nutrients is shuttled through a system of tissues called xylem and phloem, which are packed together in woody, parallel conduits.

    Now engineers at MIT and their collaborators have designed a microfluidic device they call a “tree-on-a-chip,” which mimics the pumping mechanism of trees and plants. Like its natural counterparts, the chip operates passively, requiring no moving parts or external pumps. It is able to pump water and sugars through the chip at a steady flow rate for several days. The results are published this week in Nature Plants.

    Anette “Peko” Hosoi, professor and associate department head for operations in MIT’s Department of Mechanical Engineering, says the chip’s passive pumping may be leveraged as a simple hydraulic actuator for small robots. Engineers have found it difficult and expensive to make tiny, movable parts and pumps to power complex movements in small robots. The team’s new pumping mechanism may enable robots whose motions are propelled by inexpensive, sugar-powered pumps.

    “The goal of this work is cheap complexity, like one sees in nature,” Hosoi says. “It’s easy to add another leaf or xylem channel in a tree. In small robotics, everything is hard, from manufacturing, to integration, to actuation. If we could make the building blocks that enable cheap complexity, that would be super exciting. I think these [microfluidic pumps] are a step in that direction.”

    Hosoi’s co-authors on the paper are lead author Jean Comtet, a former graduate student in MIT’s Department of Mechanical Engineering; Kaare Jensen of the Technical University of Denmark; and Robert Turgeon and Abraham Stroock, both of Cornell University.

    A hydraulic lift

    The group’s tree-inspired work grew out of a project on hydraulic robots powered by pumping fluids. Hosoi was interested in designing small-scale hydraulic robots that could perform actions similar to much bigger robots like Boston Dynamics’ Big Dog, a four-legged, Saint Bernard-sized robot that runs and jumps over rough terrain, powered by hydraulic actuators.

    “For small systems, it’s often expensive to manufacture tiny moving pieces,” Hosoi says. “So we thought, ‘What if we could make a small-scale hydraulic system that could generate large pressures, with no moving parts?’ And then we asked, ‘Does anything do this in nature?’ It turns out that trees do.”

    The general understanding among biologists has been that water, propelled by surface tension, travels up a tree’s channels of xylem, then diffuses through a semipermeable membrane and down into channels of phloem that contain sugar and other nutrients.

    The more sugar there is in the phloem, the more water flows from xylem to phloem to balance out the sugar-to-water gradient, in a passive process known as osmosis. The resulting water flow flushes nutrients down to the roots. Trees and plants are thought to maintain this pumping process as more water is drawn up from their roots.
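    To put a rough number on the driving force, the van ’t Hoff relation gives the osmotic pressure produced by a given sugar concentration. The sketch below is a back-of-envelope estimate; the 0.5 M sucrose concentration is an illustrative assumption, not a value from the paper.

```python
# Back-of-envelope estimate of osmotic pressure via the van 't Hoff relation,
# Pi = c * R * T. The sucrose concentration is an assumed, illustrative value.
R = 8.314     # gas constant, J/(mol*K)
T = 293.0     # temperature, K (about 20 C)
c = 500.0     # assumed phloem sucrose concentration, mol/m^3 (0.5 M)

osmotic_pressure_pa = c * R * T
print(f"Osmotic pressure ~ {osmotic_pressure_pa / 1e6:.1f} MPa")   # roughly 1.2 MPa
```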

    “This simple model of xylem and phloem has been well-known for decades,” Hosoi says. “From a qualitative point of view, this makes sense. But when you actually run the numbers, you realize this simple model does not allow for steady flow.”

    In fact, engineers have previously attempted to design tree-inspired microfluidic pumps, fabricating parts that mimic xylem and phloem. But those designs stopped pumping within minutes.

    It was Hosoi’s student Comtet who identified a third essential part to a tree’s pumping system: its leaves, which produce sugars through photosynthesis. Comtet’s model includes this additional source of sugars that diffuse from the leaves into a plant’s phloem, increasing the sugar-to-water gradient, which in turn maintains a constant osmotic pressure, circulating water and nutrients continuously throughout a tree.
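    A toy calculation makes the point. In the sketch below (a deliberately simplified model, not the one published in Nature Plants), osmotic inflow is proportional to the phloem sugar concentration and the resulting flow carries sugar away; without a leaf-like sugar source the flow decays, while a constant source settles into steady pumping. All rate constants are arbitrary.

```python
# Toy model, not the published one: a single well-mixed phloem compartment.
# Water enters osmotically at a rate proportional to sugar concentration c,
# and the same volumetric flow flushes sugar out of the compartment.
def simulate(sugar_source, c0=1.0, volume=1.0, k=1.0, dt=0.01, steps=5000):
    c = c0
    flow = 0.0
    for _ in range(steps):
        flow = k * c                              # osmotic inflow ~ sugar concentration
        dc = (sugar_source - flow * c) / volume   # leaf source minus sugar washed out
        c = max(c + dc * dt, 0.0)
    return flow

print("flow without a sugar source:", simulate(sugar_source=0.0))  # decays toward zero
print("flow with a constant source:", simulate(sugar_source=0.5))  # settles near 0.71
```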

    Running on sugar

    With Comtet’s hypothesis in mind, Hosoi and her team designed their tree-on-a-chip, a microfluidic pump that mimics a tree’s xylem, phloem, and most importantly, its sugar-producing leaves.

    To make the chip, the researchers sandwiched together two plastic slides, through which they drilled small channels to represent xylem and phloem. They filled the xylem channel with water, and the phloem channel with water and sugar, then separated the two slides with a semipermeable material to mimic the membrane between xylem and phloem. They placed another membrane over the slide containing the phloem channel, and set a sugar cube on top to represent the additional source of sugar diffusing from a tree’s leaves into the phloem. They hooked the chip up to a tube, which fed water from a tank into the chip.

    With this simple setup, the chip was able to passively pump water from the tank through the chip and out into a beaker, at a constant flow rate for several days, as opposed to previous designs that only pumped for several minutes.

    “As soon as we put this sugar source in, we had it running for days at a steady state,” Hosoi says. “That’s exactly what we need. We want a device we can actually put in a robot.”

    Hosoi envisions that the tree-on-a-chip pump may be built into a small robot to produce hydraulically powered motions, without requiring active pumps or parts.

    “If you design your robot in a smart way, you could absolutely stick a sugar cube on it and let it go,” Hosoi says.

    This research was supported, in part, by the Defense Advanced Research Projects Agency.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 12:50 pm on March 24, 2017 Permalink | Reply
    Tags: Applied Research, , , Producing Radioisotopes for Medical Imaging and Disease Treatment   

    From BNL: “Producing Radioisotopes for Medical Imaging and Disease Treatment” 

    Brookhaven Lab

    March 21, 2017
    Karen McNulty Walsh

    Brookhaven’s high-energy proton accelerator and a group led by Cathy Cutler team up to meet the nation’s demand for medical isotopes.

    1
    Cathy Cutler, Lisa Muench, Tatjana Klaric, Weimin Zhou, Vicky Litton, and Anna Goldberg in the hot cell area where BLIP targets are processed to extract desired isotope products.

    The before and after images are stunning: A prostate cancer patient riddled with metastatic tumors that disappear after just three, potent treatments.

    “Two patients underwent these treatments and they were cured,” said Cathy Cutler, director of the Medical Isotope Research and Production Program at the U.S. Department of Energy’s Brookhaven National Laboratory. “Their cancer was gone.

    “This is what we want to do—supply this material so that more patients can get this treatment,” she said.

    3
    Medical applications of isotopes produced at BLIP Top: BLIP produces Strontium-82, a relatively stable isotope that can be transported and used in hospitals to generate Rubidium-82, a radiotracer that reveals reduced blood flow in heart muscle under stress. This precision scanning points physicians to coronary arteries that need treatment. Credit: Washington University School of Medicine. Bottom: Before and after images show how a molecule labeled with Actinium-225 delivers cell-killing alpha particles directly to tumors, eradicating metastatic prostate cancer. The BLIP team aims to increase the production of Ac-225 so scientists can conduct large-scale trials and get this potentially lifesaving treatment to more patients. Credit: ©SNMMI: C. Kratochwil. J. Nucl. Med., 2016; 57 (12); 1941.

    The material is a molecule tagged with Actinium-225, a radioactive isotope. When designed to specifically bind with a protein on the surface of cancer cells, the radiolabeled molecule delivers a lethal, localized punch—alpha particles that kill the cancer with minimal damage to surrounding tissues.

    Actinium-225 can be produced in the large quantities needed to support clinical applications only at facilities that have high-energy particle accelerators.

    “This is why I came to Brookhaven,” Cutler said in a recent talk she gave to highlight her group’s work.* “We can make these alpha emitters and this is really giving doctors a chance to treat these patients!”

    Radiochemistry redux

    Brookhaven Lab and the Department of Energy Isotope Program have a long history of developing radioisotopes for uses in medicine and other applications. These radioactive forms of chemical elements can be used alone or attached to a variety of molecules to track and target disease.

    “If it wasn’t for the U.S. Department of Energy and its isotope development program, I’m not sure we’d have nuclear medicine,” Cutler said.

    Among the notable Brookhaven Lab successes are the development in the 1950s and 60s, respectively, of the Technetium-99m generator and a radioactively labeled form of glucose known as 18FDG—two radiotracers that went on to revolutionize medical imaging.

    As an example, 18FDG emits positrons (positively charged cousins of electrons) that can be picked up by a positron emission tomography (PET) scanner. Because rapidly growing cancer cells take up glucose faster than healthy tissue, doctors can use PET and 18FDG to detect and monitor the disease.

    “FDG turned around oncology,” Cutler said. Instead of taking a drug for months and suffering toxic side effects before knowing if a treatment is working, “patients can be scanned to look at the impact of treatment on tumors within 24 hours, and again over time, to see if the drug is effective—and also if it stops working.”
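    As a rough illustration of how such scans are quantified, the standardized uptake value (SUV) compares how much tracer a region of tissue takes up relative to the injected dose; this is a general PET metric rather than something described in the article, and all numbers below are assumed, typical values.

```python
# Hedged sketch of a standardized uptake value (SUV) calculation, a common way
# to quantify FDG-PET scans. All input values are assumed, illustrative numbers.
def suv(tissue_activity_kbq_per_ml, injected_activity_mbq, body_weight_kg):
    # Approximate tissue density by 1 g/mL, so kBq/mL is treated as kBq/g.
    injected_per_gram_kbq = injected_activity_mbq * 1000.0 / (body_weight_kg * 1000.0)
    return tissue_activity_kbq_per_ml / injected_per_gram_kbq

print(f"healthy tissue SUV ~ {suv(5.0, 370.0, 70.0):.1f}")    # about 0.9
print(f"FDG-avid tumor SUV ~ {suv(40.0, 370.0, 70.0):.1f}")   # about 7.6
```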

    Symbiotic operations

    While Tc-99m and 18FDG are now widely available in hospital settings and used in millions of scans a year, other isotopes are harder to make. They require the kind of high-energy particle accelerator you can find only at world-class physics labs.

    “Brookhaven is one of just a few facilities in the DOE Isotope Program that can produce certain critical medical isotopes,” Cutler said.

    Brookhaven’s linear accelerator (“linac”) was designed to feed beams of energetic protons into physics experiments at the Relativistic Heavy Ion Collider (RHIC), where physicists are exploring the properties of the fundamental building blocks of matter and the forces through which they interact.

    6
    Brookhaven’s linear accelerator (“linac”)

    7
    The Solenoidal Tracker at the Relativistic Heavy Ion Collider (RHIC) is a detector which specializes in tracking the thousands of particles produced by each ion collision at RHIC. Weighing 1,200 tons and as large as a house, STAR is a massive detector. It is used to search for signatures of the form of matter that RHIC was designed to create: the quark-gluon plasma. It is also used to investigate the behavior of matter at high energy densities by making measurements over a large area. | Photo courtesy of Brookhaven National Lab.

    But because the linac produces the protons in pulses, Cutler explained, it can deliver them pulse-by-pulse to different facilities. Operators in Brookhaven’s Collider-Accelerator Department deliver alternating pulses to RHIC and the Brookhaven Linac Isotope Producer (BLIP).

    “We operate these two programs symbiotically at the same time,” Cutler said. “We combine our resources to support the running of the linear accelerator; it’s cheaper for both programs to share this resource than it would cost if each of us had to use it alone.”


    Access mp4 video here .

    Tuning and targets

    BLIP operators aim the precisely controlled beams of energetic protons at small puck-shaped targets. The protons knock subatomic particles from the targets’ atoms, transforming them into the desired radioactive elements.

    “We stack different targets sequentially to make use of the beam’s reduced energy as it exits one target and enters the next in line, so we can produce multiple radionuclides at once,” Cutler said.
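    As a rough illustration of the stacking idea, each puck sees the beam at whatever energy is left over from the pucks upstream. The incident energy and per-target losses below are illustrative assumptions, not BLIP’s actual configuration.

```python
# Hedged sketch of target stacking: the beam enters the stack at full energy
# and each downstream puck uses the residual energy left by the one before it.
# The incident energy and per-target losses are illustrative assumptions only.
beam_energy_mev = 200.0
stack = [("front target", 60.0), ("middle target", 50.0), ("back target", 40.0)]

energy = beam_energy_mev
for name, loss_mev in stack:
    print(f"{name}: enters at {energy:.0f} MeV, exits at {energy - loss_mev:.0f} MeV")
    energy -= loss_mev
```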

    Transformed targets undergo further chemical processing to yield a pure product that can be injected into patients, or a precursor chemical that can easily be transformed into the desired isotope or tracer on site at a hospital.

    “A lot of our work goes into producing these targets,” Cutler said. “You would be shocked at all the chemistry, engineering, and physics that goes into designing one of these pucks—to make sure it survives the energy and high current of the beam, gives you the isotope you are interested in with minimal impurities, and allows you to do the chemistry to extract that isotope efficiently.”

    Cutler recently oversaw installation of a new “beam raster” system designed to maximize the use of target materials and increase radioisotope production. With this upgrade, a series of magnets steers BLIP’s energetic particle beam to “paint” the target, rather than depositing all the energy in one spot. This cuts down on the buildup of target-damaging heat, allowing operators to increase beam current and transform more target material into the desired product.
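    A small sketch shows the effect of “painting” rather than parking the beam. The Lissajous-style sweep and its dimensions are assumptions chosen only to illustrate how rastering spreads the heat load over many spots instead of one; the real raster waveform is not described in the article.

```python
import numpy as np

# Illustrative only: compare how many 1 mm cells of a target face a fixed beam
# spot heats versus a spot swept in an assumed Lissajous-like raster pattern.
t = np.linspace(0.0, 1.0, 2000)    # one sweep period, arbitrary units
radius_mm = 10.0                   # assumed sweep amplitude on the target face
x = radius_mm * np.cos(2 * np.pi * 5 * t)
y = radius_mm * np.sin(2 * np.pi * 4 * t)

rastered_cells = set(zip(np.round(x).astype(int), np.round(y).astype(int)))
print(f"cells heated while rastering: {len(rastered_cells)}; fixed beam: 1")
```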

    Meeting increasing demand

    The new raster system and ramped up current helped increase production of one of BLIP’s main products, Strontium-82, by more than 50 percent in 2016. Sr-82 has a relatively long half-life, allowing it to be transported to hospitals in a form that can generate a short-lived radiotracer, Rubidium-82, which has greatly improved the precision of cardiac imaging.
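    The numbers behind “relatively long half-life” are easy to check with a simple decay calculation. The half-lives used below (about 25 days for Sr-82 and about 75 seconds for Rb-82) are standard literature values rather than figures quoted in the article.

```python
# Simple exponential-decay sketch showing why Sr-82 can be shipped to hospitals
# while Rb-82 must be generated and used on the spot. Half-lives are literature values.
def remaining_fraction(elapsed, half_life):
    return 0.5 ** (elapsed / half_life)

print(f"Sr-82 remaining after a 7-day shipment: {remaining_fraction(7, 25.3):.0%}")
print(f"Rb-82 remaining 5 minutes after elution: {remaining_fraction(300, 75):.1%}")
```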

    4
    Weimin Zhou, Anna Goldberg, and Lisa Muench in the isotope-processing area.

    “Rb-82 mimics potassium, which is taken up by muscles, including the heart,” Cutler explained. “You can inject Rubidium into a patient in a PET scanner and measure the uptake of Rb-82 in heart muscle to precisely pinpoint areas of decreased blood flow when the heart is under stress. Then surgeons can go in and unblock that coronary artery to increase blood flow before the patient has a heart attack. Hundreds of thousands of patients receive this life-saving test because of what we’re doing here at Brookhaven.”

    BLIP also produces several isotopes with improved capabilities for detecting cancer, including metastatic tumors, and monitoring response to treatment.

    But rising to meet the demand for isotopes that have the potential to cure cancer may be BLIP’s highest calling—and has been a key driver of Cutler’s career.

    5
    Jason Nalepa, a BLIP operator, prepares targets to be installed in the BLIP beamline for irradiation

    “This is where I started as a chemist at the University of Missouri—designing molecules that have the right charges, the right size, and the right characteristics that determine where they go in the body so we can use them for imaging and therapy,” she said. “If we can target receptors that are overexpressed on tumor cells, we can selectively image these cells. And if there are enough of these receptors expressed, we can deliver radionuclides to those tumor cells very selectively and destroy them.”

    Radionuclides that emit alpha particles are among the most promising isotopes because alpha particles deliver a lot of energy and traverse very small distances. Targeted delivery of alphas would deposit very high doses—“like driving an 80-ton semi truck into a tumor”—while minimizing damage to surrounding healthy cells, Cutler said.

    “Our problem isn’t that we can’t cure cancer; we can ablate the cancer. Our problem is saving the patient. The toxicity of the treatments in many cases is so significant that we can’t get the levels in to kill the cancer without actually harming the patient. With alpha particles, because of the short distance and high impact, they are enabling us to treat these patients with minimal side effects and giving doctors the opportunity to really cure cancer.”

    Making the case for a cure

    One experimental treatment Cutler developed with Lutetium-177 while still at the University of Missouri showed favorable results against neuroendocrine tumors, but did not achieve cures. Actinium-225, one of the isotopes that is trickier to make, has shown more promise, as demonstrated by the prostate cancer results published in 2016 by researchers at University Hospital Heidelberg.

    Right now, according to Cutler, DOE’s Oak Ridge National Laboratory (ORNL) makes enough Ac-225 to treat about 50 patients each year. But almost 30 times that much is needed to conduct the clinical trials required to prove that such a strategy works before it can move from the laboratory to medical practice.

    “With the accelerator we have here at Brookhaven, the expertise in radiochemistry, and experience producing isotopes for medical applications, we—together with partners at ORNL and DOE’s Los Alamos National Laboratory—are looking to meet this unmet need to get this material out to patients,” Cutler said.

    The work at BLIP is funded by the DOE Isotope Program, managed by the Office of Science’s Nuclear Physics program. RHIC is a DOE Office of Science User Facility.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.
    i1

     
  • richardmitnick 11:05 am on March 24, 2017 Permalink | Reply
    Tags: , Applied Research, , , nicotinamide adenine dinucleotide (NAD+)   

    From COSMOS: “Can ageing be held at bay by injections and pills?” 

    Cosmos Magazine bloc

    COSMOS

    24 March 2017
    Elizabeth Finkel

    1
    Two fast ageing mice. The one on the left was treated with a FOXO4 peptide, which targets senescent cells and leads to hair regrowth in 10 days.
    Peter L.J. de Keizer

    The day we pop a pill or get a jab to stave off ageing is closer, thanks to two high-profile papers published today.

    A Science paper from a team led by David Sinclair of Harvard Medical School and the University of NSW shows how popping a pill that raises levels of a natural molecule called nicotinamide adenine dinucleotide (NAD+) staves off the DNA damage that leads to ageing.

    The other paper, published in Cell, led by Peter de Keizer’s group at Erasmus University in the Netherlands, shows how a short course of injections to kill off defunct “senescent cells” reversed kidney damage, hair loss and muscle weakness in aged mice.

    Taken together, the two reports give a glimpse of how future medications might work together to forestall ageing when we are young, and delete damaged cells as we grow old. “This is what we in the field are planning”, says Sinclair.

    Sinclair has been searching for factors that might slow the clock of ageing for decades. His group stumbled upon the remarkable effects of NAD+ in the course of studying powerful anti-ageing molecules known as sirtuins, a family of seven proteins that mastermind a suite of anti-ageing mechanisms, including protecting DNA and proteins.

    Resveratrol, a compound found in red wine, stimulates their activity. But back in 2000, Sinclair’s then boss Lenny Guarente at MIT discovered a far more powerful activator of sirtuins – NAD+. It was a big surprise.

    “It would have to be the most boring molecule in the world”, notes Sinclair.

    It was regarded as so common and boring that no-one thought it could play a role in something as profound as tweaking the ageing clock. But Sinclair found that NAD+ levels decline with age.

    “By the time you’re 50, the levels are halved,” he notes.

    And in 2013, his group showed [Cell] that raising NAD+ levels in old mice restored the performance of their cellular power plants, mitochondria.

    One of the key findings of the Science paper is the identification of the mechanism by which NAD+ improves the ability to repair DNA. It acts like a basketball defence, staying on the back of a troublesome protein called DBC1 to keep it away from the key player PARP1 – a protein that repairs DNA.

    When NAD+ levels fall, DBC1 tackles PARP1. End result: DNA damage goes unrepaired and the cell ‘ages’.

    “We’ve discovered the reason why DNA repair declines as we get older. After 100 years that’s exciting,” says Sinclair.

    His group helped develop a compound, nicotinamide mononucleotide (NMN), that raises NAD+ levels. As reported in the Science paper, when injected into aged mice it restored the ability of their liver cells to repair DNA damage. It also boosted the ability of young mice that had been exposed to DNA-damaging radiation to repair that damage. The effects were seen within a week of the injection.

    These kinds of results have impressed NASA. The organisation is looking for methods to protect its astronauts from radiation damage during their one-year trip to Mars. Last December it hosted a competition for the best method of preventing that damage. Out of 300 entries, Sinclair’s group won.

    As well as astronauts, children who have undergone radiation therapy for cancer might also benefit from this treatment. According to Sinclair, clinical trials for NMN should begin in six months. While many claims have been made for NAD+ to date, and compounds are being sold to raise its levels, this will be the first clinical trial, says Sinclair.

    By boosting rates of DNA repair, Sinclair’s drug holds the hope of slowing down the ageing process itself. The work from de Keizer’s lab, however, offers the hope of reversing age-related damage.

    His approach stems from exploring the role of senescent cells. Until 2001, these cells were not really on the radar of researchers who study ageing. They were considered part of a protective mechanism that mothballs damaged cells, preventing them from ever multiplying into cancer cells.

    The classic example of senescent cells is a mole. These pigmented skin cells have incurred DNA damage, usually triggering dangerous cancer-causing genes. To keep them out of action, the cells are shut down.

    If humans lived only the 50-year lifespan they were designed for, there’d be no problem. But because we exceed our use-by date, senescent cells end up doing harm.

    As Judith Campisi at the Buck Institute, California, showed in 2001, they secrete inflammatory factors that appear to age the tissues around them.

    But cells have another option. They can self-destruct in a process dubbed apoptosis. It’s quick and clean, and there are no nasty compounds to deal with.

    So what relegates some cells to one fate over another? That’s the question Peter de Keizer set out to solve when he did a post-doc in Campisi’s lab back in 2009.

    Finding the answer didn’t take all that long. A crucial protein called p53 was known to give the order for the coup de grâce. But sometimes it showed clemency, consigning the cell to senescence instead.

    De Keizer used sensitive new techniques to identify that in senescent cells, it was a protein called FOXO4 that tackled p53, preventing it from giving the execution order.

    The solution was to interfere with this liaison. But it’s not easy to wedge proteins apart; not something that small diffusible molecules – the kind that make great drugs – can do.

    De Keizer, who admits to “being stubborn”, was undaunted. He began developing a protein fragment that might act as a wedge. It resembled part of the normal FOXO4 protein, but instead of being built from normal L-amino acids, it was built from D-amino acids. It proved to be a very powerful wedge.

    Meanwhile other researchers were beginning to show that executing senescent cells was indeed a powerful anti-ageing strategy. For instance, a group from the Mayo Clinic last year showed that mice genetically engineered to destroy 50-70% of their senescent cells in response to a drug experienced a greater “health span”.

    Compared to their peers they were more lively and showed less damage to their kidney and heart muscle. Their average lifespan was also boosted by 20%.

    But humans are not likely to undergo mass genetic engineering. To achieve similar benefits requires a drug that works on its own. Now de Keizer’s peptide looks like it could be the answer.

    As the paper in Cell shows, in aged mice, three injections of the peptide per week had dramatic effects. After three weeks, the aged balding mice regrew hair and showed improvements to kidney function. And while untreated aged mice could be left to flop onto the lab bench while the technician went for coffee, treated mice would scurry away.

    “It’s remarkable. It’s the best result I’ve seen in age reversal,” says Sinclair of his erstwhile competitor’s paper.

    Dollops of scepticism are healthy when it comes to claims of a fountain of youth – even de Keizer admits his work “sounds too good to be true”. Nevertheless some wary experts are impressed.

    “It raises my optimism that in our lifetime we will see treatments that can ameliorate multiple age-related diseases”, says Campisi.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     