Tagged: Biochemistry

  • richardmitnick 4:42 pm on October 21, 2014 Permalink | Reply
Tags: Biochemistry

    From astrobio.net: “Scientists create possible precursor to life” 

Astrobiology Magazine

    Oct 21, 2014
    University of Southern Denmark
Contact: Steen Rasmussen, Professor and Head of the FLINT Center. Email: steen@sdu.dk. Mobile: +45 60112507

    How did life originate? And can scientists create life? These questions not only occupy the minds of scientists interested in the origin of life, but also researchers working with technology of the future. If we can create artificial living systems, we may not only understand the origin of life – we can also revolutionize the future of technology.

Model of a protocell. Image by Janet Iwasa

Protocells are the simplest, most primitive living systems you can think of. The oldest ancestor of life on Earth was a protocell, and when we see what it eventually managed to evolve into, we understand why science is so fascinated with protocells. If science can create an artificial protocell, we get a very basic ingredient for creating more advanced artificial life.

    However, creating an artificial protocell is far from simple, and so far no one has managed to do that. One of the challenges is to create the information strings that can be inherited by cell offspring, including protocells. Such information strings are like modern DNA or RNA strings, and they are needed to control cell metabolism and provide the cell with instructions about how to divide.

    Essential for life

If one daughter cell after a division carries slightly altered information (perhaps giving it a slightly faster metabolism), it may be more fit to survive. It may therefore be selected, and an evolution has started.

Now researchers from the Center for Fundamental Living Technology (FLINT) at the Department of Physics, Chemistry and Pharmacy, University of Southern Denmark, describe in the journal Europhysics Letters how, in a virtual computer experiment, they discovered information strings with peculiar properties.

Professor and head of FLINT, Steen Rasmussen, says: “Finding mechanisms to create information strings is essential for researchers working with artificial life.”

An autocatalytic network is a network of molecules that catalyze each other’s production. Each molecule can be formed by at least one chemical reaction in the network, and each reaction can be catalyzed by at least one other molecule in the network. This process creates a network that exhibits a primitive form of metabolism and an information system that replicates itself from generation to generation. Credit: University of Southern Denmark.

    Steen Rasmussen and his colleagues know they face two problems:

Firstly, long molecular strings decompose in water. This means that long information strings “break” quickly in water and turn into many short strings, so it is very difficult to maintain a population of long strings over time.

Secondly, it is difficult to make these molecules replicate without the use of modern enzymes. It is easier to perform a so-called ligation: connecting two shorter strings into a longer string, assisted by another, matching longer string that serves as a template. Ligation is the mechanism used by the SDU researchers.

“In our computer simulation – our virtual molecular laboratory – information strings began to replicate quickly and efficiently as expected. However, we were struck to see that the system quickly developed an equal number of short and long information strings and, further, that a strong pattern selection on the strings had occurred. We could see that only very specific information patterns were to be seen in the surviving strings. We were puzzled: how could such a coordinated selection of strings occur when we knew that we had not programmed it? The explanation had to be found in the way the strings interacted with each other,” explains Steen Rasmussen.
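
To make the mechanism concrete, here is a minimal Python sketch of template-assisted ligation competing with decomposition in water. It is an illustration written for this post, not the FLINT group’s actual model; the binary alphabet, rates, and matching rule are all invented.

import random
from collections import Counter

# Toy model (invented for illustration; NOT the FLINT simulation):
# a string acts as a template that ligates two matching shorter
# strings into a copy of itself, while hydrolysis keeps breaking
# long strings apart.

random.seed(1)
pool = [''.join(random.choice('01') for _ in range(n))
        for n in (2, 2, 4, 4, 8) for _ in range(40)]

def step(pool, n_events=200, break_rate=0.01):
    for _ in range(n_events):
        template = random.choice(pool)
        if len(template) < 4:
            continue
        half = len(template) // 2
        left, right = template[:half], template[half:]
        # Ligation: if both halves are present in the pool, the
        # template joins them into a full copy of itself.
        present = (pool.count(left) >= 2) if left == right \
                  else (left in pool and right in pool)
        if present:
            pool.remove(left)
            pool.remove(right)
            pool.append(left + right)
    out = []
    for s in pool:
        # Hydrolysis: longer strings break more often.
        if len(s) >= 4 and random.random() < break_rate * len(s):
            k = random.randrange(1, len(s))
            out += [s[:k], s[k:]]
        else:
            out.append(s)
    return out

for _ in range(100):
    pool = step(pool)

print(Counter(len(s) for s in pool))            # length distribution
print(Counter(s for s in pool if len(s) >= 8))  # surviving long patterns

Run long enough, the pool settles into a standing mix of short and long strings, and the long strings that persist are the ones whose halves keep being regenerated, a crude analogue of the coordinated pattern selection the researchers describe.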

    It is like society

    According to Steen Rasmussen, a so-called self-organizing autocatalytic network was created in the virtual pot, into which he and his colleagues poured the ingredients for information strings.

    “An autocatalytic network works like a community; each molecule is a citizen who interacts with other citizens and together they help create a society”, explains Steen Rasmussen.

    This autocatalytic set quickly evolved into a state where strings of all lengths existed in equal concentrations, which is not what is usually found. Further, the selected strings had strikingly similar patterns, which is also unusual.

    “We might have discovered a process similar to the processes that initially sparked the first life. We of course don’t know if life actually was created this way – but it could have been one of the steps. Perhaps a similar process created sufficiently high concentrations of longer information strings when the first protocell was created”, explains Steen Rasmussen.

    Basis for new technology

The mechanisms underlying the formation and selection of effective information strings are not only interesting for the researchers who are working to create protocells. They also have value for researchers working on tomorrow’s technology, such as those at the FLINT Center.

    “We seek ways to develop technology that’s based on living and life-like processes. If we succeed, we will have a world where technological devices can repair themselves, develop new properties and be re-used. For example a computer made of biological materials poses very different – and less environmentally stressful – requirements for production and disposal”, says Steen Rasmussen.

    Ref: http://epljournal.edpsciences.org/articles/epl/abs/2014/14/epl16388/epl16388.html

    See the full article here.

  • richardmitnick 2:18 pm on October 10, 2014 Permalink | Reply
Tags: Biochemistry

    From BNL: “Researchers Pump Up Oil Accumulation in Plant Leaves” 


    October 7, 2014
    Karen McNulty Walsh, (631) 344-8350 or Peter Genzer, (631) 344-3174

    Increasing the oil content of plant biomass could help fulfill the nation’s increasing demand for renewable energy feedstocks. But many of the details of how plant leaves make and break down oils have remained a mystery. Now a series of detailed genetic studies conducted at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and published in The Plant Cell reveals previously unknown biochemical details about those metabolic pathways—including new ways to increase the accumulation of oil in leaves, an abundant source of biomass for fuel production.

Using these methods, the scientists grew experimental Arabidopsis plants whose leaves accumulated 9 percent oil by dry weight, an approximately 150-fold increase in oil content compared to wild-type leaves.

    “This is an unusually high level of oil accumulation for plant vegetative tissue,” said Brookhaven Lab biochemist Changcheng Xu, who led the research team. “In crop plants, whose growth time is longer, if the rate of oil accumulation is the same we could get much higher oil content—possibly as high as 40 percent by weight,” he said.

And when it comes to growing plants for biofuels, packing on the calories is the goal, because energy-dense oils give more “bang per bushel” than less-energy-dense leaf carbohydrates.

Deciphering biochemical pathways

    The key to increasing oil accumulation in these studies was to unravel the details of the biochemical pathways involved in the conversion of carbon into fatty acids, the storage of fatty acids as oil, and the breakdown of oil in leaves. Prior to this research, scientists did not know that these processes were so intimately related.

    “Our method resulted in an unusually high level of oil accumulation in plant vegetative tissue.”
    — Brookhaven Lab biochemist Changcheng Xu

    “We previously thought that oil storage and oil degradation were alternative fates for newly synthesized fatty acids—the building blocks of oils,” said Brookhaven biochemist John Shanklin, a collaborator on the studies.

To reveal the connections, Brookhaven’s Jilian Fan and other team members used a series of genetic tricks to systematically disable an alphabet soup of enzymes—molecules that mediate a cell’s chemical reactions—to see whether and how each had an effect in regulating the various biochemical conversions. They also used radiolabeled versions of fatty acids to trace their paths and learn how quickly they move through the pathway. They then used the findings to map out how the processes take place inside different subcellular structures, some of which you might recognize from high school science classes: the chloroplast, endoplasmic reticulum, storage droplets, and the peroxisome.

Brookhaven researchers Jilian Fan, John Shanklin, and Changcheng Xu have developed a method for getting experimental plants to accumulate more leaf oil. Their strategy could have a significant impact on the production of biofuels.

    “Our goal was to test and understand all the components of the system to fully understand how fatty acids, which are produced in the chloroplasts, are broken down in the peroxisome,” Xu said.

    Key findings

Details of the oil synthesis and breakdown pathways within plant leaf cells: Fatty acids (FA) synthesized within chloroplasts go through a series of reactions to be incorporated into lipids (TAG) within the endoplasmic reticulum (ER); lipid droplets (LD) store lipids such as oils until they are broken down to release fatty acids into the cytoplasm; the fatty acids are eventually transported into the peroxisome for oxidation. This detailed metabolic map pointed to a new way to dramatically increase the accumulation of oil in plant leaves — blocking the SDP1 enzyme that releases fatty acids from lipid droplets in plants with elevated fatty acid synthesis. If this strategy works in biofuel crops, it could dramatically increase the energy content of biomass used to make biofuels.

    The research revealed that there is no direct pathway for fatty acids to move from the chloroplasts to the peroxisome as had previously been assumed. Instead, many complex reactions occur within the endoplasmic reticulum to first convert the fatty acids through a series of intermediates into plant oils. These oils accumulate in storage droplets within the cytoplasm until another enzyme breaks them down to release the fatty acid building blocks. Yet another enzyme must transport the fatty acids into the peroxisome for the final stages of degradation via oxidation. The amount of oil that accumulates at any one time represents a balance between the pathways of synthesis and degradation.

    Some previous attempts to increase oil accumulation in leaves have focused on disrupting the breakdown of oils by blocking the action of the enzyme that transports fatty acids into the peroxisome. The reasoning was that the accumulation of fatty acids would have a negative feedback on oil droplet breakdown. High levels of fatty acids remaining in the cytoplasm would inhibit the further breakdown of oil droplets, resulting in higher oil accumulation.

    That idea works to some extent, Xu said, but the current research shows it has negative effects on the overall health of the plants. “Plants don’t grow as well and there can be other defects,” he said.

    Based on their new understanding of the detailed biochemical steps that lead to oil breakdown, Xu and his collaborators explored another approach—namely disabling the enzyme one step back in the metabolic process, the one that breaks down oil droplets to release fatty acids.

    “If we knock out this enzyme, known as SDP1, we get a large amount of oil accumulating in the leaves,” he said, “and without substantial detrimental effects on plant growth.”
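
The logic is easy to see in a back-of-the-envelope model that treats leaf oil as a balance between a constant synthesis flux and first-order breakdown. This is a sketch for illustration only; the rate constants are invented, not measurements from the Brookhaven study.

# Oil pool as a synthesis/breakdown balance:
#   d(oil)/dt = synthesis - k_breakdown * oil
# All rate constants here are invented for illustration.

def steady_state_oil(synthesis=1.0, k_breakdown=1.0, dt=0.01, steps=20000):
    oil = 0.0
    for _ in range(steps):
        oil += (synthesis - k_breakdown * oil) * dt
    return oil

wild_type = steady_state_oil()
# "SDP1 disabled": droplet breakdown is much slower, synthesis unchanged.
knockout = steady_state_oil(k_breakdown=0.05)
print(f"wild type: {wild_type:.2f}   knockout: {knockout:.2f}")

The pool settles near synthesis / k_breakdown, so slowing the breakdown step raises stored oil roughly in proportion, which is the intuition behind targeting SDP1 rather than boosting synthesis alone.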

“This research points to a new way to accumulate oil in leaves, different from the approaches being tried in other labs,” Xu said. “In addition, the strategy differs fundamentally from other strategies that are based on adding genes: ours is based on disabling or inactivating genes through simple mutations. This work provides a very promising platform for engineering oil production in a non-genetically modified way.”

    “This work provides another example of how research into basic biochemical mechanisms can lead to knowledge that has great promise to help solve real world problems,” concluded Shanklin.

This research was conducted by Xu in collaboration with Jilian Fan, Chengshi Yan, and John Shanklin of Brookhaven’s Biosciences Department, and Rebecca Roston, now at the University of Nebraska-Lincoln. The work was funded by the DOE Office of Science and made use of a confocal microscope at Brookhaven Lab’s Center for Functional Nanomaterials, a DOE Office of Science user facility.

    See the full article here.


One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.
  • richardmitnick 10:11 pm on September 25, 2014 Permalink | Reply
Tags: Biochemistry

    From NOVA: “Genetically Engineering Almost Anything” 


    Thu, 17 Jul 2014
    Tim De Chant and Eleanor Nelsen

    When it comes to genetic engineering, we’re amateurs. Sure, we’ve known about DNA’s structure for more than 60 years, we first sequenced every A, T, C, and G in our bodies more than a decade ago, and we’re becoming increasingly adept at modifying the genes of a growing number of organisms.

    But compared with what’s coming next, all that will seem like child’s play. A new technology just announced today has the potential to wipe out diseases, turn back evolutionary clocks, and reengineer entire ecosystems, for better or worse. Because of how deeply this could affect us all, the scientists behind it want to start a discussion now, before all the pieces come together over the next few months or years. This is a scientific discovery being played out in real time.

Scientists have figured out how to use a cell’s DNA repair mechanisms to spread traits throughout a population.

    Today, researchers aren’t just dropping in new genes, they’re deftly adding, subtracting, and rewriting them using a series of tools that have become ever more versatile and easier to use. In the last few years, our ability to edit genomes has improved at a shockingly rapid clip. So rapid, in fact, that one of the easiest and most popular tools, known as CRISPR-Cas9, is just two years old. Researchers once spent months, even years, attempting to rewrite an organism’s DNA. Now they spend days.

    Soon, though, scientists will begin combining gene editing with gene drives, so-called selfish genes that appear more frequently in offspring than normal genes, which have about a 50-50 chance of being passed on. With gene drives—so named because they drive a gene through a population—researchers just have to slip a new gene into a drive system and let nature take care of the rest. Subsequent generations of whatever species we choose to modify—frogs, weeds, mosquitoes—will have more and more individuals with that gene until, eventually, it’s everywhere.

    Cas9-based gene drives could be one of the most powerful technologies ever discovered by humankind. “This is one of the most exciting confluences of different theoretical approaches in science I’ve ever seen,” says Arthur Caplan, a bioethicist at New York University. “It merges population genetics, genetic engineering, molecular genetics, into an unbelievably powerful tool.”

We’re not there yet, but we’re extraordinarily close. “Essentially, we have done all of the pieces, sometimes in the same relevant species,” says Kevin Esvelt, a postdoc at Harvard University and the wunderkind behind the new technology. “It’s just no one has put it all together.”

    It’s only a matter of time, though. The field is progressing rapidly. “We could easily have laboratory tests within the next few months and then field tests not long after that,” says George Church, a professor at Harvard University and Esvelt’s advisor. “That’s if everybody thinks it’s a good idea.”

    It’s likely not everyone will think this is a good idea. “There are clearly people who will object,” Caplan says. “I think the technique will be incredibly controversial.” Which is why Esvelt, Church, and their collaborators are publishing papers now, before the different parts of the puzzle have been assembled into a working whole.

    “If we’re going to talk about it at all in advance, rather than in the past tense,” Church says, “now is the time.”

    “Deleterious Genes”

The first organism Esvelt wants to modify is the malaria-carrying mosquito Anopheles gambiae. While his approach is novel, the idea of controlling mosquito populations through genetic modification has actually been around since the late 1970s. Then, Edward F. Knipling, an entomologist with the U.S. Department of Agriculture, published a substantial handbook with a chapter titled “Use of Insects for Their Own Destruction.” One technique, he wrote, would be to modify certain individuals to carry “deleterious genes” that could be passed on generation after generation until they pervaded the entire population. It was an idea before its time. Knipling was on the right track, but he and his contemporaries lacked the tools to see it through.

    The concept surfaced a few more times before being picked up by Austin Burt, an evolutionary biologist and population geneticist at Imperial College London. It was the late 1990s, and Burt was busy with his yeast cells, studying their so-called homing endonucleases, enzymes that facilitate the copying of genes that code for themselves. Self-perpetuating genes, if you will. “Through those studies, gradually, I became more and more familiar with endonucleases, and I came across the idea that you might be able to change them to recognize new sequences,” Burt recalls.

Other scientists were investigating endonucleases, too, but not in the way Burt was. “The people who were thinking along those lines, molecular biologists, were thinking about using these things for gene therapy,” Burt says. “My background in population biology led me to think about how they could be used to control populations that were particularly harmful.”

“There’s a lot to be done still, but on the scale of years, not months or decades.”

    In 2003, Burt penned an influential article that set the course for an entire field: We should be using homing endonucleases, a type of gene drive, to modify malaria-carrying mosquitoes, he said, not ourselves. Burt saw two ways of going about it—one, modify a mosquito’s genome to make it less hospitable to malaria, and two, skew the sex ratio of mosquito populations so there are no females for the males to reproduce with. In the following years, Burt and his collaborators tested both in the lab and with computer models before they settled on sex ratio distortion. (Making mosquitoes less hospitable to malaria would likely be a stopgap measure at best; the Plasmodium protozoans could evolve to cope with the genetic changes, just like they have evolved resistance to drugs.)

Burt has spent the last 11 years refining various endonucleases, playing with different scenarios of inheritance, and surveying people in malaria-infested regions. Now, he finally feels like he is closing in on his ultimate goal. “There’s a lot to be done still,” he says. “But on the scale of years, not months or decades.”

    Cheating Natural Selection

    Cas9-based gene drives could compress that timeline even further. One half of the equation—gene drives—are the literal driving force behind proposed population-scale genetic engineering projects. They essentially let us exploit evolution to force a desired gene into every individual of a species. “To anthropomorphize horribly, the goal of a gene is to spread itself as much as possible,” Esvelt says. “And in order to do that, it wants to cheat inheritance as thoroughly as it can.” Gene drives are that cheat.

    Without gene drives, traits in genetically-engineered organisms released into the wild are vulnerable to dilution through natural selection. For organisms that have two parents and two sets of chromosomes (which includes humans, many plants, and most animals), traits typically have only a 50-50 chance of being inherited, give or take a few percent. Genes inserted by humans face those odds when it comes time to being passed on. But when it comes to survival in the wild, a genetically modified organism’s odds are often less than 50-50. Engineered traits may be beneficial to humans, but ultimately they tend to be detrimental to the organism without human assistance. Even some of the most painstakingly engineered transgenes will be gradually but inexorably eroded by natural selection.

    Some naturally occurring genes, though, have over millions of years learned how to cheat the system, inflating their odds of being inherited. Burt’s “selfish” endonucleases are one example. They take advantage of the cell’s own repair machinery to ensure that they show up on both chromosomes in a pair, giving them better than 50-50 odds when it comes time to reproduce.

    gene drive
    A gene drive (blue) always ends up in all offspring, even if only one parent has it. That means that, given enough generations, it will eventually spread through the entire population.

Here’s how it generally works. The term “gene drive” is fairly generic, describing a number of different systems, but one example involves genes that code for an endonuclease—an enzyme that acts like a pair of molecular scissors—sitting in the middle of a longer sequence of DNA that the endonuclease is programmed to recognize. If one chromosome in a pair contains a gene drive but the other doesn’t, the endonuclease cuts the second chromosome’s DNA where the endonuclease code appears in the first.

    The broken strands of DNA trigger the cell’s repair mechanisms. In certain species and circumstances, the cell unwittingly uses the first chromosome as a template to repair the second. The repair machinery, seeing the loose ends that bookend the gene drive sequence, thinks the middle part—the code for the endonuclease—is missing and copies it onto the broken chromosome. Now both chromosomes have the complete gene drive. The next time the cell divides, splitting its chromosomes between the two new cells, both new cells will end up with a copy of the gene drive, too. If the entire process works properly, the gene drive’s odds of inheritance aren’t 50%, but 100%.
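
A small simulation makes the arithmetic vivid. The sketch below is an illustration of the inheritance logic only, not a model from the researchers; the population size, homing efficiency, and release fraction are invented.

import random

# Toy gene-drive spread. Genotypes: 'DD', 'DW', 'WW'. In a 'DW'
# heterozygote, homing usually converts the wild-type allele, so the
# drive is transmitted ~100% of the time instead of the usual 50%.

random.seed(0)

def gamete(genotype, homing=0.95):
    if genotype == 'DW' and random.random() < homing:
        return 'D'                   # the drive copied itself across
    return random.choice(genotype)   # ordinary Mendelian 50-50

def next_generation(pop, size=1000):
    new = []
    for _ in range(size):
        mom, dad = random.choice(pop), random.choice(pop)
        g = gamete(mom) + gamete(dad)
        new.append('DW' if set(g) == {'D', 'W'} else g)
    return new

pop = ['DW'] * 10 + ['WW'] * 990     # release 1% drive carriers
for gen in range(12):
    freq = sum(g.count('D') for g in pop) / (2 * len(pop))
    print(f"generation {gen}: drive allele frequency {freq:.2f}")
    pop = next_generation(pop)

Starting from a 1 percent release, the drive allele typically sweeps to near fixation within a dozen generations in this toy model, whereas a normal allele at the same starting frequency would be expected to stay around 1 percent.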

    gene drive
    Here, a mosquito with a gene drive (blue) mates with a mosquito without one (grey). In the offspring, one chromosome will have the drive. The endonuclease then slices into the drive-free DNA. When the strand gets repaired, the cell’s machinery uses the drive chromosome as a template, unwittingly copying the drive into the break.

Most natural gene drives are picky about where on a strand of DNA they’ll cut, so they need to be modified if they’re to be useful for genetic engineering. For the last few years, geneticists have tried using genome-editing tools to build custom gene drives, but the process was laborious and expensive. With the discovery of CRISPR-Cas9 as a genome editing tool in 2012, though, that barrier evaporated. CRISPR is an ancient bacterial immune system that identifies the DNA of invading viruses and sends in an endonuclease, like Cas9, to chew it up. Researchers quickly realized that Cas9 could easily be reprogrammed to recognize nearly any sequence of DNA. All that’s needed is the right RNA sequence—easily ordered and shipped overnight—which Cas9 uses to search a strand of DNA for where to cut. This flexibility, Esvelt says, “lets us target, and therefore edit, pretty much anything we want.” And quickly.
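
In software terms, the “programming” amounts to choosing the 20-letter target. The sketch below shows the flavor of that search on one DNA strand: find a 20-nucleotide protospacer followed by an NGG PAM. The sequences are made up, and real guide design also handles the reverse strand, mismatches, and off-target scoring.

import re

# Find Cas9-style cut sites: a 20-nt protospacer followed by an NGG
# PAM. Sequences here are invented for illustration.

def find_cut_sites(dna, protospacer):
    assert len(protospacer) == 20
    pattern = re.compile(protospacer + '[ACGT]GG')
    # Cas9 cuts ~3 bp upstream of the PAM, i.e., after position 17
    # of the protospacer.
    return [m.start() + 17 for m in pattern.finditer(dna)]

guide = 'ACGTTAGCATTGCAGATCCT'   # the 20-nt spacer carried by the gRNA
dna = 'TTACGA' + guide + 'TGG' + 'AGCTAGGATCGA' + guide + 'AGG' + 'TTTT'
print(find_cut_sites(dna, guide))   # -> [23, 58]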

Gene drives and Cas9 are each powerful on their own, but together they could significantly change biology. CRISPR-Cas9 allows researchers to edit genomes with unprecedented speed, and gene drives allow engineered genes to cheat the system, even if the altered gene weakens the organism. Simply by being coupled to a gene drive, an engineered gene can race throughout a population before it is weeded out. “Eventually, natural selection will win,” Esvelt says, but “gene drives just let us get ahead of the game.”

Beyond Mosquitoes

    If there’s anywhere we could use a jump start, it’s in the fight against malaria. Each year, the disease kills over 200,000 people and sickens over 200 million more, most of whom are in Africa. The best new drugs we have to fight it are losing ground; the Plasmodium parasite is evolving resistance too quickly. And we’re nowhere close to releasing an effective vaccine. The direct costs of treating the disease are estimated at $12 billion, and the economies of affected countries grew 1.3% less per year, a substantial amount.

    Which is why Esvelt and Burt are both so intently focused on the disease. “If we target the mosquito, we don’t have to face resistance on the parasite itself. The idea is, we can just take out the vector and stop all transmission. It might even lead to eradication,” Esvelt says.

Esvelt initially mulled over the idea of building Cas9-based gene drives in mosquitoes to do just that. He took the idea to Flaminia Catteruccia, a professor who studies malaria at the Harvard School of Public Health, and the two grew increasingly certain that such a system would not only work, but work well. As their discussions progressed, though, Esvelt realized they were “missing the forest for the trees.” Controlling malaria-carrying mosquitoes was just the start. Cas9-based gene drives were the real breakthrough. “If it lets us do this for mosquitoes, what is to stop us from potentially doing it for almost anything that is sexually reproducing?” he realized.

“What is to stop us from potentially doing it for almost anything that is sexually reproducing?”

    In theory, nothing. But in reality, the system works best on fast-reproducing species, Esvelt says. Short generation times allow the trait to spread throughout a population more quickly. Mosquitoes are a perfect test case. If everything were to work perfectly, deleterious traits could sweep through populations of malaria-carrying mosquitoes in as few as five years, wiping them off the map.

    Other noxious species could be candidates, too. Certain invasive species, like mosquitoes in Hawaii or Asian carp in the Great Lakes, could be targeted with Cas9-based gene drives to either reduce their numbers or eliminate them completely. Agricultural weeds like horseweed that have evolved resistance to glyphosate, a herbicide that is broken down quickly in the soil, could have their susceptibility to the compound reintroduced, enabling more farmers to adopt no-till practices, which help conserve topsoil. And in the more distant future, Esvelt says, weeds could even be engineered to introduce vulnerabilities to completely benign substances, eliminating the need for toxic pesticides. The possibilities seem endless.

    The Decision

    Before any of that can happen, though, Esvelt and Church are adamant that the public help decide whether the research should move forward. “What we have here is potentially a general tool for altering wild populations,” Esvelt says. “We really want to make sure that we proceed down this path—if we decide to proceed down this path—as safely and responsibly as possible.”

To kickstart the conversation, they partnered with the MIT political scientist Kenneth Oye and others to convene a series of workshops on the technology. “I thought it might be useful to get into the room people with slightly different material interests,” Oye says, so they invited regulators, nonprofits, companies, and environmental groups. The idea, he says, was to get people to meet several times, to build trust before “decisions harden.” Despite the diverse viewpoints, Oye says there was surprising agreement among participants about what the important outstanding questions were.

    As the discussion enters the public sphere, tensions are certain to intensify. “I don’t care if it’s a weed or a blight, people still are going to say this is way too massive a genetic engineering project,” Caplan says. “Secondly, it’s altering things that are inherited, and that’s always been a bright line for genetic engineering.” Safety, too, will undoubtedly be a concern. As the power of a tool increases, so does its potential for catastrophe, and Cas9-based gene drives could be extraordinarily powerful.

There’s also little in the way of precedent that we can use as a guide. Our experience with genetically modified foods would seem to be a good place to start, but GM crops are relatively niche organisms that are heavily dependent on water and fertilizer. It’s pretty easy to keep them contained to a field. Not so with wild organisms; their potential to spread isn’t as limited.

There’s little in the way of precedent that we can use as a guide.

    Aware of this, Esvelt and his colleagues are proposing a number of safeguards, including reversal drives that can undo earlier engineered genes. “We need to really make sure those work if we’re proposing to build a drive that is intended to modify a wild population,” Esvelt says.

There are still other possible hurdles to surmount—lab-grown mosquitoes may not interbreed with wild ones, for example—but given how close this technology is to prime time, Caplan suggests researchers hew to a few initial ethical guidelines. One, use species that are detrimental to human health and don’t appear to fill a unique niche in the wild. (Malaria-carrying mosquitoes seem to fit that description.) Two, do as much work as possible using computer models. And three, researchers should continue to be transparent about their progress, as they have been. “I think the whole thing is hugely exciting,” Caplan says. “But the time to really get cracking on the legal/ethical infrastructure for this technology is right now.”

    Church agrees, though he’s also optimistic about the potential for Cas9-based gene drives. “I think we need to be cautious with all new technologies, especially all new technologies that are messing with nature in some way or another. But there’s also a risk of doing nothing,” Church says. “We have a population of 7 billion people. You have to deal with the environmental consequences of that.”

    See the full article here.

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 7:25 am on September 25, 2014 Permalink | Reply
Tags: Biochemistry

From DG: “Electricity at the Quantum Level ‘Played a Strong Role in the Creation of Life’”

The Daily Galaxy

    September 23, 2014
    No Writer Credit

In 1953, American chemist Stanley Miller famously electrified a mixture of simple gases and water to simulate lightning and the atmosphere of early Earth. The revolutionary experiment—which yielded a brownish soup of amino acids—offered a simple potential scenario for the origin of life’s building blocks. Miller’s work gave birth to modern research on pre-biotic chemistry and the origins of life.


For the past 60 years, scientists have investigated other possible energy sources for the formation of life’s building blocks, including ultraviolet light, meteorite impacts, and deep-sea hydrothermal vents. Now, for the first time, researchers have reproduced the results of the [Stanley] Miller-[Harold] Urey experiment in a computer simulation, yielding new insight into the effect of electricity on the formation of life’s building blocks at the quantum level.

    In this new study, Antonino Marco Saitta, of the Université Pierre et Marie Curie, Sorbonne, in Paris, France and his colleagues wanted to revisit Miller’s result with electric fields, but from a quantum perspective. Saitta and study co-author Franz Saija, two theoretical physicists, had recently applied a new quantum model to study the effects of electric fields on water, which had never been done before. After coming across a documentary on Miller’s work, they wondered whether the quantum approach might work for the famous spark-discharge experiment.

    The method would also allow them to follow individual atoms and molecules through space and time—and perhaps yield new insight into the role of electricity in Miller’s work.

    “The spirit of our work was to show that the electric field is part of it,” Saitta said, “without necessarily involving lightning or a spark.”

As in the original Miller experiment, Saitta and Saija subjected a mixture of molecules containing carbon, nitrogen, oxygen and hydrogen atoms to an electric field. As expected, the simulation yielded glycine, an amino acid that is one of the simplest building blocks for proteins, and one of the most abundant products of the original Miller experiment.

Formaldehyde, a typical intermediate in the formation of amino acids.

    But their approach also yielded some unexpected results. In particular, their model suggested that the formation of amino acids in the Miller scenario might have occurred via a more complex chemical pathway than previously thought.

    A typical intermediate in the formation of amino acids is the small molecule formaldehyde. But their simulation showed that when subjected to an electric field, the reaction favored a different intermediate, the molecule formamide.

    It turns out, formamide could have not only played a crucial role in the formation of life’s building blocks on Earth, but also elsewhere.

    “We weren’t looking for it, or expecting it,” Saitta said. “We only learned after the fact, by reviewing the scientific literature, that it’s an important clue in prebiotic chemistry.”

For instance, formamide has recently been shown to be a key ingredient in making some of the building blocks of RNA, notably guanine, in the presence of ultraviolet light.

    Formamide has also recently been observed in space—notably in a comet and in a solar-type proto star. Earlier research has also shown that formamide can form when comets or asteroids impact the Earth.

    “The possibility of new routes to make amino acids without a formaldehyde intermediate is novel and gaining ground, especially in extraterrestrial contexts,” the authors wrote. “The presence of formamide might be a most telling fingerprint of abiotic terrestrial and extraterrestrial amino acids.”

However, Jeff Bada, who was a graduate student of Miller’s in the 1960s and spent his career working on the origin of life, remains skeptical about their results and theoretical approach. “Their model might not meaningfully represent what happens in a solution,” he says. “We know there’s a lot of formaldehyde made in the spark discharge experiment. I don’t think the formamide reaction would be significant in comparison to the traditional reaction.”

    But Saitta points out that formamide is very unstable, so it may not last long enough to be observed in real Miller experiments. “In our simulation, formamide always formed spontaneously. And it was some sort of crucible—it would either break up into water and hydrogen cyanide, or combine with other molecules and form the amino acid glycine.”

    Another key insight from their study is that the formation of some of life’s building blocks may have occurred on mineral surfaces, since most have strong natural electric fields.

    “The electric field of mineral surfaces can be easily 10 or 20 times stronger than the one in our study,” Saitta said. “The problem is that it only acts on a very short range. So to feel the effects, molecules would have to be very close to the surface.”

“I think that this work is of great significance,” said François Guyot, a geochemist at the French Museum of Natural History. “Regarding the mineral surfaces, strong electric fields undoubtedly exist at their immediate proximity. And because of their strong role in the reactivity of organic molecules, they might enhance the formation of more complex molecules by a mechanism distinct from the geometrical concentration of reactive species, a mechanism often proposed when mineral surfaces are invoked for explaining the formation of the first biomolecules.”

    One of the leading hypotheses in the field of life’s origin suggests that important prebiotic reactions may have occurred on mineral surfaces. But so far scientists don’t fully understand the mechanism behind it. “Nobody has really looked at electric fields on mineral surfaces,” Saitta said. “My feeling is that there’s probably something to explore there.”

    Their results are published this week in the scientific journal Proceedings of the National Academy of Sciences.

    See the full article here.

    • Atul Gupta 9:15 am on September 25, 2014 Permalink | Reply

Adding on, I think I read this in Hawking’s book: life as we know it is organic, but maybe at some other location or under other conditions in the universe, life didn’t have to depend on oxygen. Maybe there were other forms of life that didn’t need Earth-like conditions!

      -Metaphysicien


  • richardmitnick 3:46 pm on September 23, 2014 Permalink | Reply
Tags: Biochemistry

    From SLAC: “Research Pinpoints Role of ‘Helper’ Atoms in Oxygen Release” 



    September 22, 2014

    System Studied at SLAC’s Synchrotron Mimics Steps in Photosynthesis

Experiments at the Department of Energy’s SLAC National Accelerator Laboratory solve a long-standing mystery about the role calcium atoms play in a chemical reaction that releases oxygen into the air we breathe. The results offer new clues about atomic-scale processes that drive the life-sustaining cycle of photosynthesis and could help forge a foundation for producing cleaner energy sources by synthesizing nature’s handiwork.

    The research is detailed in a paper published Sept. 14 in Nature Chemistry. X-ray experiments at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL), a DOE Office of Science User Facility, played a key role in the study, led by Wonwoo Nam at Ewha Womans University in Korea in a joint collaboration with Riti Sarangi, an SSRL staff scientist.


    “For the first time, we show how calcium can actually tune this oxygen-releasing reaction in subtle but precise ways,” said Sarangi, who carried out the X-ray work and supporting computer simulations and calculations. “The study helps us resolve the question, ‘Why does nature choose calcium?'”

    Photosynthesis is one of many important biological processes that rely on proteins with metal-containing centers, such as iron or manganese. The chemistry carried out in such centers is integral to their function. Scientists have known that the presence of calcium is necessary for the oxygen-releasing stages of photosynthesis, but they didn’t know how or why.

The SSRL experiment used a technique known as X-ray absorption spectroscopy to explore the chemical and structural details of sample systems that mimic the oxygen-releasing steps in photosynthesis. The basic oxygen-releasing system contained calcium and was centered around an iron atom.

    Researchers found that charged atoms, or ions, of calcium and another element, strontium, bind to the oxygen atoms in a way that precisely tunes the chemical reaction at the iron center. This, in turn, facilitates the bond formation between two oxygen atoms. The study also revealed that calcium and strontium do not obstruct the release of these bound oxygen atoms into the air as an oxygen molecule — the final step in this reaction.

    “We saw that unless you use calcium or strontium, this sample system will not release oxygen,” Sarangi said. “Calcium and strontium bind at just the right strength to facilitate the oxygen release. Anything that binds too strongly would impede that step.”

    While the sample system studied is not biological, the chemistry at work is considered a very good analogue for the oxygen-releasing steps carried out in photosynthesis, she said, and could assist in constructing artificial systems that replicate these steps. The next step will be to study more complex samples that perform more closely to the actual chemistry in photosynthesis.

Other participants in this research were from Osaka University in Japan and the Japan Science and Technology Agency. The research was supported by the National Research Foundation of Korea and the Ministry of Education, Culture, Sports, Science and Technology in Japan. SSRL’s Structural Molecular Biology program is supported by the National Institutes of Health and the Office of Biological and Environmental Research of the U.S. Department of Energy.

    See the full article here.

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.
  • richardmitnick 3:24 pm on September 23, 2014 Permalink | Reply
Tags: Biochemistry

    From Sandia Lab: “Sandia researchers find clues to superbug evolution” 



    September 23, 2014
    Patti Koning, pkoning@sandia.gov, (925) 294-4911

    Imagine going to the hospital with one disease and coming home with something much worse, or not coming home at all.

With the emergence and spread of antibiotic-resistant pathogens, healthcare-associated infections have become a serious threat. On any given day, about one in 25 hospital patients has at least one such infection, and as many as one in nine dies as a result, according to the Centers for Disease Control and Prevention.

    Consider Klebsiella pneumoniae, not typically a ferocious pathogen, but now armed with resistance to virtually all antibiotics in current clinical use. It is the most common species of carbapenem-resistant Enterobacteriaceae (CRE) in the United States. As carbapenems are considered the antibiotic of last resort, CREs are a triple threat for their resistance to nearly all antibiotics, high mortality rates and ability to spread their resistance to other bacteria.

But there is hope. A team of Sandia National Laboratories microbiologists recently sequenced, for the first time, the entire genome of a Klebsiella pneumoniae strain encoding New Delhi metallo-beta-lactamase (NDM-1). They presented their findings in a paper published in PLOS ONE, “Resistance Determinants and Mobile Genetic Elements of an NDM-1-Encoding Klebsiella pneumoniae Strain.”

Sandia National Laboratories’ researchers Kelly Williams, left, and Corey Hudson look at the mosaic pattern of one of the Klebsiella pneumoniae plasmids and discuss mechanisms that mobilize resistance genes. (Photo by Dino Vournas)

    The Sandia team of Corey Hudson, Zach Bent, Robert Meagher and Kelly Williams is beginning to understand the bacteria’s multifaceted mechanisms for resistance. To do this, they developed several new bioinformatics tools for identifying mechanisms of genetic movement, tools that also might be effective at detecting bioengineering.

“Once we had the entire genome sequenced, it was a real eye-opener to see the concentration of so many antibiotic-resistance genes and so many different mechanisms for accumulating them,” explained Williams, a bioinformaticist. “Just sequencing this genome unlocked a vault of information about how genes move between bacteria and how DNA moves within the chromosome.”

Meagher first worked last year with Klebsiella pneumoniae ATCC BAA-2146 (Kpn2146), the first U.S. isolate found to encode NDM-1. Along with E. coli, it was used to test an automatic sequencing library preparation platform for the RapTOR Grand Challenge, a Sandia project that developed techniques to allow discovery of pathogens in clinical samples.

“I’ve been interested in multi-drug-resistant organisms for some time. The NDM-1 drug resistance trait is spreading rapidly worldwide, so there is a great need for diagnostic tools,” said Meagher. “This particular strain of Klebsiella pneumoniae is fascinating and terrifying because it’s resistant to practically everything. Some of that you can explain on the basis of NDM-1, but it’s also resistant to other classes of antibiotics that NDM-1 has no bearing on.”

    Unlocking Klebsiella pneumoniae

Assembling an entire genome is like putting together a puzzle. Klebsiella pneumoniae turned out to have one large chromosome and four plasmids, small DNA molecules physically separate from and able to replicate independently of the bacterial cell’s chromosomal DNA. Plasmids often carry antibiotic-resistance genes and other defense mechanisms.

    The researchers discovered their Klebsiella pneumoniae bacteria encoded 34 separate enzymes of antibiotic resistance, as well as efflux pumps that move compounds out of cells, and mutations in chromosomal genes that are expected to confer resistance. They also identified several mechanisms that allow cells to mobilize resistance genes, both within a single cell and between cells.

    “Each one of those genes has a story: how it got into this bacteria, where it has been, and how it has evolved,” said Williams.

    Necessity leads to development of new tools

    Klebsiella pneumoniae uses established mechanisms to move genes, such as “jumping genes” known as transposons, and genomic islands, mobile DNA elements that enable horizontal gene transfer between organisms. However, the organism has so many tricks and weapons that the research team had to go beyond existing bioinformatics tools and develop new ways of identifying mechanisms of genetic movement.

Williams and Hudson detected circular forms of transposons in movement, something that had not been directly shown this way before, and discovered sites within the genome undergoing homologous recombination, another gene-mobilization mechanism. By applying two existing bioinformatics methods for detecting genomic islands, they found a third class of islands that neither method alone could have detected.
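
For flavor, one classic composition-based signal is easy to sketch: horizontally acquired DNA often differs in GC content from its host genome. The code below is a generic illustration of that idea, not one of the Sandia team’s tools; the file name and thresholds are hypothetical.

# Sliding-window GC scan, a generic genomic-island signal (illustration
# only; not the Sandia pipeline). Thresholds are invented.

def gc_content(seq):
    return (seq.count('G') + seq.count('C')) / len(seq)

def flag_islands(genome, window=5000, step=1000, threshold=0.08):
    whole = gc_content(genome)
    hits = []
    for start in range(0, len(genome) - window + 1, step):
        gc = gc_content(genome[start:start + window])
        if abs(gc - whole) > threshold:
            hits.append((start, start + window, gc))
    return hits

# Hypothetical usage with a genome read from a FASTA file:
# genome = ''.join(line.strip() for line in open('kpn2146.fa')
#                  if not line.startswith('>'))
# for start, end, gc in flag_islands(genome):
#     print(f'possible island at {start}-{end}: GC {gc:.2f}')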

    “To some extent, every extra piece of DNA that a bacteria acquires comes at some cost, so the bacteria doesn’t usually hang onto traits it doesn’t need,” said Hudson. “The further we dug down into the genome, the more stories we found about movement within the organism and from other organisms and the history of insults, like antibiotics, that it has faced. This particular bacteria is just getting nastier over time.”

    Applying findings to future work

    The findings are being applied to a Laboratory Directed Research and Development project led by Sandia microbiologist Eric Carnes, who is examining alternative approaches for treating drug-resistant organisms. “Instead of traditional antibiotics, we use a sequence-based approach to silence expression of drug-resistant genes,” said Meagher.

The researchers also are applying their understanding of Klebsiella pneumoniae’s mechanisms of resistance, along with their new bioinformatics tools, to developing diagnostics that can detect bioengineering. Looking across 10 related but distinct strains of Klebsiella pneumoniae, they pinpointed regions that were new to their strain and therefore indicated genetic movement.

    “By studying the pattern of movement, we can better characterize a natural genomic island,” said Hudson. “This leads down the path of what does an unnatural island look like, which is an indication of bioengineering. We hope to apply the knowledge we gained from sequencing Klebsiella pneumoniae to developing diagnostic tools that could detect bioengineering.”

    See the full article here.

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.
  • richardmitnick 6:44 pm on September 22, 2014 Permalink | Reply
Tags: Biochemistry

    From Stanford: “Stanford researchers create ‘evolved’ protein that may stop cancer from spreading” 

Stanford University

    September 21, 2014
    Tom Abate

    Experimental therapy stopped the metastasis of breast and ovarian cancers in lab mice, pointing toward a safe and effective alternative to chemotherapy.

    A team of Stanford researchers has developed a protein therapy that disrupts the process that causes cancer cells to break away from original tumor sites, travel through the bloodstream and start aggressive new growths elsewhere in the body.

    This process, known as metastasis, can cause cancer to spread with deadly effect.

    “The majority of patients who succumb to cancer fall prey to metastatic forms of the disease,” said Jennifer Cochran, an associate professor of bioengineering who describes a new therapeutic approach in Nature Chemical Biology.

    Today doctors try to slow or stop metastasis with chemotherapy, but these treatments are unfortunately not very effective and have severe side effects.

    The Stanford team seeks to stop metastasis, without side effects, by preventing two proteins – Axl and Gas6 – from interacting to initiate the spread of cancer.

    Axl proteins stand like bristles on the surface of cancer cells, poised to receive biochemical signals from Gas6 proteins.

    When two Gas6 proteins link with two Axls, the signals that are generated enable cancer cells to leave the original tumor site, migrate to other parts of the body and form new cancer nodules.

To stop this process, Cochran used protein engineering to create a harmless version of Axl that acts like a decoy. This decoy Axl latches on to Gas6 proteins in the bloodstream and prevents them from linking with and activating the Axls present on cancer cells.

    In collaboration with Professor Amato Giaccia, co-director of the Radiation Biology Program in the Stanford Cancer Center, the researchers gave intravenous treatments of this bioengineered decoy protein to mice with aggressive breast and ovarian cancers.

Jennifer Cochran and Amato Giaccia are members of a team of researchers who have developed an experimental therapy to treat metastatic cancer.

    Mice in the breast cancer treatment group had 78 percent fewer metastatic nodules than untreated mice. Mice with ovarian cancer had a 90 percent reduction in metastatic nodules when treated with the engineered decoy protein.

    “This is a very promising therapy that appears to be effective and nontoxic in preclinical experiments,” Giaccia said. “It could open up a new approach to cancer treatment.”

    Giaccia and Cochran are scientific advisors to Ruga Corp., a biotech startup in Palo Alto that has licensed this technology from Stanford. Further preclinical and animal tests must be done before determining whether this therapy is safe and effective in humans.

    Greg Lemke, of the Molecular Neurobiology Laboratory at the Salk Institute, called this “a prime example of what bioengineering can do” to open up new therapeutic approaches to treat metastatic cancer.

    “One of the remarkable things about this work is the binding affinity of the decoy protein,” said Lemke, a noted authority on Axl and Gas6 who was not part of the Stanford experiments.

    “The decoy attaches to Gas6 up to a hundredfold more effectively than the natural Axl,” Lemke said. “It really sops up Gas6 and takes it out of action.”
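
A rough equilibrium estimate shows why that affinity difference matters. In the weak-binding regime, a ligand partitions among competing binders roughly in proportion to concentration divided by Kd. The sketch below is a simplification with invented numbers, not data from the paper.

# Gas6 partitioning among competing binders (weak-binding estimate).
# All concentrations and Kd values are invented for illustration.

def gas6_partition(conc_axl, kd_axl, conc_decoy, kd_decoy):
    w_axl = conc_axl / kd_axl          # occupancy weight, natural Axl
    w_decoy = conc_decoy / kd_decoy    # occupancy weight, decoy
    total = 1.0 + w_axl + w_decoy      # the 1.0 is free (unbound) Gas6
    return w_axl / total, w_decoy / total

# Equal concentrations; the decoy binds ~100-fold more tightly:
to_axl, to_decoy = gas6_partition(1.0, 1.0, 1.0, 0.01)
print(f'to Axl: {to_axl:.1%}   to decoy: {to_decoy:.1%}')
# -> roughly 1% vs 98%: the decoy soaks up nearly all the ligand.
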
    Directed evolution

The Stanford approach is grounded in the fact that all biological processes are driven by the interaction of proteins, the molecules that fit together in lock-and-key fashion to perform all the tasks required for living things to function.

In nature, proteins evolve over millions of years. But bioengineers have developed ways to accelerate the process of improving these tiny parts using a technology called directed evolution. This particular application was the subject of the doctoral thesis of Mihalis Kariolis, a bioengineering graduate student in Cochran’s lab.

    Using genetic manipulation, the Stanford team created millions of slightly different DNA sequences. Each DNA sequence coded for a different variant of Axl.

    The researchers then used high-throughput screening to evaluate over 10 million Axl variants. Their goal was to find the variant that bound most tightly to Gas6.
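
The screening loop itself has the shape of a simple optimization. Below is a toy directed-evolution loop illustrating the general technique, not the Stanford protocol; the scoring function, sequence length, and library size are invented.

import random

# Toy directed evolution: mutate, score with a stand-in 'binding'
# function, keep the best clone, repeat. Everything here is invented
# for illustration.

random.seed(42)
AA = 'ACDEFGHIKLMNPQRSTVWY'
TARGET = ''.join(random.choice(AA) for _ in range(30))  # hidden optimum

def affinity(variant):
    # Stand-in score: matches to the hidden optimal sequence.
    return sum(a == b for a, b in zip(variant, TARGET))

def mutate(seq, rate=0.05):
    return ''.join(random.choice(AA) if random.random() < rate else c
                   for c in seq)

best = ''.join(random.choice(AA) for _ in range(30))
for rnd in range(25):
    library = [mutate(best) for _ in range(1000)]   # the variant 'library'
    best = max(library, key=affinity)               # high-throughput screen
    print(f'round {rnd}: best affinity {affinity(best)}/30')

Real campaigns score variants by physical binding measurements rather than a known target sequence, but the mutate-screen-select loop is the same.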

    Kariolis made other tweaks to enable the bioengineered decoy to remain in the bloodstream longer and also to tighten its grip on Gas6, rendering the decoy interaction virtually irreversible.

    Yu Rebecca Miao, a postdoctoral scholar in Giaccia’s lab, designed the testing in animals and worked with Kariolis to administer the decoy Axl to the lab mice. They also did comparison tests to show that sopping up Gas6 resulted in far fewer secondary cancer nodules.

    Irimpan Mathews, a protein crystallography expert at SLAC National Accelerator Laboratory, joined the research effort to help the team better understand the binding mechanism between the Axl decoy and Gas6.

Protein crystallography captures the interaction of two proteins in a solid form, allowing researchers to take X-ray-like images of how the atoms in each protein bind together. These images showed molecular changes that allowed the bioengineered Axl decoy to bind Gas6 far more tightly than the natural Axl protein.

Next steps

    Years of work lie ahead to determine whether this protein therapy can be approved to treat cancer in humans. Bioprocess engineers must first scale up production of the Axl decoy to generate pure material for clinical tests. Clinical researchers must then perform additional animal tests in order to win approval for and to conduct human trials. These are expensive and time-consuming steps.

    But these early, hopeful results suggest that the Stanford approach could become a nontoxic way to fight metastatic cancer.

    Glenn Dranoff, a professor of medicine at Harvard Medical School and a leading researcher at the Dana-Farber Cancer Institute, reviewed an advance copy of the Stanford paper but was otherwise unconnected with the research. “It is a beautiful piece of biochemistry and has some nuances that make it particularly exciting,” Dranoff said, noting that tumors often have more than one way to ensure their survival and propagation.

    Axl has two protein cousins, Mer and Tyro3, that can also promote metastasis. Mer and Tyro3 are also activated by Gas6.

    “So one therapeutic decoy might potentially affect all three related proteins that are critical in cancer development and progression,” Dranoff said.

    Erinn Rankin, a postdoctoral fellow in the Giaccia lab, carried out proof-of-principle experiments that paved the way for this study.

    Other co-authors on the Nature Chemical Biology paper include Douglas Jones, a former doctoral student, and Shiven Kapur, a postdoctoral scholar, both of Cochran’s lab, who contributed to the protein engineering and structural characterization, respectively.

    Cochran said Stanford’s support for interdisciplinary research made this work possible.

    Stanford ChEM-H (Chemistry, Engineering & Medicine for Human Health) provided seed funds that allowed Cochran and Mathews to collaborate on protein structural studies.

    The Stanford Wallace H. Coulter Translational Research Grant Program, which supports collaborations between engineers and medical researchers, supported the efforts of Cochran and Giaccia to apply cutting-edge bioengineering techniques to this critical medical need.

    See the full article here.

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

  • richardmitnick 3:50 pm on September 22, 2014 Permalink | Reply
    Tags: , Biochemistry, , ,   

    From Caltech: “Variability Keeps The Body In Balance” 

    09/22/2014
    Jessica Stoller-Conrad

    Although the heart beats out a very familiar “lub-dub” pattern that speeds up or slows down as our activity increases or decreases, the pattern itself isn’t as regular as you might think. In fact, the amount of time between heartbeats can vary even at a “constant” heart rate—and that variability, doctors have found, is a good thing.

    Reduced heart rate variability (HRV) has been found to be predictive of a number of illnesses, such as congestive heart failure and inflammation. For athletes, a drop in HRV has also been linked to fatigue and overtraining. However, the underlying physiological mechanisms that control HRV—and exactly why this variation is important for good health—are still a bit of a mystery.
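    For a concrete handle on what “variability” means here, the sketch below computes two standard HRV statistics from a list of beat-to-beat (RR) intervals. The interval values are invented for illustration; real monitors derive them from an EKG.

    import math

    def sdnn(rr_ms):
        """Standard deviation of RR intervals (a common overall HRV measure)."""
        mean = sum(rr_ms) / len(rr_ms)
        return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

    def rmssd(rr_ms):
        """Root mean square of successive RR differences (short-term HRV)."""
        diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
        return math.sqrt(sum(d * d for d in diffs) / len(diffs))

    # Even at a "constant" ~60 bpm, the beat-to-beat intervals spread out.
    rr = [990, 1012, 1004, 985, 1018, 996, 1007, 988]
    print(f"SDNN  = {sdnn(rr):.1f} ms")
    print(f"RMSSD = {rmssd(rr):.1f} ms")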

    By combining heart rate data from real athletes with a branch of mathematics called control theory, a collaborative team of physicians and Caltech researchers from the Division of Engineering and Applied Science has now devised a way to better understand the relationship between HRV and health—a step that could soon inform better monitoring technologies for athletes and medical professionals.

    The work was published in the August 19 print issue of the Proceedings of the National Academy of Sciences.

    To run smoothly, complex systems, such as computer networks, cars, and even the human body, rely upon give-and-take connections and relationships among a large number of variables; if one variable must remain stable to maintain a healthy system, another variable must be able to flex to maintain that stability. Because it would be too difficult to map each individual variable, the mathematics and software tools used in control theory allow engineers to summarize the ups and downs in a system and pinpoint the source of a possible problem.

    Researchers who study control theory are increasingly discovering that these concepts can also be extremely useful in studies of the human body. In order for a body to work optimally, it must operate in an environment of stability called homeostasis. When the body experiences stress—for example, from exercise or extreme temperatures—it can maintain a stable blood pressure and constant body temperature in part by dialing the heart rate up or down. And HRV plays an important role in maintaining this balance, says study author John Doyle, the Jean-Lou Chameau Professor of Control and Dynamical Systems, Electrical Engineering, and Bioengineering.

    “A familiar related problem is in driving,” Doyle says. “To get to a destination despite varying weather and traffic conditions, any driver—even a robotic one—will change factors such as acceleration, braking, steering, and wipers. If these factors suddenly became frozen and unchangeable while the car was still moving, it would be a nearly certain predictor that a crash was imminent. Similarly, loss of heart rate variability predicts some kind of malfunction or ‘crash,’ often before there are any other indications,” he says.
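    Doyle’s analogy can be made concrete with a minimal control-theory sketch, assuming a single regulated variable (a stand-in for blood pressure) disturbed by a bout of exercise. A proportional controller “flexes” its output every step, the kind of moment-to-moment adjustment HRV reflects; freezing the controller, like freezing the steering, lets the disturbance push the system off its setpoint. The model and numbers are invented for illustration.

    def simulate(frozen, steps=50, gain=0.8):
        """Regulate `value` toward a setpoint with proportional feedback."""
        setpoint, value, control = 100.0, 100.0, 0.0
        for t in range(steps):
            disturbance = 5.0 if 10 <= t < 40 else 0.0  # a bout of exercise
            if not frozen:
                control = gain * (setpoint - value)     # flexible, beat-to-beat response
            value += control + disturbance - 0.1 * (value - setpoint)
        return value

    print("responsive controller:", round(simulate(frozen=False), 1))  # stays near 100
    print("frozen controller:    ", round(simulate(frozen=True), 1))   # drifts well above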

    To study how HRV helps maintain this version of “cruise control” in the human body, Doyle and his colleagues measured the heart rate, respiration rate, oxygen consumption, and carbon dioxide generation of five healthy young athletes as they completed experimental exercise routines on stationary bicycles.

    By combining the data from these experiments with standard models of the physiological control mechanisms in the human body, the researchers were able to determine the essential tradeoffs that are necessary for athletes to produce enough power to maintain an exercise workload while also maintaining the internal homeostasis of their vital signs.

    “For example, the heart, lungs, and circulation must deliver sufficient oxygenated blood to the muscles and other organs while not raising blood pressure so much as to damage the brain,” Doyle says. “This is done in concert with control of blood vessel dilation in the muscles and brain, and control of breathing. As the physical demands of the exercise change, the muscles must produce fluctuating power outputs, and the heart, blood vessels, and lungs must then respond to keep blood pressure and oxygenation within narrow ranges.”

    Once these trade-offs were defined, the researchers then used control theory to analyze the exercise data and found that a healthy heart must maintain certain patterns of variability during exercise to keep this complicated system in balance. Loss of this variability is a precursor of fatigue, the stress induced by exercise. Today, some HRV monitors in the clinic can let a doctor know when variability is high or low, but they provide little in the way of an actionable diagnosis.

    Because monitors in hospitals can already provide HRV levels and dozens of other signals and readings, the integration of such mathematical analyses of control theory into HRV monitors could, in the future, provide a way to link a drop in HRV to a more specific and treatable diagnosis. In fact, one of Doyle’s students has used an HRV application of control theory to better interpret traditional EKG signals.

    Control theory could also be incorporated into the HRV monitors used by athletes to prevent fatigue and injury from overtraining, he says.

    “Physicians who work in very data-intensive settings like the operating room or ICU are in urgent need of ways to rapidly and acutely interpret the data deluge,” says Marie Csete, MD (PhD, ’00), chief scientific officer at the Huntington Medical Research Institutes and a coauthor on the paper. “We hope this work is a first step in a larger research program that helps physicians make better use of data to care for patients.”

    See the full article here.

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
     
  • richardmitnick 11:13 am on September 19, 2014 Permalink | Reply
    Tags: , Biochemistry,   

    From PNNL: “As Light Dims and Food Sources Are Limited, Key Changes in Proteins Occur in Cyanobacteria” 



    September 2014
    Web Publishing Services

    Identification of redox-sensitive enzymes can enrich biofuel production research

    Results: Using a chemical biology approach, scientists at Pacific Northwest National Laboratory (PNNL) identified more than 300 proteins in a bacterium adept at converting carbon dioxide into other molecules of interest to energy researchers. These proteins take part in macromolecule synthesis and in carbon flux through central metabolic pathways, and may also be involved in cell signaling and response mechanisms.

    The team’s research also suggests that dynamic redox changes in response to specific nutrient limitations, including carbon and nitrogen limitations, contribute to the regulatory changes driven by a shift from light to dark.

    They also observed roughly 50 percent more probe-labeled proteins under nitrogen or carbon limitation than in nutrient-replete cultures, indicating a more reduced cellular environment under nutrient stress.
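    As a rough illustration of that comparison, the sketch below treats each condition’s LC-MS output as a set of probe-labeled protein identifiers and computes the relative increase, plus the proteins labeled only under limitation. The identifiers and counts are invented; only the ~50 percent figure comes from the study.

    # Hypothetical labeled-protein sets for two growth conditions.
    replete = {f"protein_{i}" for i in range(200)}      # nutrient-replete culture
    n_limited = {f"protein_{i}" for i in range(300)}    # nitrogen-limited culture

    increase = 100 * (len(n_limited) - len(replete)) / len(replete)
    print(f"labeled under N limitation: {len(n_limited)}")
    print(f"labeled when replete:       {len(replete)}")
    print(f"increase: {increase:.0f}%")                 # ~50%, as reported

    # Proteins labeled only under limitation are candidates for
    # condition-specific redox regulation.
    print(f"limitation-only candidates: {len(n_limited - replete)}")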

    “Together, our results contribute to a high-level understanding of post-translational mechanisms that regulate flux distributions and suggest potential metabolic engineering targets for redirecting carbon toward biofuel precursors,” said Dr. Charles Ansong, PNNL scientist and co-first author of the research publication that appears in Frontiers in Microbiology. “Our identification of redox-sensitive enzymes involved in these processes can potentially enrich the experimental design of research in biofuel production.”

    Overview of the chemical biology technique used by PNNL scientists to determine Synechococcus sp. PCC 7002 cells’ protein redox status in real time and identify redox-sensitive proteins as they occurred under induced nutrient perturbations including C and N limitation and transition from light to dark environments. Cell-permeable chemical probes derived from iodoacetamide (IAM-RP) and n-ethylmaleimide (Mal-RP) (top right) were applied to living cells. Once applied, the chemical probes irreversibly labeled proteins with reduced cysteines (bottom middle). The probe-labeled proteins were subsequently isolated for identification by high-resolution LC-MS (bottom right).

    Why It Matters: Plants and other organisms that use sunlight to convert inorganic materials into organic ones for chemical compound production and respiration, among other functions, are called phototrophs. Scientists are interested in them because their conversion properties could inform research on biofuel production. But a key step toward such research is understanding protein redox chemistry: a reaction that can alter protein structure and thereby regulate function. The lack of such understanding is a major gap in knowledge about photoautotrophic system regulation and signaling processes.

    To help fill that gap, the PNNL team analyzed redox-sensitive proteins in live Synechococcus sp. PCC 7002 cells during both light and dark periods to understand how cellular redox balance is disrupted during nutrient perturbation.

    Research Team: Charles Ansong, Natalie C. Sadler, Eric A. Hill, Michael P. Lewis, Erika M. Zink, Richard D. Smith, Alexander S. Beliaev, Allan E. Konopka, and Aaron T. Wright, all PNNL.

    See the full article here.

    Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

     
  • richardmitnick 9:43 am on September 14, 2014 Permalink | Reply
    Tags: , Biochemistry, , , ,   

    From MIT Technology Review: “Gene-Silencing Drugs Finally Show Promise” 

    September 14, 2014
    Kevin Bullis

    After more than a decade of disappointment, a startup leads the development of a powerful new class of drugs based on a Nobel-winning idea.

    The disease starts with a feeling of increased clumsiness. Spilling a cup of coffee. Stumbling on the stairs. Having accidents that are easy to dismiss—everyone trips now and then.

    But it inevitably gets worse. Known as familial amyloid polyneuropathy, or FAP, it can go misdiagnosed for years as patients lose the ability to walk or perform delicate tasks with their hands. Most patients die within 10 to 15 years of the first symptoms.

    There is no cure. The disease is caused by malformed proteins produced in the liver, so one treatment is a liver transplant. But few patients can get one—and it only slows the disease down.

    Now, after years of false starts and disappointment, it looks like an audacious idea for helping these patients finally could work.

    In 1998, researchers at the Carnegie Institution and the University of Massachusetts made a surprising discovery about how cells regulate which proteins they produce. They found that certain kinds of RNA, the molecule that carries DNA’s instructions for making proteins, can turn off specific genes. The finding, called RNA interference (RNAi), was exciting because it suggested a way to shut down the production of any protein in the body, including those connected with diseases that couldn’t be touched with ordinary drugs. It was so promising that its discoverers won the Nobel Prize just eight years later.

    Inspired by the discovery, another group of researchers—including the former thesis supervisor of one of the Nobel laureates—founded Alnylam in Cambridge, Massachusetts, in 2002. Their goal: fight diseases like FAP by using RNAi to eliminate bad proteins (see “The Prize of RNAi” and “Prescription RNA”). Never mind that no one knew how to make a drug that could trigger RNAi. In fact, that challenge would bedevil the researchers for the better part of a decade. Along the way, the company lost the support of major drug companies that had signed on in a first wave of enthusiasm. At one point the idea of RNAi therapy was on the verge of being discredited.

    But now Alnylam is testing a drug to treat FAP in advanced human trials. It’s the last hurdle before the company will seek regulatory approval to put the drug on the market. Although it’s too early to tell how well the drug will alleviate symptoms, it’s doing what the researchers hoped it would: it can decrease the production of the protein that causes FAP by more than 80 percent.

    This could be just the beginning for RNAi. Alnylam has more than 11 drugs, including ones for hemophilia, hepatitis B, and even high cholesterol, in its development pipeline, and has three in human trials—progress that led the pharmaceutical company Sanofi to make a $700 million investment in the company last winter. Last month, the pharmaceutical giant Roche, an early Alnylam supporter that had given up on RNAi, reversed its opinion of the technology as well, announcing a $450 million deal to acquire the RNAi startup Santaris. All told, there are about 15 RNAi-based drugs in clinical trials from several research groups and companies.

    “The world went from believing RNAi would change everything to thinking it wouldn’t work, to now thinking it will,” says Robert Langer, a professor at MIT, and one of Alnylam’s advisors.

    Delivering Drugs

    Alnylam started with high hopes. Its founders, among them the Nobel laureate and MIT biologist Philip Sharp, had solved one of the biggest challenges facing the idea of RNAi therapies. When RNAi was discovered, the process was triggered by introducing a type of RNA, called double stranded RNA, into cells. This worked well in worms and fruit flies. But the immune system in mammals reacted violently to the RNA, causing cells to die and making the approach useless except as a research tool. The Alnylam founders figured out that shorter strands, called siRNA, could slip into mammalian cells without triggering an immune reaction, suggesting a way around this problem.
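    The core base-pairing idea is simple enough to sketch in a few lines of Python. The sequences below are invented, and real siRNA design involves many more constraints (strand length, overhangs, avoiding immune triggers), but the sketch shows the sense in which a short guide strand “recognizes” the transcript it silences.

    COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

    def reverse_complement(rna):
        """Watson-Crick pairing, read in the opposite direction."""
        return "".join(COMPLEMENT[base] for base in reversed(rna))

    def is_silenced(mrna, guide):
        """A transcript is targeted if the guide pairs with a stretch of it,
        i.e., if the guide's reverse complement appears in the mRNA."""
        return reverse_complement(guide) in mrna

    mrna = "AUGGCUUACGGAUCCGUAAGCCAUUGA"              # hypothetical transcript
    guide = reverse_complement("GGAUCCGUAAGCCAUUGA")  # guide against one stretch
    print(is_silenced(mrna, guide))                   # True: marked for cleavage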

    Yet another huge problem remained. RNA interference depends upon delivering RNA to cells, tricking the cells into allowing it through the protective cell membrane, and then getting the cells to incorporate it into molecular machinery that regulates proteins. Scientists could do this in petri dishes but not in animals.

    Alnylam looked everywhere for solutions, scouring the scientific literature, collaborating with other companies, and considering novel approaches of its own. It focused on two options. One was encasing the RNA in bubble-like nanoparticles made of fat-like lipid molecules, the same materials that make up cell membranes; the thought was that cells would respond well to a familiar substance. The other was attaching a molecule to the RNA that cells like to ingest, tricking the cell into eating it.

    And both approaches worked, sort of. Researchers were able to block protein production in lab animals. But getting the delivery system right wasn’t easy. The early mechanisms were too toxic at the doses required to be used as drugs.

    As a result, delivering RNA through the bloodstream like a conventional drug seemed a far-off prospect. The company tried a shortcut of injecting chemically modified RNA directly into diseased tissue—for example, into the retina to treat eye diseases. That approach even got to clinical trials. But it was shelved because it didn’t perform as well as up-and-coming drugs from other companies.

    By 2010, some of the major drug companies that were working with and investing in Alnylam lost patience. Novartis decided not to extend a partnership with Alnylam; Roche gave up on RNAi altogether. Alnylam laid off about a quarter of its workers, and by mid-2011, its stock price had plunged by 80 percent from its peak.

    But Alnylam and partner companies, notably the Canadian startup Tekmira, were making steady progress in the lab. Researchers identified one part of the lipid nanoparticle that was keeping it from delivering its cargo of RNA to the right part of a cell. That was “the real eureka moment,” says Rachel Meyers, Alnylam’s vice president of research. Better nanoparticles improved the potency of a drug a hundredfold and its safety by about five times, clearing the way for clinical trials for FAP—a crucial event that kept the company alive.

    Even with that progress, Alnylam needed more. The nanoparticle delivery mechanism is costly to make and requires frequent visits to the hospital for hour-long IV infusions—something patients desperate to stay alive will put up with, but likely not millions of people with high cholesterol.

    So Alnylam turned to its second delivery approach—attaching molecules to RNA to trick cells into ingesting it. Researchers found just the right inducement—attaching a type of sugar molecule. This approach allows for the drug to be administered with a simple injection that patients could give themselves at home.

    In addition to being easier to administer, the new sugar-based drugs are potentially cheaper to make. The combination of low cost and ease-of-use is allowing Alnylam to go after more common diseases—not just the rare ones that patients will go to great lengths to treat. “Because we’ve made incredible improvements in the delivery strategy,” Meyers says, “we can now go after big diseases where we can treat millions of patients potentially.”

    The Next Frontier

    In a sixth-floor lab on the MIT campus, postdoctoral researcher James Dahlman takes down boxes from a high shelf. They contain hundreds of vials, each containing a unique type of nanoparticle that Dahlman synthesized painstakingly, one at a time. “It turns out we have a robot in the lab that can do that,” he says. “But I didn’t know about it at the time.”

    Dahlman doesn’t work for Alnylam; he had been searching for the next great delivery mechanism, one that could greatly expand the diseases that can be treated by RNAi. Some of the materials look like clear liquids. Some are waxy, some like salt crystals. He points to a gap in the rows of vials, where a vial is conspicuously missing. “That’s the one that worked. That’s the miracle material,” he says.

    For all of their benefits, the drug delivery mechanisms Alnylam uses have one flaw—they’re effective only for delivering drugs to liver cells. For a number of reasons, the liver is a relatively easy target—that’s where all kinds of nanoparticles tend to end up. Alnylam sees the potential for billions of dollars in revenue from liver-related diseases. Yet most diseases involve other tissues in the body.

    Dahlman and his colleagues at MIT are some of the leaders in the next generation of RNAi delivery—targeting delivery to places throughout the body. Last month, in two separate articles, they published the results of studies showing that Dahlman’s new nanoparticles are a powerful way to deliver RNAi to blood vessel cells, which are associated with a wide variety of diseases. The studies showed that the method could be used to reduce tumor growth in lung cancer, for example.

    Treating cancer is one area where RNAi’s particular advantages are expected to shine. Conventional chemotherapy affects more than just the target cancer cells—it also hurts healthy tissue, which is why it makes people feel miserable. But RNAi can be extremely precise, potentially shutting down only proteins found in cancer cells. And Dahlman’s latest delivery system makes it possible to simultaneously target up to 10 proteins at once, which could make cancer treatments far more effective. Lab work like this is far from fruition, but if it maintains its momentum, the drugs currently in clinical trials could represent just a small portion of the benefits of the discovery of RNAi.

    See the full article here.

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.
