Tagged: Biology

  • richardmitnick 9:41 pm on January 15, 2017
    Tags: Biology, ELSI - Earth-Life Science Institute, EON - ELSI Origins Network, LUCA - the Last Universal Common Ancestor of Life on Earth, Messy chemistry, Ribosomes

    From Many Worlds: “Messy Chemistry, Evolving Rocks, and the Origin of Life” 

    NASA NExSS bloc


    Many Worlds

    Many Worlds icon

    Marc Kaufman

    Ribosomes are life’s oldest and most universal assembly of molecules. Today’s ribosome converts genetic information (RNA) into proteins that carry out various functions in an organism. A growing number of scientists are exploring how the earliest components of life, such as the ribosome, came to be. They’re making surprising progress, but the going remains tough. No image credit.

    Noted synthetic life researcher Steven Benner of the Foundation for Applied Molecular Evolution is fond of pointing out that gooey tars are the end product of too many experiments in his field. In his widely held view, the tars, made out of chemicals known to be important in the origin of life, are nonetheless a dead end to be avoided when trying to work out how life began.

    But in the changing world of origins of life research, others are asking whether those messy tars might not be a breeding ground for the origin of life, rather than an obstacle to it.

    One of those is chemist and astrobiologist Irena Mamajanov of the Earth-Life Science Institute (ELSI) in Tokyo. As she recently explained during an institute symposium, scientists know that tar-like substances were present on early Earth, and she and her colleagues are now aggressively studying their potential role in the prebiotic chemical transformations that ultimately allowed life to emerge from non-life.

    “We call what we do messy chemistry, and we think it can help shed light on some important processes that make life possible.”

    Irena Mamajanov of the Earth-Life Science Institute (ELSI) in Tokyo was the science lead for a just-completed symposium on emerging approaches to the origin of life question.

    It stands to reason that the gunky tar played a role, she said, because tars allow some essential processes to occur: they can concentrate compounds, encapsulate them, and provide a kind of primitive (messy) scaffolding that could eventually evolve into the essential backbones of a living entity.

    “Scientists in the field have tended to think of the origin of life as a process going from simple to more complex, but we think it may have gone from very complex — messy — to more structured.”

    Mamajanov is part of an unusual group gathered at ELSI, a relatively new site for origin of life study on the campus of the Tokyo Institute of Technology, with a mandate to be interdisciplinary and to think big and outside the box.

    ELSI just completed its fifth annual symposium, and it brought together researchers from a wide range of fields to share their research on what might have led to the emergence of life. And being so interdisciplinary, the ELSI gathering was anything but straight and narrow itself.

    There was talk of the “evolution” of prebiotic compounds; of how the same universal 30 to 50 genes can be found in all living things from bacteria to us; of the possibility that the genomes of currently alive microbes surviving in extreme environments provide a window into the very earliest life; and even that evolutionary biology suggests that life on other Earth-like planets may well have evolved to form rather familiar creatures.

    Except for that last subject, the focus was very much on ways to identify the last universal common ancestor (LUCA), on what about Earth made life possible, and on what about life changed Earth.

    Scientific interest in the origin of life on Earth (and potentially elsewhere) tends to wax and wane, in large part because the problem is so endlessly complex. It’s one of the biggest questions in science, but some say that it will never be fully answered.

    But there has been a relatively recent upsurge in the attention paid to, and the funding available for, origin of life research.

    The Japanese government gave $100 million to start and operate ELSI, the Simons Foundation has donated another $100 million for an origins of life institute at Harvard, the Templeton Foundation has made numerous origin of life grants and, as it has for years, the NASA Astrobiology Institute has funded researchers. Some of the findings and theories are most intriguing and represent a break of sorts from the past.

    For some decades now, the origins of life field has been pretty sharply divided. One group holds that life began when metabolism (a small set of reactions able to harness and transform energy) arose spontaneously; others maintain that it was the ability of a chemical system to replicate itself (the RNA world) that was the turning point. Metabolism First versus RNA First, plus some lower-profile theories.

    In keeping with ELSI’s goal of bringing scientists and disciplines together while avoiding as much origin-of-life dogma as possible, Mamajanov sees the “messy chemistry” approach as a third, less confrontational way. It’s not a model for how life began per se, but one of many new approaches designed to shed light on, and collect data about, those myriad processes.

    “This division in the field is hurting science because people are not talking to each other,” she said. “By design we’re not in one camp or another.”

    Loren Williams of Georgia Tech

    Another speaker who exemplified that approach was Loren Williams of Georgia Tech, a biochemist whose lab studies the genetic makeup of the ribosome (a complex molecule made of RNA molecules and proteins that forms a factory for protein synthesis in cells), home to those universal 30 to 50 genes. He was principal investigator for the NASA Astrobiology Institute’s Georgia Tech Center for Ribosome Adaptation and Evolution from 2009 to 2014.

    His goal is to collect hard data on these most common genes, with the inference that they are the oldest and closest to LUCA.

    “What becomes quickly clear is that the models of the origin of life don’t fit the data,” he said. “What the RNA model predicts, for instance, is totally disconnected from this data. So what happens with this disconnect? The modelers throw away the data. They say it doesn’t relate. Instead, I ignore the models.”

    A primary conclusion of his work is that early molecules — rather like many symbiotic relationships in nature today — need each other to survive. He gave the current day example of the fig wasp, which spends its larval stage in a fig, then serves as a pollinator for the tree, and then survives on the fruit that appears.

    He sees a parallel “mutualism” in the ribosomes he studies. “RNA is made by protein; all protein is made by RNA,” he said. It’s such a powerful concept for him that he wonders if “mutualism” doesn’t define a living system from the non-living.

    These stromatolites, wavelike patterns created by bacteria embedded in sediment, are 3.7 billion years old and may represent the oldest life on the planet. Photo by Allen Nutman

    Stromatolites, sedimentary structures produced by microorganisms, today at Shark Bay, Australia. Remarkably, the lifeform has survived through billions of years of radical transformation on Earth, catastrophes and ever-changing ecologies.

    A consistent theme of the conference was that life emerged from the geochemistry present in early Earth. It’s an unavoidable truth that leads down some intriguing pathways.

    As planetary scientist Marc Hirschmann of the University of Minnesota reported at the gathering, the Earth actually has far less carbon, oxygen, nitrogen and other elements essential for life than the sun, than most asteroids, than even interstellar space.

    Since Earth was initially formed with the same galactic chemistry as those other bodies and arenas, Hirschmann said, the story of how the Earth was formed is one of losing substantial amounts of those elements rather than, as is commonly thought, by gaining them.

    The logic of this dynamic raises the question of how much of those elements a planet has to lose, or can lose, and still be considered habitable. And that in turn requires examination of how the Earth lost so much of its primordial inheritance — most likely from the impact that formed the moon, the resulting destruction of the early Earth atmosphere, and the later movement of the elements into the depths of the planet via plate tectonics. It’s all now considered part of the origins story.

    And as argued by Charley Lineweaver, a cosmologist with the Planetary Science Institute and the Australian National University, it has become increasingly difficult to contend that life on other planets is anything but abundant, especially now that we know that virtually all stars have planets orbiting them and that many billions of those planets will be the size of Earth.

    Other planets will have similar geochemical regimes and some will have undergone events that make their distribution of elements favorable for life. And as described by Eric Smith, an expert in complex systems at ELSI and the Santa Fe Institute, the logic of physics says that if life can emerge then it will.

    Any particular planetary life may not evolve beyond single cell lifeforms for a variety of reasons, but it will have emerged. The concept of the “origin of life” has taken on some very new meanings.

    ELSI was created in 2012 after its founders won a World Premier International Research Center Initiative grant from the Japanese government. The WPI grant is awarded to institutes with a research vision to become globally competitive centers that can attract the best scientists from around the world to come work in Japan.

    The nature and aims of ELSI and its companion group the ELSI Origins Network (EON) strike me as part of the story. They break many molds.

    The creators of ELSI, both Japanese and from elsewhere, say that the institute is highly unusual for its welcome of non-Japanese faculty and students. They stay for years or months or even weeks as visitors.

    While ELSI is a government-funded institute with buildings, professors, researchers and a mission (to greatly enhance origin of life study in Japan), EON is a far-flung collection of top international origins scientists of many disciplines. Their home bases are places like Princeton’s Institute for Advanced Study, Harvard, Columbia, Dartmouth, Caltech and the University of Minnesota, among others in the U.S., Europe and Asia. NASA officials also play a supporting, but not financial, role.

    ELSI postdocs and other students live in Tokyo, while the EON fellows spend six months at ELSI and six months at home institutions. All of this is in the pursuit of scientific collaboration, exposing young scientists in one field related to origins to those in another, and generally adding to global knowledge about the sprawling subject of origins of life.

    Jim Cleaves, of ELSI and the Institute for Advanced Study, is the director of EON and an ambassador of sorts for its unusual mission. He, and others at the ELSI symposium, are eager to share their science and want young scientists interested in the origins of life to know there are many opportunities with ELSI and EON for research, study and visitorships on the Tokyo campus.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Many Worlds

    There are many worlds out there waiting to fire your imagination.

    Marc Kaufman is an experienced journalist, having spent three decades at The Washington Post and The Philadelphia Inquirer, and is the author of two books on searching for life and planetary habitability. While the “Many Worlds” column is supported by the Lunar Planetary Institute/USRA and informed by NASA’s NExSS initiative, any opinions expressed are the author’s alone.

    This site is for everyone interested in the burgeoning field of exoplanet detection and research, from the general public to scientists in the field. It will present columns, news stories and in-depth features, as well as the work of guest writers.

    About NExSS

    The Nexus for Exoplanet System Science (NExSS) is a NASA research coordination network dedicated to the study of planetary habitability. The goals of NExSS are to investigate the diversity of exoplanets and to learn how their history, geology, and climate interact to create the conditions for life. NExSS investigators also strive to put planets into an architectural context — as solar systems built over the eons through dynamical processes and sculpted by stars. Based on our understanding of our own solar system and habitable planet Earth, researchers in the network aim to identify where habitable niches are most likely to occur and which planets are most likely to be habitable. Leveraging current NASA investments in research and missions, NExSS will accelerate the discovery and characterization of other potentially life-bearing worlds in the galaxy, using a systems science approach.
    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories (Hubble, Chandra, Spitzer) and associated programs. NASA shares data with various national and international organizations, such as data from JAXA’s Greenhouse Gases Observing Satellite.

  • richardmitnick 9:07 am on January 9, 2017
    Tags: Biology, It's official: A brand-new human organ has been classified, Mesentery

    From Science Alert: “It’s official: A brand-new human organ has been classified” 


    Science Alert

    3 JAN 2017

    J Calvin Coffey/D Peter O’Leary/Henry Vandyke Carter

    Your body now has an extra organ.

    Researchers have classified a brand-new organ inside our bodies, one that’s been hiding in plain sight in our digestive system this whole time.

    Although we now know about the structure of this new organ, its function is still poorly understood, and studying it could be the key to better understanding and treatment of abdominal and digestive disease.

    Known as the mesentery, the new organ is found in our digestive systems, and was long thought to be made up of fragmented, separate structures. But recent research has shown that it’s actually one, continuous organ.

    The evidence for the organ’s reclassification is now published in The Lancet Gastroenterology & Hepatology.

    “In the paper, which has been peer reviewed and assessed, we are now saying we have an organ in the body which hasn’t been acknowledged as such to date,” said J Calvin Coffey, a researcher from the University Hospital Limerick in Ireland, who first discovered that the mesentery was an organ.

    “The anatomic description that had been laid down over 100 years of anatomy was incorrect. This organ is far from fragmented and complex. It is simply one continuous structure.”

    Thanks to the new research, as of last year, medical students started being taught that the mesentery is a distinct organ.

    The world’s best-known series of medical textbooks, Gray’s Anatomy, has even been updated to include the new definition.

    So what is the mesentery? It’s a double fold of peritoneum – the lining of the abdominal cavity – that attaches our intestine to the wall of our abdomen, and keeps everything locked in place.

    One of the earliest descriptions of the mesentery was made by Leonardo da Vinci, and for centuries it was generally ignored as a type of insignificant attachment. Over the past century, doctors who studied the mesentery assumed it was a fragmented structure made of separate sections, which made it pretty unimportant.

    But in 2012, Coffey and his colleagues showed through detailed microscopic examinations that the mesentery is actually a continuous structure.

    Over the past four years, they’ve gathered further evidence that the mesentery should actually be classified as its own distinct organ, and the latest paper makes it official.

    You can see the new organ illustrated below:

    No image caption. No image credit.

    And while that doesn’t change the structure that’s been inside our bodies all along, with the reclassification comes a whole new field of medical science that could improve our health outcomes.

    “When we approach it like every other organ… we can categorise abdominal disease in terms of this organ,” said Coffey.

    That means that medical students and researchers will now investigate what role – if any – the mesentery might play in abdominal diseases, and that understanding will hopefully lead to better outcomes for patients.

    “Now we have established anatomy and the structure. The next step is the function. If you understand the function you can identify abnormal function, and then you have disease. Put them all together and you have the field of mesenteric science … the basis for a whole new area of science,” said Coffey.

    “This is relevant universally as it affects all of us.”

    It just goes to show that no matter how advanced science becomes, there’s always more to learn and discover, even within our own bodies.

    See the full article here.


  • richardmitnick 1:20 pm on January 6, 2017
    Tags: Biology

    From Stanford: “Stanford study shows that tissue in the brain, rather than being lost, grows across childhood and may underlie better face recognition” 

    Stanford University Name
    Stanford University

    January 5, 2017
    Taylor Kubota
    (650) 724-7707

    A central tenet in neuroscience has been that the amount of brain tissue goes in one direction throughout our lives – from too much to just enough. A new study finds that in some cases the brain can add tissue as well.

    Stanford Professor Kalanit Grill-Spector, left, research associate Kevin Weiner and graduate student Jesse Gomez study growth in brain tissue enabling face recognition. (Image credit: Brianna Jeska)

    People are born with brains riddled with excess neural connections. Those are slowly pruned back until early childhood when, scientists thought, the brain’s structure becomes relatively stable.

    Now a pair of studies, published in the Jan. 6, 2017, issue of Science and Nov. 30, 2016, in Cerebral Cortex, suggest this process is more complicated than previously thought. For the first time, the group found microscopic tissue growth in the brain continues in regions that also show changes in function.

    The work overturns a central thought in neuroscience, which is that the amount of brain tissue goes in one direction throughout our lives – from too much to just enough. The group made this finding by looking at the brains of an often-overlooked participant pool: children.

    “I would say it’s only in the last 10 years that psychologists started looking at children’s brains,” said Kalanit Grill-Spector, a professor of psychology at Stanford and senior author of both papers. “The issue is, kids are not miniature adults and their brains show that. Our lab studies children because there’s still a lot of very basic knowledge to be learned about the developing brain in that age range.”

    Grill-Spector and her team examined a region of the brain that distinguishes faces from other objects. In Cerebral Cortex, they demonstrate that brain regions that recognize faces have a unique cellular make-up. In Science, they find that the microscopic structures within the region change from childhood into adulthood over a timescale that mirrors improvements in people’s ability to recognize faces.

    “We actually saw that tissue is proliferating,” said Jesse Gomez, graduate student in the Grill-Spector lab and lead author of the Science paper. “Many people assume a pessimistic view of brain tissue: that tissue is lost slowly as you get older. We saw the opposite – that whatever is left after pruning in infancy can be used to grow.”

    Microscopic brain changes

    The group studied regions of the brain that recognize faces and places, respectively, because knowing who you are looking at and where you are is important for everyday function. In adults, these parts of the brain are close neighbors, but with some visible structural differences.

    “If you could walk across an adult brain and you were to look down at the cells, it would be like walking through different neighborhoods,” Gomez said. “The cells look different. They’re organized differently.”

    Curious about the deeper cellular structures not visible by magnetic resonance imaging (MRI), the Stanford group collaborated with colleagues in the Institute of Neuroscience and Medicine, Research Centre Jülich, in Germany, who obtained thin tissue slices of post-mortem brains. Over the span of a year, this international collaboration figured out how to match brain regions identified with functional MRI in living brains with the corresponding brain slices. This allowed them to extract the microscopic cellular structure of the areas they scanned with functional MRI, which is not yet possible to do in living subjects. The microscopic images showed visible differences in the cellular structure between face and place regions.

    “There’s been this pipe dream in the field that we will one day be able to measure cellular architecture in living humans’ brains and this shows that we’re making progress,” said Kevin Weiner, a Stanford social science research associate, co-author of the Science paper and co-lead author of the Cerebral Cortex paper with Michael Barnett, a former research assistant in the lab.

    Neighborhoods of the brain

    This work established that the two parts of the brain look different in adults, but Grill-Spector has been curious about these areas in brains of children, particularly because the skills associated with the face region improve through adolescence. To further investigate how development of these skills relates to brain development, the researchers used a new type of imaging technique.

    They scanned 22 children (ages 5 to 12) and 25 adults (ages 22 to 28) using two types of MRI, one that indirectly measures brain activity (functional MRI) and one that measures the proportion of tissue to water in the brain (quantitative MRI). This scan has been used to show changes in the fatty insulation surrounding the long neuronal wires connecting brain regions over a person’s lifetime, but this study is the first to use this method to directly assess changes in the cells’ bodies.

    What they found, published in Science, is that, in addition to seeing a difference in brain activity in these two regions, the quantitative MRI showed that a certain tissue in the face region grows with development. Ultimately, this development contributes to the tissue differences between face and place regions in adults. What’s more, tissue properties were linked with functional changes in both brain activity and face recognition ability, which they evaluated separately. There is no indication yet of which change causes the other or if they happen in tandem.

    A test bed

    Being able to identify familiar faces and places, while clearly an important skillset, may seem like an odd choice for study. The reason these regions are worth some special attention, said Grill-Spector, is that we can identify them in each person’s brain, even a 5-year-old child’s, which means research on these regions can include large pools of participants and produce results that are easy to compare across studies. This research also has health implications, as approximately 2 percent of the adult population is poor at recognizing faces, a disorder sometimes referred to as face blindness.

    What’s more, the fusiform gyrus, an anatomical structure in the brain that contains face-processing regions, is only found in humans and great apes (gorillas, chimps, bonobos and orangutans).

    “If you had told me five or 10 years ago that we’d be able to actually measure tissue growth in vivo, I wouldn’t have believed it,” Grill-Spector said. “It shows there are actual changes to the tissue that are happening throughout your development. I think this is fantastic.”

    Additional Stanford co-authors on the Science paper are Michael Barnett, Vaidehi Natu and Aviv Mezer (now at Hebrew University in Jerusalem); other co-authors are Katrin Amunts, Karl Zilles and Nicola Palomero-Gallagher of Institute of Neuroscience and Medicine, Research Centre Jülich, Jülich, Germany.

    The Science research was funded by the National Science Foundation, the National Eye Institute, European Union Seventh Framework Programme and a NARSAD Young Investigator Grant.

    Additional co-authors on the Cerebral Cortex paper include Anthony Stigliani of Stanford University; Katrin Amunts, Karl Zilles, Simon Lorenz and Julian Caspers of the Institute of Neuroscience and Medicine, Research Centre Jülich, in Jülich, Germany; and Bruce Fischl of Harvard Medical School and the Massachusetts Institute of Technology.

    This research was funded by the National Eye Institute, the European Union Seventh Framework Programme, the National Institute for Biomedical Imaging and Bioengineering, the National Institute on Aging, and the National Institute of Neurological Disorders and Stroke. It was also made possible by resources provided by Shared Instrumentation Grants. Additional support was provided by the NIH Blueprint for Neuroscience Research, part of the multi-institutional Human Connectome Project.

    See the full article here.


    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

  • richardmitnick 9:55 am on January 3, 2017
    Tags: Biology, Regenerate an entirely new animal, Replicating Life in Code, Schmidtea mediterranea, Thomas Hunt Morgan

    From NOVA: “Replicating Life in Code” 



    06 Jul 2016
    Cynthia Graber

    A few years ago, Michael Levin faced a conundrum. He and his colleagues at the Tufts Center for Regenerative and Developmental Biology just outside Boston wanted to find a model that would explain why the flatworm—a model organism used throughout biology—looks the way it does. At a fundamental level, they wanted to be able to describe the cascade of events that leads to the growth of a head in one place and a tail in the other.

    In fact, it was almost the same problem that Thomas Hunt Morgan, a Nobel Prize-winning evolutionary biologist and one of the founders of the modern study of genetics, faced more than a century ago. Back then, he was busying himself making careful cuts into flatworms. He sliced them lengthwise. He dissected them in half. From each segment, the worm grew a complete body. Eventually, Morgan carved off 1/279th of the worm, a bit selected from its mid-section, and demonstrated that it could regenerate an entirely new animal. He was trying, unsuccessfully, to understand why and how certain body parts developed where they did: Why did a head appear at one end of the worm after a cut?

    Thomas Hunt Morgan, years before his famous flatworm research was published, reading papers at Columbia University.

    Over the next 100 years, using new tools and insights, researchers replicated Morgan’s efforts in increasing detail: Make a cut here, a fully formed trunk and head will reappear. Tinker with this particular chemical or knock down this gene, and create a worm with a tail at either end. There are more than a thousand such papers. Yet, still, nobody has been able to fully explain why a head forms where it does.

    Levin and his colleagues at the center, where he’s the director, have been tackling this problem for years. In that time, they have helped explain some basic questions about development and regeneration. By tweaking certain signals within flatworms, for example, he has been able to grow a worm with four heads or one with no head but a tail on either end. He and his team have even grown an eye on a tadpole’s belly. But those experiments didn’t help them understand how it all fit together.

    “We have a massive literature of results saying ‘I did this to the worm and this happened,’ ” Levin says. “And we’re increasingly drowning in ever higher resolution genetic data sets. And yet, since Morgan cut his first worm, we still don’t have any model that explains more than a couple of different cuts.” And so, “after 120 years of really smart people going at it, I started to wonder, maybe this is beyond the ability of us to come up with off the top of our heads, to create a model that fits all the data.”

    Developmental biologists use the flatworm Schmidtea mediterranea to study regeneration.

    Computers, however, now offer enough power that Levin thought it might be possible to create such a model for a flatworm, one that would detail, in silico, the cascade of events that lead to the growth of a head at one end and a tail in the other. Still, even with a supercomputer’s worth of computational power, it wouldn’t be easy.

    Yet Levin’s ambitions didn’t stop there. Full, in-depth models elude nearly every aspect of biology, and Levin hoped to create a tool that could be employed beyond flatworm development, one that could eventually model the vastly more complex world of human diseases. If he did it right, it might even help develop cures for those diseases. All he needed first was the right computer program.

    Enter Evolution

    It’s fitting that Levin’s computer models of biology are inspired by one of the fundamental tenets of biology itself: evolution. He builds his models around the idea that computer algorithms, like organisms, can mate and mutate, with the fittest versions selected to mate again. These so-called genetic algorithms were first proposed by computer scientist John Holland in the 1960s. But even earlier, in the 1940s and ’50s, “people were already thinking about using inspiration from biology to build life-like computer programs,” says Melanie Mitchell, a professor of computer science at Portland State University and author of the book An Introduction to Genetic Algorithms. For instance, John von Neumann, one of the earliest computer scientists, envisioned in the 1940s computers that could replicate themselves, with code serving as what we now call DNA. Mitchell says that Holland saw genetic algorithms as a mathematical tool that could help explain how adaptation occurs in evolution.

    What Holland saw as theory, others took as a practical tool. For instance, one of his students, David Goldberg, used these new genetic algorithms to optimize plans for gas pipelines by mating different models until the algorithm came up with the best design. But while early computer scientists were limited by memory and speed, today’s more powerful computers can run increasingly complex models to process millions of possible combinations and save the best chunks of code before passing them on to the next model, just as in natural evolution. Mitchell says these models have applications in engineering, big data, drug design, banking, and ever more realistic computer graphics and animations.

    Unlike biological evolution, where individuals meet, mate, and pass on useful traits that best fit the environment, computer-based evolution starts with a goal or a set of rules. Then the computer generates millions, even billions, of models to try to meet those goals. The ones that solve part of the problem or meet some aspect of the goal have a higher likelihood of passing on that relevant code to the next generation.
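The loop just described (set a goal, generate candidate models, score them, breed the best) can be sketched in a few lines. This is a generic toy genetic algorithm with an invented bit-string target; the population size, mutation rate, and fitness function are illustrative choices, not values from any project described here:

```python
import random

# Toy genetic algorithm in the Holland mold: a population of bit-string
# "genomes" evolves toward a target pattern.
TARGET = [1] * 20

def fitness(genome):
    # Score = number of positions that match the target pattern.
    return sum(g == t for g, t in zip(genome, TARGET))

def crossover(a, b):
    # Single-point crossover: a prefix from one parent, a suffix from the other.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.01):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=100, generations=200):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break
        survivors = pop[: pop_size // 2]  # selection: the fittest half breeds
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

print(fitness(evolve()))
```

In real applications the bit string is replaced by a domain-specific genome (pipeline parameters, equations, regulatory-network topologies) and the fitness function by a scoring routine tied to real data.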

    Robotics researchers have employed this approach as well. Josh Bongard at the University of Vermont has used it to evolve simulated robots that learn to walk. Columbia University’s Hod Lipson used this approach to simulate machines that learned how to crawl on a table. He spun off a company called Nutonian that allows scientists to input their data; the program then evolves equations until one explains the data. That equation could allow researchers to optimize designs, model what might happen in the future, or show how changes in one part of a system might affect the final result. “It can be used anywhere,” Lipson says, “from finance to rainfall in the Amazon.”

    “This approach—reverse engineering—it’s like a Russian spy movie,” says Johannes Jaeger, a developmental geneticist and scientific director of the Konrad Lorenz Institute for Evolution and Cognition Research in Austria. “You have some kind of gadget, like in the Cold War, some kind of Russian technology. You don’t know what it does. Here, you have an organism, you don’t know how it works, and you’re trying to infer the networks that make the pattern that you see in animals.” Jaeger began working in this field more than a decade ago and has used such algorithms to model the genetic network that created a segmented body pattern in fruit flies.

    But none of the models are as complex as the development of an entire creature’s shape.

    Billions of Experiments

    When Levin first proposed his modeling project a few years ago, his colleagues in biology found the proposal absurd. “Pretty much nobody I talked to thought it was going to work,” Levin says. His critics had two overarching reactions. Some thought it would be impossible to find any model that worked. As Levin recalls the objection: “ ‘You’re telling me this program is going to take random models and by recombining random changes to random models you’re going to find the right model? That’s ridiculously impossible.’ ” Levin disregarded that criticism. That was how evolution had worked, he thought, and computers finally seemed powerful enough to try.

    The second criticism he heard was that they’d find many models that explained the data, maybe 10, maybe 1,000. How would they know which one was the correct one? “In theory,” Levin says, “that was a possible outcome. But we didn’t have any. It’s actually very difficult to find a model that does what it needs to. I wasn’t worried. If we found more than one—fabulous.”

    Levin hired Daniel Lobo, a computer science PhD who had worked with Hod Lipson and whose research was also inspired by Johannes Jaeger, as a post-doc to lead the flatworm modeling project. (Lobo now heads his own lab at the University of Maryland, Baltimore County.) Lobo had the right combination of computer expertise and interest in biology, and he’d written a paper about applying genetic algorithms to the evolution of shape that caught Levin’s attention. He’d used such algorithms to automatically design structures optimized for unmanned landings, such as those of the rovers sent to Mars, and Levin was impressed with the way Lobo combined a deep knowledge of the field with an interest in making it practical.

    The first challenge was to take the more than 1,000 experiments that had been done on flatworm shape and create one language to describe their results. It was no small challenge. Natural language, as opposed to computer code, is ambiguous, even in scientific papers. At the same time, the team had to decide what to encode. They didn’t need the exact dimensions of a flatworm head, for instance, but they did need to include relative scale. Eventually, Lobo created a standardized language and a standardized mathematical approach that represents the shapes of the worm’s regions, its organs, and how they’re interconnected. The end result was a searchable database of results, which is now available to all flatworm biologists.

    Next, Lobo designed the simulation itself, a virtual worm on which candidate models would test their results. The computer compares the results of the simulated experiments to the real-world results expected from the database. The models receive scores based on how well they predict the outcomes seen in flesh-and-blood flatworms. Those with high scores reproduce; those with bad scores are discarded.
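The scoring step can be sketched as follows. The experiment names and the candidate “model” here are invented stand-ins; a real candidate is a simulated regulatory network run through the virtual worm, not a lookup table:

```python
# Hypothetical fitness scoring: run a candidate model through every
# experiment in the database and score it by the fraction of real-world
# outcomes it reproduces.
EXPERIMENTS = [
    # (manipulation, observed regenerated phenotype)
    ("amputate_head", "head_regrows"),
    ("amputate_tail", "tail_regrows"),
    ("bisect_trunk",  "two_whole_worms"),
]

def score(model):
    """Fitness = fraction of database experiments the model predicts."""
    hits = sum(model(cut) == outcome for cut, outcome in EXPERIMENTS)
    return hits / len(EXPERIMENTS)

# A trivial candidate standing in for a simulated regulatory network:
candidate = {"amputate_head": "head_regrows",
             "amputate_tail": "tail_regrows",
             "bisect_trunk":  "one_worm"}.get

print(score(candidate))  # 2 of 3 outcomes reproduced
```

High-scoring candidates go on to reproduce in the next generation; low scorers are discarded, exactly the selection step of the genetic algorithm.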

    Simulated flatworms can help anticipate the results of experiments before they happen.

    After four years of work, they’d come up with a common language for scientific papers, distilled the most crucial aspects of the worm to put in a model, developed a simulator, and created the algorithm to find the shape model. Levin felt fairly sure that the project would work—but they needed one more piece of equipment. Common lab computers, no matter how powerful, can’t yet quickly process the massive computations needed to evolve an entire biological model, in effect replicating millions of years of evolution. They needed a supercomputer. So the team rented time on Stampede, the University of Texas supercomputer that can perform up to 10 quadrillion mathematical operations per second.

    TACC bloc
    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    Levin says the first models performed terribly; they didn’t get anything right. But by the 100th generation, some of the models started to predict some of the correct results. By the 1,000th, the models were increasingly matching the real-world experimental results.

    In the end, it took 6 billion simulated experiments, 26,727 generations of models, and about 42 hours of processing on Stampede before the algorithm converged on a single result. This was what they’d been waiting for: one model that could explain the 1,000 existing experiments that generate the head-trunk-tail pattern in a flatworm.

    To test the model, the team introduced data from two papers on shape formation that had purposefully not been included in the dataset when developing the models. The model accurately explained the results of both papers.

    So Levin and Lobo took it one step further. The model predicted the outcomes of experiments that had never been done before. The team ran those experiments in the real world, and the predictions held. The new results, published in May 2016, described the activity of a previously unknown gene that plays a role in shape formation.

    Intriguingly, the model suggests the existence of a second node that hasn’t yet been explained by current scientific knowledge; it could be a protein, it could be a particular chemical. “The computer knows there’s a product that should be there, that seems to be important. In a way, it’s predicting a product that we don’t yet know,” Lobo says.

    From Silicon Models to Hard Data

    With the shape investigation a success, Levin and Lobo turned their attention to modeling disease. They started with melanoma—skin cancer—and did so by focusing on pigmentation cells in tadpoles.

    They conducted experiments in which tadpoles were exposed to particular chemicals during their development. The tadpoles had no obvious chromosomal damage or genetic trigger, but for some of them, the chemicals would spark a change in all of their pigmentation cells. Those cells then turned metastatic and invaded tissues throughout their bodies. But for other tadpoles under exactly the same conditions, nothing changed. Could the algorithm determine why?

    Graduate student Maria Lobikin conducted dozens of experiments, knocking out genes or applying drugs and determining what percentage of the tadpoles became hyper-pigmented. Then she combined those results with other research published in the last decade. The team followed the same approach, creating a standardized language to describe the experiments and using a supercomputer to evolve a model to understand how, in some tadpoles, under certain biological circumstances, the cells flipped to a hyper-pigmented, cancerous state.

    The computer-generated model came quite close to matching the existing experiments, predicting the results of every paper but one. In that one case, the algorithm may even have caught inaccurate data; the finding was published in the journal Science Signaling in October 2015. “I said, you know what, go back and redo this experiment just to make sure, and sure enough, the data were slightly off,” Levin says. “It’s almost like a verification step. If it’s having trouble matching the results of one experiment, maybe the problem’s not the model, maybe the problem’s your data.”

    More recently, they asked the model a question. Is there any way to create a scenario where only some of the pigment cells become cancerous, where it’s not an all-or-none response? The computer generated an unusual three-step combination of drugs. When the team tried the experiment suggested by the computer model, they were able to create the first partially-pigmented animals.

    While the tadpole model is far from an ideal surrogate for human disease, Levin points out that this research supports what other scientists have claimed: that cancer is not always a result of specific DNA damage. Rather, it may also be a systems disorder, in which exactly the right set of circumstances in the system generates the conditions for cancer to grow.

    The simulated experiments demonstrate how artificial intelligence can augment human abilities, both Lobo and Levin say. “I see it definitely not as a way to replace biologists,” Jaeger says with a laugh. He says it’s nearly impossible for humans to process all the relevant parameters to generate a model, but computer power can do what our brains simply can’t.

    The duo see many more such experiments in the future. At his Baltimore lab, Lobo is now focusing on bacteria, as modeling the ways in which the microbes create different compounds could be useful for the field of synthetic biology. He’s also trying to reverse-engineer cancer tumors to attempt to discover the best possible treatments to cause them to collapse. Levin sees applications in many fields: drug development, regenerative medicine, and understanding metabolism and disease. (Levin recently was awarded one of the first two Paul Allen Frontiers Group grants, a $30 million grant over eight years, to support risky, unconventional research; these computer-generated models are only a portion of his lab’s research.)

    In any case, employing computer algorithms to wrestle with yet unanswered questions in biology will, researchers say, only become more mainstream. Lipson says this approach is crucial: “We’re in a stage where biology is producing lots and lots of data,” he says. “But magically it’s not going to make sense out of nowhere. You need these types of systems to make sense of the data we have.” In other words, systems that mimic evolution—and might help us evolve solutions as well.

    Image credits: Alicia DeWitt, Alfred F. Huettner/Marine Biological Laboratory Archives (CC BY-NC-SA), Alejandro Sánchez Alvarado/Wikimedia Commons (CC BY-SA)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 1:04 pm on December 30, 2016 Permalink | Reply
    Tags: apolipoprotein E (ApoE) gene, , , Biology,   

    From ASU via phys.org: “New study shows cognitive decline may be influenced by interaction of genetics and… worms” 

    ASU Bloc




    A depiction of the double helical structure of DNA. Its four coding units (A, T, C, G) are color-coded in pink, orange, purple and yellow. Credit: NHGRI

    You’ve likely heard about being in the right place at the wrong time, but what about having the right genes in the wrong environment? In other words, could a genetic mutation (or allele) that puts populations at risk for illnesses in one environmental setting manifest itself in positive ways in a different setting?

    That’s the question behind a recent paper published in The FASEB Journal by several researchers including lead author Ben Trumble, an assistant professor at Arizona State University’s School of Human Evolution and Social Change and ASU’s Center for Evolution and Medicine.

    These researchers examined how the apolipoprotein E (ApoE) gene might function differently in an infectious environment than in the urban industrialized settings where ApoE has mostly been examined. All ApoE proteins help mediate cholesterol metabolism, and assist in the crucial activity of transporting fatty acids to the brain. But in industrialized societies, ApoE4 variant carriers also face up to a four-fold higher risk for Alzheimer’s disease and other age-related cognitive declines, as well as a higher risk for cardiovascular disease.

    The goal of this study, Trumble explains, was to reexamine the potentially detrimental effects of the globally-present ApoE4 allele in environmental conditions more typical of those experienced throughout our species’ existence—in this case, a community of Amazonian forager-horticulturalists called the Tsimane.

    “For 99% of human evolution, we lived as hunter gatherers in small bands and the last 5,000-10,000 years—with plant and animal domestication and sedentary urban industrial life—is completely novel,” Trumble says. “I can drive to a fast-food restaurant to ‘hunt and gather’ 20,000 calories in a few minutes or go to the hospital if I’m sick, but this was not the case throughout most of human evolution.”

    Due to the tropical environment and a lack of sanitation, running water, or electricity, remote populations like the Tsimane face high exposure to parasites and pathogens, which cause their own damage to cognitive abilities when untreated.

    As a result, one might expect Tsimane ApoE4 carriers who also have a high parasite burden to experience faster and more severe mental decline in the presence of both these genetic and environmental risk factors.

    But when the Tsimane Health and Life History Project tested these individuals using a seven-part cognitive assessment and a medical exam, they discovered the exact opposite.

    In fact, Tsimane who both carried ApoE4 and had a high parasitic burden displayed steadier or even improved cognitive function in the assessment versus non-carriers with a similar level of parasitic exposure. The researchers controlled for other potential confounders like age and schooling, but the effect still remained strong. This indicated that the allele potentially played a role in maintaining cognitive function even when exposed to environmental-based health threats.
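The statistical idea here, a gene-by-environment interaction estimated while controlling for covariates such as age, can be sketched with simulated data. Nothing below comes from the paper; the coefficients are invented purely to mimic the reported direction of effects:

```python
import numpy as np

# Illustrative gene-by-environment interaction model; all data are simulated.
rng = np.random.default_rng(0)
n = 500
apoe4 = rng.integers(0, 2, n)        # 1 = ApoE4 carrier
parasites = rng.integers(0, 2, n)    # 1 = high parasite burden
age = rng.uniform(20, 80, n)

# Simulated pattern: high parasite burden hurts cognition, but carrying
# ApoE4 offsets that harm (a positive interaction term).
score = (100 - 0.3 * age - 8 * parasites
         + 10 * apoe4 * parasites + rng.normal(0, 3, n))

# Ordinary least squares with an interaction column and an age covariate.
X = np.column_stack([np.ones(n), apoe4, parasites, apoe4 * parasites, age])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(dict(zip(["intercept", "apoe4", "parasites", "interaction", "age"],
               beta.round(2))))
```

The fitted interaction coefficient recovers the protective effect: parasites alone lower the score, but the ApoE4 × parasite term pushes it back up, which is the shape of the result the Tsimane data showed.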

    For Tsimane ApoE4 carriers without high parasite burdens, the rates of cognitive decline were more similar to those seen in industrialized societies, where ApoE4 reduces cognitive performance.

    “It seems that some of the very genetic mutations that help us succeed in more hazardous time periods and environments may actually become mismatched in our relatively safe and sterile post-industrial lifestyles,” Trumble explains.

    Still, the ApoE4 variant appears to be much more than an evolutionary leftover gone bad, he adds. For example, several studies have shown potential benefits of ApoE4 in early childhood development, and ApoE4 has also been shown to eliminate some infections like giardia and hepatitis.

    “Alleles with harmful effects may remain in a population if such harm occurs late in life, and more so if those same alleles have other positive effects,” adds co-author Michael Gurven, professor of anthropology at University of California, Santa Barbara. “Exploring the effects of genes associated with chronic disease, such as ApoE4, in a broader range of environments under more infectious conditions is likely to provide much-needed insight into why such ‘bad genes’ persist.”

    The abstract and full research paper “Apolipoprotein E4 is associated with improved cognitive function in Amazonian forager-horticulturalists with a high parasite burden” can be viewed here in The FASEB Journal.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ASU is the largest public university by enrollment in the United States.[11] Founded in 1885 as the Territorial Normal School at Tempe, the school underwent a series of changes in name and curriculum. In 1945 it was placed under control of the Arizona Board of Regents and was renamed Arizona State College.[12][13][14] A 1958 statewide ballot measure gave the university its present name.
    ASU is classified as a research university with very high research activity (RU/VH) by the Carnegie Classification of Institutions of Higher Education, one of 78 U.S. public universities with that designation. Since 2005 ASU has been ranked among the Top 50 research universities, public and private, in the U.S. based on research output, innovation, development, research expenditures, number of awarded patents and awarded research grant proposals. The Center for Measuring University Performance currently ranks ASU 31st among top U.S. public research universities.[15]

    ASU awards bachelor’s, master’s and doctoral degrees in 16 colleges and schools on five locations: the original Tempe campus, the West campus in northwest Phoenix, the Polytechnic campus in eastern Mesa, the Downtown Phoenix campus and the Colleges at Lake Havasu City. ASU’s “Online campus” offers 41 undergraduate degrees, 37 graduate degrees and 14 graduate or undergraduate certificates, earning ASU a Top 10 rating for Best Online Programs.[16] ASU also offers international academic program partnerships in Mexico, Europe and China. ASU is accredited as a single institution by The Higher Learning Commission.

    ASU Tempe Campus

  • richardmitnick 8:02 am on December 29, 2016 Permalink | Reply
    Tags: , Biology, , , Weizmann group   

    From COSMOS: “Circadian rhythms and the microbiome” 

    Cosmos Magazine bloc


    29 December 2016
    Richard G. “Bugs” Stevens
    Professor, School of Medicine
    University of Connecticut

    New research is beginning to show that the composition and activity of the microbiota exhibits a daily, or circadian, rhythmicity, just like we do.

    Examples of the microbes associated with healthy human beings. Jonathan Bailey, NHGRI, CC BY

    We’ve known that bacteria live in our intestines as far back as the 1680s, when Leeuwenhoek first looked through his microscope. Yogurt companies use that information in the sales pitch for their product, claiming it can help keep your gut bacteria happy. The bacteria growing on our skin have also been effectively exploited to sell the underarm deodorants without which we can become, ahem, malodorous. Until fairly recently our various microbes were thought of as freeloaders without any meaningful benefit to our functioning as healthy human beings.

    However, that view has changed in a big way [Nature] over the last couple of decades.

    Interest in, and knowledge about, the microbiota has recently exploded. These highly diverse communities of microbes live in and on us in staggering numbers; researchers now estimate that a typical human body is made up of about 30 trillion human cells and 39 trillion bacteria.

    We now recognize they’re essential to our health, participating in many important physiological functions such as digestion and metabolism of foods, and immune responses and inflammation; disruption of the gut microbiota might then contribute to a variety of conditions including childhood asthma, obesity, colitis and colon cancer.

    New research is beginning to show that the composition and activity of the microbiota exhibits a daily, or circadian, rhythmicity [Cell], just like we do. This offers one pathway to explain a Pandora’s box of possible adverse health effects from aspects of modern life, such as eating late at night or too much electric light after sunset.

    The microbial daily routine

    The microbiota is primarily bacterial but also includes viruses and eukaryotes like yeast; the latter are much bigger and more complicated than bacteria, and have a structure similar to our own cells. The total DNA complement of the microbiota is termed the microbiome, and it’s what we study to learn about the inner workings of the microbiota.

    In this field’s early days, researchers took fecal samples from people to investigate the composition of the gut microbiome. Later they noticed that the microbiome defined from a sample taken in the morning was quite different from one defined from a sample taken in the evening: the gut microbiota was not static over the span of the day.

    Perhaps this was to be expected. Almost all life on Earth has an endogenous circadian rhythmicity that is genetically determined, but that also responds to changes in light and dark. For human beings, reliable changes occur between day and night in hunger, body temperature, sleep propensity, hormone production, activity level, metabolic rate and more.

    These findings on daily rhythmicity in microbiota have really piqued my interest because disruption of our circadian rhythmicity by electric light at night has been my research passion for several decades. As scientists investigate the links between our internal daily patterns, electric light and health, new information about the rhythmicity of our microbiome might hold clues about how this all works together.

    The crucial question is whether the microbes simply respond to their host human’s circadian rhythm or whether they can actually alter our rhythm somehow. And does this really matter anyway?

    Microbiota calling the shots

    A group of researchers from the Weizmann Institute in Israel have now used an array of remarkable DNA technologies to show that the gut microbiota changes location within the gut, and changes its metabolic outputs over the span of the 24-hour day, at least in mice. Amino acids, lipids and vitamins that the microbes release circulate in the host mouse’s blood. As the levels of these molecules in the blood changed throughout the day, they altered the expression of genes in the mouse’s liver that code for many metabolic enzymes.

    This is the first clear demonstration of the gut microbiota changing the circadian activity of an essential organ – in this case, the liver, which is the engine of our physiology and crucial to our health.

    Changes in microbial movements and metabolite production over the course of the day influence host tissues. Thaiss et al/Cell 2016, CC BY

    The authors showed this link by administering an antibiotic to mice that kills much of the gut microbiota. Afterward they found significant changes in liver physiology. They could produce the same effect just by changing the feeding times of the mice; mice forced to eat only during the day showed different patterns of microbiota metabolites circulating in the blood than those allowed to eat at night, their natural active period.

    In addition, the authors showed that the liver changes how it responds to an overdose of acetaminophen over the span of the day in response to signals from the microbiota in the gut. They used acetaminophen as an example of a drug that can damage the liver depending on how it’s broken down. Interestingly, an overdose was least toxic at the beginning of the day, dawn, and most toxic at the end of the day, dusk.

    They concluded that the microbiota regulates how effectively the liver can detoxify over the course of the day. The authors argue that this finding can be extrapolated to apply to metabolism of drugs in general, including chemotherapeutic agents we use to treat disease. If so, then the time of day that a medication is administered could have a big impact on its effectiveness, and on the severity of its adverse side effects.

    This work has exciting implications. Understanding how time of day matters might allow for better treatment of disease, and for prevention of maladies like obesity, metabolic syndrome and perhaps other serious conditions.

    Technology drives the science

    The findings described by the Weizmann group were made possible by advances in the technology of DNA research. As so often happens, scientific insights follow on technological development.

    This is particularly true in the science of DNA. In order to count trillions of microbes as well as distinguish among hundreds of different species, there are four broad requirements: conceptual development, sequencing machines, analytic approaches and supercomputers to conduct the near hopelessly complex statistical analyses. Each of these has advanced to an extent that now studies like the one from the Weizmann Institute are achievable.

    The key conceptual breakthrough in analyzing the microbiome came with the recognition that the complex array of so many different organisms living together in a community may not be reducible. In other words, it doesn’t appear possible to separate out only one bacterial species from the group, and understand how it functions in isolation. The community works as a whole. For example, some of its members are bacteria that cannot absorb iron, which is necessary for growth. They require iron-binding molecules made by other members of the community to survive. So you can’t grow this guy in a Petri dish by itself.

    Shift work might have effects on you and your microbiota. Woman image via Shutterstock.

    Gut and rhythm

    The findings of the new study from Israel, which extends previous exciting work in this area, are relevant to humans for many reasons. For example, people who must take antibiotics for extended periods, or shift workers who eat at the “wrong” time of day, may be at risk via these microbiome pathways. In both instances, there will be changes in their metabolism that could lead, perhaps, to a higher risk of obesity and metabolic syndrome, both of which have been shown to occur at elevated rates in night workers.

    A root cause of these human health issues we see on the macro scale may be our gut microbiota and whether or not it is happy.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 12:35 pm on December 22, 2016 Permalink | Reply
    Tags: , Biology, Different Diseases Have Distinct Chemical Signatures, , , You Are What You Exhale   

    From Technion: “You Are What You Exhale” 

    Technion bloc


    No writer credit

    Different Diseases Have Distinct Chemical Signatures.

    An international team of 56 researchers in five countries has confirmed a hypothesis first proposed by the ancient Greeks – that different diseases are characterized by different “chemical signatures” identifiable in breath samples. The findings by the team led by Professor Hossam Haick of the Technion-Israel Institute of Technology Department of Chemical Engineering and Russell Berrie Nanotechnology Institute at the Technion were published today in ACS Nano.

    Professor Hossam Haick of the Technion-Israel Institute of Technology Department of Chemical Engineering

    Diagnostic techniques based on breath samples have been demonstrated in the past, but until now, there has not been scientific proof of the hypothesis that different and unrelated diseases are characterized by distinct chemical breath signatures. And technologies developed to date for this type of diagnosis have been limited to detecting a small number of clinical disorders, without differentiation between unrelated diseases.

    The study of more than 1,400 patients included 17 different and unrelated diseases: lung cancer, colorectal cancer, head and neck cancer, ovarian cancer, bladder cancer, prostate cancer, kidney cancer, stomach cancer, Crohn’s disease, ulcerative colitis, irritable bowel syndrome, Parkinson’s disease (two types), multiple sclerosis, pulmonary hypertension, preeclampsia and chronic kidney disease. Samples were collected between January 2011 and June 2014 in 14 departments at 9 medical centers in 5 countries: Israel, France, the USA, Latvia and China.

    The researchers tested the chemical composition of the breath samples using an accepted analytical method (mass spectrometry), which enabled accurate quantitative detection of the chemical compounds they contained. Thirteen chemical components were identified, in different compositions, in all 17 of the diseases.

    Diagram: A schematic view of the study. Two breath samples were taken from each subject, one was sent for chemical mapping using mass spectrometry, and the other was analyzed in the new system, which produced a clinical diagnosis based on the chemical fingerprint of the breath sample.

    According to Prof. Haick, “each of these diseases is characterized by a unique fingerprint, meaning a different composition of these 13 chemical components. Just as each of us has a unique fingerprint that distinguishes us from others, each disease has a chemical signature that distinguishes it from other diseases and from a normal state of health. These odor signatures are what enables us to identify the diseases using the technology that we developed.”

    With a new technology called “artificially intelligent nanoarray,” developed by Prof. Haick, the researchers were able to corroborate the clinical efficacy of the diagnostic technology. The array enables fast and inexpensive diagnosis and classification of diseases, based on “smelling” the patient’s breath, and using artificial intelligence to analyze the data obtained from the sensors. Some of the sensors are based on layers of gold nanoscale particles and others contain a random network of carbon nanotubes coated with an organic layer for sensing and identification purposes.

    The study also assessed the efficiency of the artificially intelligent nanoarray in detecting and classifying various diseases using breath signatures. To verify the reliability of the system, the team also examined the effect of various factors (such as gender, age, smoking habits and geographic location) on the sample composition, and found their effect to be negligible, with no impairment of the array’s sensitivity.

    “Each of the sensors responds to a wide range of exhalation components,” explain Prof. Haick and his former Ph.D. student, Dr. Morad Nakhleh, “and integration of the information provides detailed data about the unique breath signatures characteristic of the various diseases. Our system has detected and classified various diseases with an average accuracy of 86%.

    “This is a new and promising direction for diagnosis and classification of diseases, which is characterized not only by considerable accuracy but also by low cost, low electricity consumption, miniaturization, comfort and the possibility of repeating the test easily.”
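    As a rough illustration of the fingerprint idea, the sketch below assigns a breath sample to the nearest disease “centroid” in component space. Everything here is invented for illustration: the condition names, the centroid values, and the reduction to three components instead of the reported 13. The real system applies artificial-intelligence analysis to raw sensor signals, not this toy rule.

```python
import math

# Hypothetical "breath fingerprints": mean component levels for two
# made-up conditions, reduced to 3 of the 13 reported components.
CENTROIDS = {
    "condition_A": [0.8, 0.1, 0.3],
    "condition_B": [0.2, 0.7, 0.5],
}

def classify(sample):
    """Assign a sample to the condition whose centroid is nearest
    (Euclidean distance in component space)."""
    return min(CENTROIDS, key=lambda c: math.dist(sample, CENTROIDS[c]))

print(classify([0.75, 0.15, 0.35]))  # nearest to condition_A
```

    A nearest-centroid rule is about the simplest classifier that captures the “each disease has a distinct composition” intuition; the published work reports an average accuracy of 86% using a far richer model over the raw sensor responses.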

    “Breath is an excellent raw material for diagnosis,” said Prof. Haick. “It is available without the need for invasive and unpleasant procedures, it’s not dangerous, and you can sample it again and again if necessary.”

    Prof. Haick, full professor at Technion and head of three major European consortia, has received numerous prestigious awards and grants, including the Marie Curie Excellence Award, the European Research Council grant, grants from the Bill & Melinda Gates Foundation, the Hershel Rich Technion Innovation Award and the Humboldt Senior Research Award (Germany). He has been included in several important lists, including the list of the world’s 35 leading young scientists published by MIT’s Technology Review, the Nominet Trust 100 list (London), which includes the world’s 100 most influential inventors and digital developments, and the Los Angeles-based GOOD Magazine’s list of the 100 most influential people in the world. Prof. Haick also received the highest teaching award granted by the Technion – the Yanai Prize for Academic Excellence.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Technion Campus

    A science and technology research university, among the world’s top ten,
    dedicated to the creation of knowledge and the development of human capital and leadership,
    for the advancement of the State of Israel and all humanity.

  • richardmitnick 9:23 am on December 15, 2016 Permalink | Reply
    Tags: , , Biology, , , , X-ray crystallography,   

    From Stanford: “Masters of Crystallization” 

    Stanford University Name
    Stanford University

    March 24, 2016 [Stanford just put this in social media 12.14.16.]
    Glennda Chui

    When molecules won’t crystallize and technology confounds, who you gonna call?

    Macromolecular Structure Knowledge Center at Stanford’s Shriram Center. From left: Ted Li, T.J. Lane, MSKC Director Marc C. Deller, Nick Cox, Timothy Rhorer, Zachary Rosenthal.

    Researcher Ted Li examines a sample tray full of protein crystals under a microscope. Photo: SLAC National Accelerator Laboratory.

    Biology isn’t just for biologists anymore. That’s nowhere more apparent than in the newly furnished lab in room 097 of the Shriram Center basement, where flasks of bacterial and animal cells, snug in their incubators, are churning out proteins destined for jobs they may not have done in nature.

    Researchers who use this lab span a broad range of backgrounds and interests: Chemists searching for novel antibiotics. Chemical engineers developing biofuels. Doctors seeking new treatments for diabetes.

    Most of these highly skilled researchers have one thing in common: They have no idea how to grow the proteins and other large biomolecules that are essential to their research or how to prepare those proteins for X-ray studies that will reveal their structure and function.

    That’s where Marc Deller comes in.

    “I’m the lab manager, scientist, lab cleaner — I do everything, and I help people who don’t know how to use the equipment,” says Deller, who arrived in August to establish and direct the Macromolecular Structure Knowledge Center (MSKC). “I’m pretty much unboxing things every day and trying to get things plugged in.”

    With a doctorate from Oxford and years of protein-wrangling experience, he’s here to help Stanford faculty and students grow, purify and crystallize proteins and other big biomolecules so they can be probed with the SSRL synchrotron or the LCLS X-ray laser at SLAC National Accelerator Laboratory, just up the hill.

    SLAC SSRL Tunnel


    SLAC jointly funds the center with Stanford ChEM-H, an interdisciplinary institute aimed at understanding human biology at a chemical level, and the services offered at MSKC augment help available from the expert staff at the SLAC X-ray facilities.

    X-ray crystallography has been a revolutionary tool for understanding how living things work, revealing the structures of more than 100,000 proteins, nucleic acids and their complexes over the past few decades and fueling the development of numerous life-saving medications.

    But it’s not always easy, as chemistry graduate student Ted Li can attest. The protein he’s studying — a natural catalyst found in soil bacteria that scientists hope to turn into an antibiotic factory — “is very resistant to crystallization. It’s very floppy and doesn’t want to pack,” says Li, who works in the lab of Chaitan Khosla, professor of chemistry and of chemical engineering. “So I need to find a way to force them to do that. Most of the things I’m doing these days are completely new to me, and Marc is my main mentor. He’ll actually go with me to SLAC and guide me in how to collect my data.”

    In its first six months, MSKC has already helped scientists with two dozen research projects, and Deller is eager to round up more. “From my experience of doing this for 20 years,” he says, “making the protein is definitely a bottleneck.”

    See the full article here.


    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

  • richardmitnick 2:04 pm on December 13, 2016 Permalink | Reply
    Tags: , Biology, , , Ubiquitination   

    From Cornell: “Human Diseases and Ubiquitination” 

    Cornell Bloc

    Cornell University

    Caitlin Hayes


    Cornell researcher
    Yuxin Mao
    Molecular Biology and Genetics, College of Agriculture and Life Sciences/College of Arts and Sciences

    Ubiquitin is a small protein, 76 amino acids long, that modifies other proteins in eukaryotes. These modifications, collectively known as ubiquitination, play an essential role in a broad range of cellular processes, including transcription, DNA repair, signal transduction, autophagy, the cell cycle, immune response, and membrane trafficking. It follows that aberrations in the mechanisms of ubiquitination can lead to a number of human diseases, notably neurodegenerative diseases and cancers.

    Yuxin Mao, Molecular Biology and Genetics, has discovered one way that bacteria target and manipulate these essential processes and is working to uncover the precise molecular mechanisms.

    Remarkably, although ubiquitin is absent in prokaryotes, bacteria can deliver certain ligases—bacterial pathogen-encoded E3 ubiquitin ligases (BELs)—into eukaryotic host cells to manipulate the host ubiquitin system for successful infection. Mao’s lab recently discovered a novel family of BELs, named SidC, from the intracellular bacterial pathogen Legionella pneumophila. Ligases in the SidC family have a unique sequence and structure, which raises intriguing questions: Given this structure, what is the molecular mechanism of this family of ligases? What are the specific substrates of SidC? And how does the ubiquitination of these potential host factors play a role in membrane trafficking regulation?

    Mao’s lab is working to answer these questions. The results will make significant contributions to the understanding of both the molecular mechanisms of the enzymatic cascade of ubiquitination and the role of the host ubiquitin pathway in bacterial pathogenesis. These studies will therefore forge new trails in understanding human pathogens and will help combat bacterial infectious diseases. NIH Award Number: 1R01GM116964-01A1

    See the full article here.

    Once called “the first American university” by educational historian Frederick Rudolph, Cornell University represents a distinctive mix of eminent scholarship and democratic ideals. Adding practical subjects to the classics and admitting qualified students regardless of nationality, race, social circumstance, gender, or religion was quite a departure when Cornell was founded in 1865.

    Today’s Cornell reflects this heritage of egalitarian excellence. It is home to the nation’s first colleges devoted to hotel administration, industrial and labor relations, and veterinary medicine. Both a private university and the land-grant institution of New York State, Cornell University is the most educationally diverse member of the Ivy League.

    On the Ithaca campus alone nearly 20,000 students representing every state and 120 countries choose from among 4,000 courses in 11 undergraduate, graduate, and professional schools. Many undergraduates participate in a wide range of interdisciplinary programs, play meaningful roles in original research, and study in Cornell programs in Washington, New York City, and the world over.

  • richardmitnick 3:30 pm on December 9, 2016 Permalink | Reply
    Tags: Biology, , , New weapon against Diabetes   

    From ETH Zürich: “New weapon against Diabetes” 

    ETH Zurich bloc

    ETH Zürich

    Peter Rüegg

    Researchers have used the simplest approach yet to produce artificial beta cells from human kidney cells. Like their natural model, the artificial cells act as both sugar sensors and insulin producers.

    Repeated measurement of the blood glucose level and injection of insulin make the everyday life of diabetics complicated. The newly created beta cells of the ETH researchers could make life easier again. (Picture: Dolgachov / iStock)

    Researchers led by ETH Professor Martin Fussenegger at the Department of Biosystems Science and Engineering (D-BSSE) in Basel have produced artificial beta cells using a straightforward engineering approach. These cells can do everything that natural pancreatic beta cells do: they measure the glucose concentration in the blood and produce enough insulin to effectively lower the blood sugar level. The ETH researchers presented their development in the latest edition of the journal Science.

    Previous approaches were based on stem cells, which the scientists allowed to mature into beta cells either by adding growth factors or by incorporating complex genetic networks.

    For their new approach, the ETH researchers used a cell line based on human kidney cells, HEK cells. The researchers used the natural glucose transport proteins and potassium channels in the membrane of the HEK cells. They enhanced these with a voltage-dependent calcium channel and a gene for the production of insulin and GLP-1, a hormone involved in the regulation of the blood sugar level.

    Voltage switch causes insulin production

    In the artificial beta cells, the HEK cells’ natural glucose transport protein carries glucose from the bloodstream into the cell’s interior. When the blood sugar level exceeds a certain threshold, the potassium channels close. This flips the voltage distribution at the membrane, causing the calcium channels to open. As calcium flows in, it triggers the HEK cells’ built-in signalling cascade, leading to the production and secretion of insulin or GLP-1.
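    The cascade described above behaves like a threshold circuit, which can be sketched in a few lines. This is a deliberately crude Boolean model: the 7 mmol/L set point and the on/off treatment of the channels are illustrative assumptions, not values from the ETH paper.

```python
GLUCOSE_THRESHOLD = 7.0  # mmol/L; hypothetical set point for illustration

def hek_beta_cell_secretes(glucose):
    """Toy model: does the engineered cell secrete insulin/GLP-1?"""
    potassium_open = glucose <= GLUCOSE_THRESHOLD  # K+ channels close above the threshold
    depolarized = not potassium_open               # closed K+ channels flip the membrane voltage
    calcium_influx = depolarized                   # voltage-gated Ca2+ channels then open
    return calcium_influx                          # Ca2+ cascade drives secretion

print(hek_beta_cell_secretes(9.0))  # hyperglycemic: True
print(hek_beta_cell_secretes(5.0))  # normal: False
```

    In the real cells this chain is continuous and quantitative rather than Boolean, which is what makes the system a self-regulating glucose sensor instead of a simple on/off insulin pump.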

    Diagram of a HEK-beta cell: Extracellular glucose triggers glycolysis-dependent membrane depolarization, which activates the voltage-gated calcium channel, resulting in an influx of calcium ions, induction of the calmodulin-calcineurin signaling cascade, and P_NFAT-mediated induction of insulin secretion. (Graphics: ETH Zürich)

    The initial tests of the artificial beta cells in diabetic mice revealed the cells to be extremely effective: “They worked better and for longer than any solution achieved anywhere in the world so far,” says Fussenegger. When implanted into diabetic mice, the modified HEK cells worked reliably for three weeks, producing sufficient quantities of the messengers that regulate blood sugar level.

    Helpful modelling

    In developing the artificial cells, the researchers had the help of a computer model created by researchers working under Jörg Stelling, another professor in ETH Zürich’s Department of Biosystems Science and Engineering (D-BSSE). The model allows predictions to be made of cell behaviour, which can be verified experimentally. “The data from the experiments and the values calculated using the models were almost identical,” says Fussenegger.

    He and his group have been working on biotechnology-based solutions for diabetes therapy for a long time. Several months ago, they unveiled beta cells that had been grown from stem cells from a person’s fatty tissue. This technique is expensive, however, since the beta cells have to be produced individually for each patient. The new solution would be cheaper, as the system is suitable for all diabetics.

    Market-readiness is a long way off

    It remains uncertain, though, when these artificial beta cells will reach the market. They first have to undergo various clinical trials before they can be used in humans. Trials of this kind are expensive and often last several years. “If our cells clear all the hurdles, they could reach the market in 10 years,” the ETH professor estimates.

    Diabetes is becoming the modern-day scourge of humanity. The International Diabetes Federation estimates that more than 640 million people worldwide will suffer from diabetes by 2040. Half a million people are affected in Switzerland today, with 40,000 of them suffering from type 1 diabetes, the form in which the body’s immune system completely destroys the insulin-producing beta cells.


    Xie M et al. Beta-cell-mimetic designer cells provide closed-loop glycemic control. Science, Advanced Online Publication, 8 November 2016, DOI: 10.1126/science.aaf4006

    See the full article here.


    ETH Zurich campus
    ETH Zurich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zurich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zurich, underlining the excellent reputation of the university.
