Tagged: Applied Research & Technology

  • richardmitnick 8:20 pm on June 25, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Radha Chhita, RIT,   

    From RIT: Women in STEM “Eight degrees of success for South African family” Radha Chhita 

    Rochester Institute of Technology

    March 28, 2017 [Well hidden, finally pops out into social media]
    Marcia Morphy
    mpmuns@rit.edu

    Radha Chhita will get her master’s degree in May, becoming the eighth member of her family to attend RIT.

    RIT has become an educational legacy for eight members of the Chhita family from Johannesburg, South Africa.

    Radha will graduate in May with a Master of Science degree in professional studies from the School of Individualized Study (SOIS)—like her sisters Asha ’07, and Tulsi ’16, before her. Another sibling, Kalpana, earned an MS in print media in 2004, and their father, Kishor, received an AAS degree in printing from RIT in 1974. Three cousins also attended RIT: Yogesh Chhita ’06 (graphic media); Janak Chhita ’05, ’06 (graphic media, MBA); and Bhadresh Rama, who studied graphic communications from 1993-1994.

    And all work for Golden Era Group, the Chhita-family-owned business and second largest independent printing and packaging company in South Africa.

    “The business was started by my grandfather, Bhoola, in 1942, and we continue to carry on our grandfather’s dream to build a better future for his family,” said Radha. “We all have a deep family commitment and strong work ethic to become part of the company’s executive team.

    “My father, who is co-CEO with my sister Asha, believes everyone in the family should start working from the ground up. That’s the family rule: Whatever is needed is the role in the company we fill.”

    Technology is the backbone of Golden Era’s product line, which includes folding cartons, self-adhesive and IML labels, shrink sleeves, boutique bags, thermoformed plastic containers, three-piece metal cans, corrugated box manufacturing and printing, and papermaking.

    “I chose RIT over other printing and packaging offerings in the world because of its world-class program,” said Tulsi, who earned a Bachelor of Commerce in Accountancy and postgraduate Financial Management Honors in her homeland. “My father taught us all the equation in life: Opportunity + Instinct = Profit.”

    Similarly, Radha received an accounting sciences degree and a postgraduate degree in accounting, and is finalizing her Chartered Accountancy qualification while completing her SOIS print, packaging and business concentration at RIT. She says the sisters don’t mind studying or working hard—and all get along really well.

    “My observation of both Tulsi and Radha, and this can apply to the whole family, is that they had a mission when they came to RIT,” said Peter Boyd, SOIS lecturer and graduate program coordinator. “They showed up with a vision, were very clear about what they wanted to do, and were able to articulate how a master’s in professional studies would help them advance and impact their family business.”

    Radha left Rochester in December and is completing her capstone project from South Africa. “I’m the eighth in the family line coming to RIT,” she said. “We always joke that there should be a Chhita building named after us on campus as there will be many generations to come.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Rochester Institute of Technology (RIT) is a private doctoral university within the town of Henrietta in the Rochester, New York metropolitan area.

    RIT is composed of nine academic colleges, including the National Technical Institute for the Deaf. The Institute is one of only a small number of engineering institutes in the State of New York, alongside New York Institute of Technology, SUNY Polytechnic Institute, and Rensselaer Polytechnic Institute. It is most widely known for its fine arts, computing, engineering, and imaging science programs; several fine arts programs routinely rank in the national “Top 10” according to US News & World Report.

    The Institute as it is known today began as a result of an 1891 merger between Rochester Athenæum, a literary society founded in 1829 by Colonel Nathaniel Rochester and associates, and Mechanics Institute, a Rochester institute of practical technical training for local residents founded in 1885 by a consortium of local businessmen including Captain Henry Lomb, co-founder of Bausch & Lomb. The merged institution was named Rochester Athenæum and Mechanics Institute (RAMI). In 1944, the school changed its name to Rochester Institute of Technology and became a full-fledged research university.

     
  • richardmitnick 7:53 pm on June 25, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Burmese pythons (as well as other snakes) massively downregulate their metabolic and physiological functions during extended periods of fasting During this time their organs atrophy saving energy Howe, Evolution takes eons but it leaves marks on the genomes of organisms that can be detected with DNA sequencing and analysis, Researchers use Supercomputer to Uncover how Pythons Regenerate Their Organs, The Role of Supercomputing in Genomics Research, understanding the mechanisms by which Burmese pythons regenerate their organs including their heart liver kidney and small intestines after feeding, Within 48 hours of feeding Burmese pythons can undergo up to a 44-fold increase in metabolic rate and the mass of their major organs can increase by 40 to 100 percent

    From UT Austin: “Researchers use Supercomputer to Uncover how Pythons Regenerate Their Organs” 

    U Texas Austin bloc

    University of Texas at Austin

    06/22/2017
    No writer credit found

    A Burmese python superimposed on an analysis of gene expression that uncovers how the species changes in its organs upon feeding. Credit: Todd Castoe.

    Evolution takes eons, but it leaves marks on the genomes of organisms that can be detected with DNA sequencing and analysis.

    As methods for studying and comparing genetic data improve, scientists are beginning to decode these marks to reconstruct the evolutionary history of species, as well as how variants of genes give rise to unique traits.

    A research team at the University of Texas at Arlington led by assistant professor of biology Todd Castoe has been exploring the genomes of snakes and lizards to answer critical questions about these creatures’ evolutionary history. For instance, how did they develop venom? How do they regenerate their organs? And how do evolutionarily-derived variations in genes lead to variations in how organisms look and function?

    “Some of the most basic questions drive our research. Yet trying to understand the genetic explanations of such questions is surprisingly difficult considering most vertebrate genomes, including our own, are made up of literally billions of DNA bases that can determine how an organism looks and functions,” says Castoe. “Understanding these links between differences in DNA and differences in form and function is central to understanding biology and disease, and investigating these critical links requires massive computing power.”

    To uncover new insights that link variation in DNA with variation in vertebrate form and function, Castoe’s group uses supercomputing and data analysis resources at the Texas Advanced Computing Center or TACC, one of the world’s leading centers for computational discovery.

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    TACC HPE Apollo 8000 Hikari supercomputer

    Recently, they used TACC’s supercomputers to understand the mechanisms by which Burmese pythons regenerate their organs — including their heart, liver, kidney, and small intestines — after feeding.

    Burmese pythons (as well as other snakes) massively downregulate their metabolic and physiological functions during extended periods of fasting. During this time their organs atrophy, saving energy. However, upon feeding, the size and function of these organs, along with their ability to generate energy, dramatically increase to accommodate digestion.

    Within 48 hours of feeding, Burmese pythons can undergo up to a 44-fold increase in metabolic rate and the mass of their major organs can increase by 40 to 100 percent.

    Writing in BMC Genomics in May 2017, the researchers described their efforts to compare gene expression in pythons that were fasting, one day post-feeding and four days post-feeding. They sequenced pythons in these three states and identified 1,700 genes that were significantly different pre- and post-feeding. They then performed statistical analyses to identify the key drivers of organ regeneration across different types of tissues.
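
    The core of that comparison is a per-gene test for expression differences between feeding states. The sketch below is not the authors’ pipeline (the BMC Genomics paper describes the real methods); it is a minimal, hypothetical Python illustration of flagging genes whose expression changes significantly between fasting and post-feeding samples, using invented count data.

    ```python
    # Minimal, hypothetical sketch of a differential-expression screen.
    # Not the study's pipeline; real analyses use dedicated tools with proper
    # normalization and multiple-testing correction.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Fake expression matrices: rows = genes, columns = replicate animals.
    n_genes = 5000
    fasting = rng.negative_binomial(20, 0.5, size=(n_genes, 4)).astype(float)
    one_dpf = fasting * rng.normal(1.0, 0.2, size=(n_genes, 4))  # 1 day post-feeding
    one_dpf[:200] *= 4.0  # pretend 200 genes are strongly upregulated after feeding

    def flag_differential(a, b, min_log2fc=1.0, alpha=0.05):
        """Indices of genes with |log2 fold change| >= min_log2fc and a Welch
        t-test p-value below a crude Bonferroni-corrected threshold."""
        log2fc = np.log2((b.mean(axis=1) + 1.0) / (a.mean(axis=1) + 1.0))
        pvals = stats.ttest_ind(a, b, axis=1, equal_var=False).pvalue
        return np.where((np.abs(log2fc) >= min_log2fc) & (pvals < alpha / len(pvals)))[0]

    hits = flag_differential(fasting, one_dpf)
    print(f"{len(hits)} genes flagged as differentially expressed")
    ```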

    What they found was that a few sets of genes were influencing the wholesale change of pythons’ internal organ structure. Key proteins, produced and regulated by these important genes, activated a cascade of diverse, tissue-specific signals that led to regenerative organ growth.

    Intriguingly, even mammalian cells have been shown to respond to serum produced by post-feeding pythons, suggesting that the signaling function is conserved across species and could one day be used to improve human health.

    “We’re interested in understanding the molecular basis of this phenomenon to see what genes are regulated related to the feeding response,” says Daren Card, a doctoral student in Castoe’s lab and one of the authors of the study. “Our hope is that we can leverage our understanding of how snakes accomplish organ regeneration to one day help treat human diseases.”

    Making Evolutionary Sense of Secondary Contact

    Castoe and his team used a similar genomic approach to understand gene flow in two closely related species of western rattlesnakes with an intertwined genetic history.

    The two species live on opposite sides of the Continental Divide in Mexico and the U.S. They were separated for thousands of years and evolved in response to different climates and habitats. However, over time their geographic ranges came back together to the point that the rattlesnakes began to crossbreed, leading to hybrids, some of which live in a region between the two distinct climates.

    The work was motivated by a desire to understand what forces generate and maintain distinct species, and how shifts in the ranges of species (for example, due to global change) may impact species and speciation.

    The researchers compared thousands of genes in the rattlesnakes’ nuclear DNA to study genomic differentiation between the two lineages. Their comparisons revealed a relationship between genetic traits that are most important in evolution during isolation and those that are most important during secondary contact, with greater-than-expected overlap between genes in these two scenarios.

    However, they also found regions of the rattlesnake genome that are important in only one of these two scenarios. For example, genes functioning in venom composition and in reproductive differences — distinct traits that are important for adaptation to the local habitat — likely diverged under selection when these species were isolated. They also found other sets of genes that were not originally important for diversification of form and function but that later became important in reducing the viability of hybrids. Overall, their results provide a genome-scale perspective on how speciation might work that can be tested and refined in studies of other species.

    The team published their results in the April 2017 issue of Ecology and Evolution.

    The Role of Supercomputing in Genomics Research

    The studies performed by members of the Castoe lab rely on advanced computing for several aspects of the research. First, they use advanced computing to create genome assemblies — putting millions of small chunks of DNA in the correct order.

    “Vertebrate genomes are typically on the larger side, so it takes a lot of computational power to assemble them,” says Card. “We use TACC a lot for that.”
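
    “Putting millions of small chunks of DNA in the correct order” is the assembly problem, and the reason it is so compute-hungry: every read must be compared against many others to find overlaps. The toy Python sketch below (for illustration only, not the lab’s assembler) shows the greedy overlap-and-merge idea on a handful of short reads; real assemblers use far more sophisticated graph methods on billions of bases.

    ```python
    # Toy greedy overlap assembler, for illustration only.
    # Real genome assembly uses de Bruijn or overlap graphs at vastly larger scale.
    from itertools import permutations

    def overlap(a, b, min_len=3):
        """Length of the longest suffix of `a` that matches a prefix of `b`."""
        start = 0
        while True:
            start = a.find(b[:min_len], start)
            if start == -1:
                return 0
            if b.startswith(a[start:]):
                return len(a) - start
            start += 1

    def greedy_assemble(reads):
        """Repeatedly merge the pair of reads with the largest overlap."""
        reads = list(reads)
        while len(reads) > 1:
            best = max(permutations(reads, 2), key=lambda pair: overlap(*pair))
            olen = overlap(*best)
            if olen == 0:
                break  # no overlaps left; stop with multiple contigs
            a, b = best
            reads.remove(a)
            reads.remove(b)
            reads.append(a + b[olen:])
        return reads

    print(greedy_assemble(["ATTAGACCTG", "CCTGCCGGAA", "AGACCTGCCG", "GCCGGAATAC"]))
    # -> ['ATTAGACCTGCCGGAATAC']
    ```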

    Next, the researchers use advanced computing to compare the results among many different samples, from multiple lineages, to identify subtle differences and patterns that would not be distinguishable otherwise.

    Castoe’s lab has its own in-house computers, but they fall short of what is needed to perform all of the studies the group is interested in working on.

    “In terms of genome assemblies and the very intensive analyses we do, accessing larger resources from TACC is advantageous,” Card says. “Certain things benefit substantially from the general output from TACC machines, but they also allow us to run 500 jobs at the same time, which speeds up the research process considerably.”

    A third computer-driven approach lets the team simulate the process of genetic evolution over millions of generations using synthetic biological data to deduce the rules of evolution, and to identify genes that may be important for adaptation.

    For one such project, the team developed a new software tool called GppFst that allows researchers to differentiate genetic drift – a neutral process whereby genes and gene sequences naturally change due to random mating within a population – from genetic variations that are indicative of evolutionary changes caused by natural selection.

    The tool uses simulations to statistically determine which changes are meaningful and can help biologists better understand the processes that underlie genetic variation. They described the tool in the May 2017 issue of Bioinformatics.
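
    GppFst itself is described in the Bioinformatics paper; the Python sketch below is only a schematic stand-in (not that tool) for the underlying idea: simulate the distribution of a differentiation statistic expected under neutral drift alone, then ask which observed loci fall far enough outside that null distribution to be candidates for selection. All numbers here are invented.

    ```python
    # Schematic sketch (not GppFst): compare observed per-locus Fst values against
    # a null distribution generated by simulating genetic drift alone.
    import numpy as np

    rng = np.random.default_rng(1)

    def drift_allele_freq(p0, pop_size, generations):
        """Wright-Fisher drift: binomial resampling of allele counts each generation."""
        p = p0
        for _ in range(generations):
            p = rng.binomial(2 * pop_size, p) / (2 * pop_size)
        return p

    def fst(p1, p2):
        """Simple two-population Fst estimate for one biallelic locus."""
        p_bar = (p1 + p2) / 2
        ht = 2 * p_bar * (1 - p_bar)                # expected heterozygosity, pooled
        hs = p1 * (1 - p1) + p2 * (1 - p2)          # mean within-population heterozygosity
        return 0.0 if ht == 0 else (ht - hs) / ht

    # Null distribution: many loci drifting independently in two isolated populations.
    null_fst = np.array([
        fst(drift_allele_freq(0.5, 500, 200), drift_allele_freq(0.5, 500, 200))
        for _ in range(2000)
    ])

    # Hypothetical observed values; loci beyond the 99th percentile of the null
    # distribution are candidates for divergent selection rather than drift alone.
    observed = np.array([0.02, 0.10, 0.45, 0.88, 0.15])
    cutoff = np.quantile(null_fst, 0.99)
    print(f"null 99th percentile Fst = {cutoff:.2f}; outlier loci: {np.where(observed > cutoff)[0]}")
    ```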

    Lab members are able to access TACC resources through a unique initiative, called the University of Texas Research Cyberinfrastructure, which gives researchers from the state’s 14 public universities and health centers access to TACC’s systems and staff expertise.

    “It’s been integral to our research,” said Richard Adams, another doctoral student in Castoe’s group and the developer of GppFst. “We simulate large numbers of different evolutionary scenarios. For each, we want to have hundreds of replicates, which are required to fully vet our conclusions. There’s no way to do that on our in-house systems. It would take 10 to 15 years to finish what we would need to do with our own machines — frankly, it would be impossible without the use of TACC systems.”

    Though the roots of evolutionary biology can be found in field work and close observation, today, the field is deeply tied to computing, since the scale of genetic material — tiny but voluminous — cannot be viewed with the naked eye or put in order by an individual.

    “The massive scale of genomes, together with rapid advances in gathering genome sequence information, has shifted the paradigm for many aspects of life science research,” says Castoe.

    “The bottleneck for discovery is no longer the generation of data, but instead is the analysis of such massive datasets. Data that takes less than a few weeks to generate can easily take years to analyze, and flexible shared supercomputing resources like TACC have become more critical than ever for advancing discovery in our field, and broadly for the life sciences.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Texas Arlington Campus

    In 1839, the Congress of the Republic of Texas ordered that a site be set aside to meet the state’s higher education needs. After a series of delays over the next several decades, the state legislature reinvigorated the project in 1876, calling for the establishment of a “university of the first class.” Austin was selected as the site for the new university in 1881, and construction began on the original Main Building in November 1882. Less than one year later, on Sept. 15, 1883, The University of Texas at Austin opened with one building, eight professors, one proctor, and 221 students — and a mission to change the world. Today, UT Austin is a world-renowned higher education, research, and public service institution serving more than 51,000 students annually through 18 top-ranked colleges and schools.

     
  • richardmitnick 1:15 pm on June 25, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Bacteriophages, Genetically modified viruses

    From Nature: “Modified viruses deliver death to antibiotic-resistant bacteria” 

    Nature Mag
    Nature

    21 June 2017
    Sara Reardon

    Engineered microbes turn a bacterium’s immune response against itself using CRISPR.

    Phages (green) attack a bacterium (orange). Researchers are hoping to use engineered versions of these viruses to fight antibiotic resistance. AMI Images/SPL

    Genetically modified viruses that cause bacteria to kill themselves could be the next step in combating antibiotic-resistant infections [Nature].

    Several companies have engineered such viruses, called bacteriophages, to use the CRISPR gene-editing system to kill specific bacteria, according to a presentation at the CRISPR 2017 conference in Big Sky, Montana, last week. These companies could begin clinical trials of therapies as soon as next year.

    Initial tests have saved mice from antibiotic-resistant infections that would otherwise have killed them, said Rodolphe Barrangou, chief scientific officer of Locus Biosciences in Research Triangle Park, North Carolina, at the conference.

    Bacteriophages isolated and purified from the wild have long been used to treat infections in people, particularly in Eastern Europe. These viruses infect only specific species or strains of bacteria, so they have less of an impact on the human body’s natural microbial community, or microbiome, than antibiotics do. They are also generally thought to be very safe for use in people.

    But the development of phage therapy has been slow, in part because these viruses are naturally occurring and so cannot be patented. Bacteria can also quickly evolve resistance to natural phages, meaning researchers would have to constantly isolate new ones capable of defeating the same bacterial strain or species. And it would be difficult for regulatory agencies to continually approve each new treatment.

    CRISPR-fuelled death

    To avoid these issues, Locus and several other companies are developing phages that turn the bacterial immune system known as CRISPR against itself. In Locus’ phages, which target bacteria resistant to antibiotics, the CRISPR system includes DNA with instructions for modified guide RNAs that home in on part of an antibiotic-resistance gene. Once the phage infects a bacterium, the guide RNA latches on to the resistance gene. That prompts an enzyme called Cas3, which the bacterium normally produces to kill phages, to destroy that genetic sequence instead. Cas3 eventually destroys all the DNA, killing the bacterium. “I see some irony now in using phages to kill bacteria,” says Barrangou.

    Another company, Eligo Bioscience in Paris, uses a similar approach. It has removed all the genetic instructions that allow phages to replicate, and inserted DNA that encodes guide RNAs and the bacterial enzyme Cas9. Cas9 cuts the bacterium’s DNA at a designated spot, and the break triggers the bacterium to self-destruct. The system will target human gut pathogens, says Eligo chief executive Xavier Duportet, although he declined to specify which ones.
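
    Both designs hinge on choosing a guide RNA that steers the Cas enzyme to a sequence found only in the targeted bacterium, for example inside an antibiotic-resistance gene. As a rough illustration (not either company’s method), the Python sketch below scans a made-up gene fragment for 20-nucleotide protospacers immediately followed by the NGG PAM that Cas9 requires; each hit is a candidate guide sequence.

    ```python
    # Illustrative only: enumerate candidate Cas9 target sites (20 nt + NGG PAM)
    # on the forward strand of an invented resistance-gene fragment.
    # Real guide design also scans the reverse strand and screens for off-targets.
    import re

    def cas9_candidates(seq, guide_len=20):
        """Yield (position, protospacer, PAM) for every NGG PAM with room for a guide."""
        seq = seq.upper()
        for m in re.finditer(r"(?=([ACGT]GG))", seq):   # lookahead finds overlapping PAMs
            pam_start = m.start()
            if pam_start >= guide_len:
                yield pam_start - guide_len, seq[pam_start - guide_len:pam_start], m.group(1)

    # Hypothetical fragment (invented sequence), standing in for a resistance gene.
    fragment = "ATGCGTACCGTTGACCTAGGCATTCGGATCCGTAAGCTGGTACCGATCGGAAGCTT"
    for pos, spacer, pam in cas9_candidates(fragment):
        print(f"position {pos:2d}  guide {spacer}  PAM {pam}")
    ```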

    The two companies hope to start clinical trials in 18–24 months. Their first goal is to treat bacterial infections that cause severe disease. But eventually, they want to develop phages that let them precisely engineer the human microbiome by removing naturally occurring bacteria associated with conditions such as obesity, autism and some cancers.

    Both Barrangou and Duportet acknowledge that for now, causal links between the human microbiome and these conditions are tenuous at best. But they hope that by the time their therapies have been proved safe and effective in humans, the links will be better understood. Phages could also allow researchers to manipulate the microbiomes of experimental animals, which could help them to untangle how certain bacteria influence conditions such as autism, says Timothy Lu, a synthetic biologist at the Massachusetts Institute of Technology in Cambridge and a co-founder of Eligo.

    An engineered solution

    Other companies are working to get phages to perform different tasks. ‘Supercharged’ phages, created by a group at Synthetic Genomics in La Jolla, California, could contain dozens of special features, including enzymes that break down biofilms or proteins that help to hide the phages from the human immune system.

    But engineered phages still have to overcome some hurdles. Treating an infection might require a large volume of phages, says Elizabeth Kutter, a microbiologist at Evergreen State College in Olympia, Washington, and it’s unclear whether this would trigger immune reactions, some of which could interfere with the treatment. Phages could also potentially transfer antibiotic-resistance genes to non-resistant bacteria, she notes.

    Lu adds that bacteria may still develop resistance even to the engineered phages. So researchers might have to frequently modify their phages to keep up with bacterial mutations.

    But as antibiotic resistance spreads, Kutter says, there will be plenty of space for both engineered phages and natural phage therapies, which are also growing in popularity. “I think they’ll complement the things that can be done by natural phages that have been engineered for hundreds of thousands of years,” she says.

    Related stories and links
    See the full article for further references with links

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 12:42 pm on June 25, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Efimov molecules, How the first complex molecules formed in the early universe and how complex materials came into being, Prof. Cheng Chin, Quantum objects,   

    From U Chicago: “UChicago physicists settle debate over how exotic quantum particles form” 

    U Chicago bloc

    University of Chicago

    June 22, 2017
    Carla Reiter

    New research by physicists at the University of Chicago settles a longstanding disagreement over the formation of exotic quantum particles known as Efimov molecules.

    The findings, published last month in Nature Physics, address differences between how theorists say Efimov molecules should form and the way researchers say they did form in experiments. The study found that the simple picture scientists formulated based on almost 10 years of experimentation had it wrong—a result that has implications for understanding how the first complex molecules formed in the early universe and how complex materials came into being.

    Prof. Cheng Chin. No image credit.

    Efimov molecules are quantum objects formed by three particles that bind together when two particles are unable to do so. The same three particles can make molecules in an infinite range of sizes, depending on the strength of the interactions between them.

    Experiments had shown the size of an Efimov molecule was roughly proportional to the size of the atoms that comprise it—a property physicists call universality.
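
    For reference, the textbook statement of universality for identical bosons (a general gloss, not a formula from the UChicago paper; the heteronuclear caesium-lithium system studied here has different numerical factors) is that successive Efimov trimers form a geometric series and that the scattering length at which the first trimer appears is pinned to the atoms’ van der Waals length:

    $$E_T^{(n+1)} \approx \frac{E_T^{(n)}}{22.7^{2}}, \qquad a_-^{(0)} \approx -9.7\, r_{\mathrm{vdW}},$$

    where the E_T are trimer binding energies, a_- is the scattering length at which the lowest Efimov state meets the three-atom threshold, and r_vdW characterizes the size of the atoms. The “deviation from universality” reported here is a measured departure from relations of this form.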

    “This hypothesis has been checked and rechecked multiple times in the past 10 years, and almost all the experiments suggested that this is indeed the case,” said Cheng Chin, a professor of physics at UChicago, who leads the lab where the new findings were made. “But some theorists say the real world is more complicated than this simple formula. There should be some other factors that will break this universality.”

    The new findings come down somewhere between the previous experimental findings and predictions of theorists. They contradict both and do away with the idea of universality.

    “I have to say that I am surprised,” Chin said. “This was an experiment where I did not anticipate the result before we got the data.”

    The data came from extremely sensitive experiments done with cesium and lithium atoms using techniques devised by Jacob Johansen, previously a graduate student in Chin’s lab who is now a postdoctoral fellow at Northwestern University. Krutik Patel, a graduate student at UChicago, and Brian DeSalvo, a postdoctoral researcher at UChicago, also contributed to the work.

    “We wanted to be able to say once and for all that if we didn’t see any dependence on these other properties, then there’s really something seriously wrong with the theory,” Johansen said. “If we did see dependence, then we’re seeing the breakdown of this universality. It always feels good, as a scientist, to resolve these sorts of questions.”

    Developing new techniques

    Here “3” symbolizes an Efimov molecule comprised of three atoms. While all “3”s look about the same, research from the Chin group observed a tiny “3” that is clearly different. Courtesy of Cheng Chin.

    Efimov molecules are held together by quantum forces rather than by the chemical bonds that bind together familiar molecules such as H2O. The atoms are so weakly connected that the molecules can’t exist under normal conditions; the heat in an ordinary room provides enough energy to shatter their bonds.

    The Efimov molecule experiments were done at extremely low temperatures—50 billionths of a degree above absolute zero—and under the influence of a strong magnetic field, which is used to control the interaction of the atoms. When the field strength is in a particular, narrow range, the interaction between atoms intensifies and molecules form. By analyzing the precise conditions in which formation occurs, scientists can infer the size of the molecules.

    But controlling the magnetic field precisely enough to make the measurements Johansen sought is extremely difficult. Even heat generated by the electric current used to create the field was enough to change that field, making it hard to reproduce in experiments. The field could fluctuate at a level of only one part in a million—a thousand times weaker than the Earth’s magnetic field—and Johansen had to stabilize it and monitor how it changed over time.
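
    To put that stability requirement in rough numbers (an illustrative estimate; the article does not give the actual bias field), caesium Feshbach experiments typically run at bias fields of a few hundred gauss, and Earth’s field is about 0.5 gauss, so

    $$\Delta B \sim 10^{-6} \times 500\ \mathrm{G} = 0.5\ \mathrm{mG} \approx \frac{0.5\ \mathrm{G}}{1000},$$

    i.e. the tolerated drift is roughly a thousandth of Earth’s magnetic field, consistent with the description above.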

    The key was a technique he developed to probe the field using microwave electronics and the atoms themselves.

    “I consider what Jacob did a tour de force,” Chin said. “He can control the field with such high accuracy and perform very precise measurements on the size of these Efimov molecules and for the first time the data really confirm that there is a significant deviation of the universality.”

    The new findings have important implications for understanding the development of complexity in materials. Normal materials have diverse properties, which could not have arisen if their behavior at the quantum level was identical. The three-body Efimov system puts scientists right at the point at which universal behavior disappears.

    “Any quantum system made with three or more particles is a very, very difficult problem,” Chin said. “Only recently do we really have the capability to test the theory and understand the nature of such molecules. We are making progress toward understanding these small quantum clusters. This will be a building block for understanding more complex material.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

     
  • richardmitnick 10:47 am on June 25, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Deep Carbon Observatory (DCO) Summer School, Studying Yellowstone by Integrating Deep Carbon Science, Yellowstone’s tectonic magmatic hydrothermal and microbial processes and their controls on carbon dioxide flux

    From Eos: “Studying Yellowstone by Integrating Deep Carbon Science” 

    AGU bloc

    AGU
    Eos news bloc

    Eos

    23 June 2017
    Shaunna M. Morrison
    Mattia Pistone
    Lukas Kohl

    Second Deep Carbon Observatory Summer School; Yellowstone National Park, Montana and Wyoming, 23–28 July 2016.

    Phormidium, a genus of orange, carotenoid-producing cyanobacteria, thrives in the outflow of Yellowstone’s Grand Prismatic hot spring. Deep Carbon Observatory (DCO) Summer School participants studied the conditions that are conducive to microbial life using published data and measurements acquired in Yellowstone National Park. Credit: Heidi Needham.

    Yellowstone National Park is a fascinating natural laboratory for geoscientists and biologists alike. Its steaming geysers and hot springs have been extensively studied to characterize the underlying hydrothermal activity. Scientists have also focused on microbial mat populations in extreme and hostile ecological niches with temperatures near boiling and pH from less than 1 to greater than 9. Yet little is known about the source of Yellowstone’s highly variable carbon fluxes.

    With this in mind, 38 early-career geologists, geochemists, microbiologists, and informaticians from 16 countries ventured to Yellowstone National Park for the Second Deep Carbon Observatory (DCO) Summer School in July 2016. Their goal was to study the complex interplay between the geosphere and biosphere, the effect of this interplay on the carbon-containing gases emitted by the Yellowstone volcanic system, and influences of high- and low-temperature fluids on microbial habitability through time and space.

    Deep Carbon Observatory Summer School participants study a hot spring in Yellowstone National Park to observe the brightly colored microbial colonies that thrive in this extreme environment. Credit: Katie Pratt.

    The DCO Yellowstone short course consisted of three components:

    Fieldwork: Participants studied rock unit relationships, microbial mat communities, and hydrothermal fluid chemistry, and they made in situ carbon dioxide flux measurements.

    Classroom: Experts lectured and led discussions on the deep carbon cycle, extreme microbial systems, mineral evolution, the origin of life, geochemistry of gas fluxes, and fluid-rock interactions.

    Science presentations: Students presented their current research as fast-paced 1-minute lightning talks, followed by a poster session. Student abstracts can be found on the DCO website.

    Interdisciplinary and integrative science is essential to understanding complex systems: the ecology of extreme environments, intracontinental volcanism, and the deep carbon cycle. Participants faced the challenge of reconciling differences not only in subject matter but in temporal and spatial scales across their widely varying scientific domains. By the end of the session, DCO Summer School participants had integrated differing concepts of time and depth, fields of study, and technical experience to examine Yellowstone’s tectonic, magmatic, hydrothermal, and microbial processes and their controls on carbon dioxide flux.

    At the summer school, participants learned about the geologic temperature (T) and pressure (P) regimes that can support microbial life (white areas). Credit: Mattia Pistone.

    Using published data and measurements acquired in the field, DCO Summer School scientists conducted a study on the conditions suitable for microbial life on Earth during the short course. The understanding they gained about the regimes in which life can interact with geologic materials and processes will enable these researchers to deepen their scientific studies and recognize fruitful cross-disciplinary collaborations.

    The DCO Summer School participants are continuing to build on the work they began in Yellowstone by asking questions that address long-standing unknowns in the scientific community. Such questions include the following: What are the sources and timing of the accumulation of Earth’s volatiles in Yellowstone? What are the geochemical and geophysical contexts of organic compound synthesis that predated the emergence of life? How deep is life found in the Earth’s interior? What are the fluid flux conditions that sustain life, as well as the hydrosphere and atmosphere, on Earth?

    The DCO is an interconnected community, and Summer School participants provide research updates at annual meetings, on the DCO website, and to various DCO committees. The authors thank the DCO Summer School organizers, instructors, and fellow participants as well as the DCO, American Geosciences Institute, the Center for Dark Energy Biosphere Investigations, Nano-Tech, and MO BIO Laboratories for their support.

    The Grand Prismatic hot spring in Yellowstone National Park is one of the largest hot springs in the world. It owes its brilliant color gradient to changes in microbial populations with temperature. Credit: Daniel Petrash.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 10:29 am on June 25, 2017 Permalink | Reply
    Tags: Applied Research & Technology

    From SUNY Buffalo: “Near instantaneous evolution discovered in bacteria” 

    SUNY Buffalo

    June 23, 2017
    Grove Potter

    An illustration of bacteria. No image credit.

    Discovery leads to $1.28 million grant to investigate mechanisms involved.

    How fast does evolution occur?

    In certain bacteria, it can occur almost instantaneously, a University at Buffalo molecular biologist has discovered.

    Mark R. O’Brian, PhD, chair and professor of the Department of Biochemistry in the Jacobs School of Medicine and Biomedical Sciences at UB, made the surprising discovery when studying how bacteria find and draw iron into themselves. The National Institutes of Health has awarded him a $1.28 million, four-year grant to delve into the mechanisms by which bacteria mutate to take up iron, and how the organism expels excess iron.

    The discovery was made almost by accident, O’Brian said. The bacterium Bradyrhizobium japonicum was placed in a medium along with a synthetic compound that extracted all the iron. O’Brian expected the bacteria to lie dormant, having been deprived of the iron needed to multiply. But to his surprise, the bacteria started multiplying.

    “We had the DNA of the bacteria sequenced on campus, and we discovered they had mutated and were using the new compound to take iron in to grow,” he said. “It suggests that a single mutation can do that. So we tried it again with a natural iron-binding compound, and it did it again.”

    The speed of the genetic mutations — 17 days — was astounding.

    “We usually think of evolution taking place over a long period of time, but we’re seeing evolution — at least as the ability to use an iron source that it couldn’t before — occurring as a single mutation in the cell that we never would have predicted,” he said.

    “The machinery to take up iron is pretty complicated, so we would have thought many mutations would have been required for it to be taken up,” he said.

    The evolution of the bacteria does not mean it is developing into some other type of creature. Evolution can also change existing species “to allow them to survive,” O’Brian said.

    Bacteria, the most abundant life form on the planet, have been around for 3 billion years, evolving and adapting. So how big is the discovery of near instantaneous evolution?

    “It will depend on how broadly applicable it is,” O’Brian said. “Can we characterize the mechanisms, and look around and see if they are in other systems? How does this affect bacterial communities? How important is it for human health?”

    O’Brian said other researchers may take up work on how the new knowledge could impact human health.

    The mutation may not be related to how bacteria become resistant to antibiotics. The mutation that O’Brian observed resulted in a “gain of function,” a much more complicated event than the adaptation to block an antibiotic, he said.

    Organisms can adapt by switching genes on and off. Part of O’Brian’s grant is to study how bacteria expel excess iron by turning on different genes.

    The work now is “strictly scientific,” but uses could be in the offing.

    “There is the understanding of a mechanism that may help to better understand how you can approach an infectious disease, or approach remediation of the environment using bacteria,” O’Brian said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SUNY Buffalo Campus

    UB is a premier, research-intensive public university and a member of the Association of American Universities. As the largest, most comprehensive institution in the 64-campus State University of New York system, our research, creative activity and people positively impact the world.

     
  • richardmitnick 3:33 pm on June 24, 2017 Permalink | Reply
    Tags: Applied Research & Technology

    From MIT: “Toward mass-producible quantum computers” 

    MIT News

    MIT Widget

    May 26, 2017 [Always glad to find something I missed.]
    Larry Hardesty

    A team of researchers from MIT, Harvard University, and Sandia National Laboratories reports a new technique for creating targeted defects in diamond materials, which is simpler and more precise than its predecessors and could benefit diamond-based quantum computing devices.

    Quantum computers are experimental devices that offer large speedups on some computational problems. One promising approach to building them involves harnessing nanometer-scale atomic defects in diamond materials.

    But practical, diamond-based quantum computing devices will require the ability to position those defects at precise locations in complex diamond structures, where the defects can function as qubits, the basic units of information in quantum computing. In today’s issue of Nature Communications, a team of researchers from MIT, Harvard University, and Sandia National Laboratories reports a new technique for creating targeted defects, which is simpler and more precise than its predecessors.

    In experiments, the defects produced by the technique were, on average, within 50 nanometers of their ideal locations.

    “The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it,” says Dirk Englund, an associate professor of electrical engineering and computer science who led the MIT team. “We’re almost there with this. These emitters are almost perfect.”

    The new paper has 15 co-authors. Seven are from MIT, including Englund and first author Tim Schröder, who was a postdoc in Englund’s lab when the work was done and is now an assistant professor at the University of Copenhagen’s Niels Bohr Institute. Edward Bielejec led the Sandia team, and physics professor Mikhail Lukin led the Harvard team.

    Appealing defects

    Quantum computers, which are still largely hypothetical, exploit the phenomenon of quantum “superposition,” or the counterintuitive ability of small particles to inhabit contradictory physical states at the same time. An electron, for instance, can be said to be in more than one location simultaneously, or to have both of two opposed magnetic orientations.

    Where a bit in a conventional computer can represent zero or one, a “qubit,” or quantum bit, can represent zero, one, or both at the same time. It’s the ability of strings of qubits to, in some sense, simultaneously explore multiple solutions to a problem that promises computational speedups.
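
    In the standard notation (general background, not specific to this paper), a qubit state is a weighted combination of both classical values at once,

    $$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

    and a register of n such qubits carries amplitudes over all 2^n bit strings simultaneously, which is the sense in which strings of qubits can “explore multiple solutions to a problem.”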

    Diamond-defect qubits result from the combination of “vacancies,” which are locations in the diamond’s crystal lattice where there should be a carbon atom but there isn’t one, and “dopants,” which are atoms of materials other than carbon that have found their way into the lattice. Together, the dopant and the vacancy create a dopant-vacancy “center,” which has free electrons associated with it. The electrons’ magnetic orientation, or “spin,” which can be in superposition, constitutes the qubit.

    A perennial problem in the design of quantum computers is how to read information out of qubits. Diamond defects present a simple solution, because they are natural light emitters. In fact, the light particles emitted by diamond defects can preserve the superposition of the qubits, so they could move quantum information between quantum computing devices.

    Silicon switch

    The most-studied diamond defect is the nitrogen-vacancy center, which can maintain superposition longer than any other candidate qubit. But it emits light in a relatively broad spectrum of frequencies, which can lead to inaccuracies in the measurements on which quantum computing relies.

    In their new paper, the MIT, Harvard, and Sandia researchers instead use silicon-vacancy centers, which emit light in a very narrow band of frequencies. They don’t naturally maintain superposition as well, but theory suggests that cooling them down to temperatures in the millikelvin range — fractions of a degree above absolute zero — could solve that problem. (Nitrogen-vacancy-center qubits require cooling to a relatively balmy 4 kelvins.)

    To be readable, however, the signals from light-emitting qubits have to be amplified, and it has to be possible to direct them and recombine them to perform computations. That’s why the ability to precisely locate defects is important: It’s easier to etch optical circuits into a diamond and then insert the defects in the right places than to create defects at random and then try to construct optical circuits around them.

    In the process described in the new paper, the MIT and Harvard researchers first planed a synthetic diamond down until it was only 200 nanometers thick. Then they etched optical cavities into the diamond’s surface. These increase the brightness of the light emitted by the defects (while shortening the emission times).

    Then they sent the diamond to the Sandia team, who have customized a commercial device called the Nano-Implanter to eject streams of silicon ions. The Sandia researchers fired 20 to 30 silicon ions into each of the optical cavities in the diamond and sent it back to Cambridge.

    Mobile vacancies

    At this point, only about 2 percent of the cavities had associated silicon-vacancy centers. But the MIT and Harvard researchers have also developed processes for blasting the diamond with beams of electrons to produce more vacancies, and then heating the diamond to about 1,000 degrees Celsius, which causes the vacancies to move around the crystal lattice so they can bond with silicon atoms.

    After the researchers had subjected the diamond to these two processes, the yield had increased tenfold, to 20 percent. In principle, repetitions of the processes should increase the yield of silicon vacancy centers still further.

    When the researchers analyzed the locations of the silicon-vacancy centers, they found that they were within about 50 nanometers of their optimal positions at the edge of the cavity. That translated to emitted light that was about 85 to 90 percent as bright as it could be, which is still very good.

    “It’s an excellent result,” says Jelena Vuckovic, a professor of electrical engineering at Stanford University who studies nanophotonics and quantum optics. “I hope the technique can be improved beyond 50 nanometers, because 50-nanometer misalignment would degrade the strength of the light-matter interaction. But this is an important step in that direction. And 50-nanometer precision is certainly better than not controlling position at all, which is what we are normally doing in these experiments, where we start with randomly positioned emitters and then make resonators.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 2:29 pm on June 24, 2017 Permalink | Reply
    Tags: Applied Research & Technology, SRMs - Standard Reference Materials

    From NIST: “Measurements Matter – How NIST Reference Materials Affect You” 

    NIST

    June 13, 2017
    Fran Webber

    In 2012, Consumer Reports announced startling findings—with potentially serious public health ramifications.

    The publication investigated arsenic levels in apple juice and rice and found levels of the toxin above those allowed in water by the Environmental Protection Agency. The articles pointed out that there were no rules about allowable levels for arsenic in food.

    The Food and Drug Administration responded by issuing a limit for arsenic levels in apple juice and, in 2016, for infant rice cereal. But the damage was already done.

    It’s a funny quirk of human psychology: we take the most important things for granted—until it all goes wrong.

    You probably don’t often question whether the food you buy in the grocery store is safe. Or if the lab where your doctor sends your samples accurately calculated your vitamin D levels.

    But imagine, for a moment, how much more difficult it would be to go about your daily life if you didn’t have the information those measurements provide.

    How would you decide what is safe and healthy to eat? How would you know if you were getting enough vitamin D or if your cholesterol levels were too high?

    That’s one of the big reasons NIST exists—to reduce uncertainty in our measurements and increase your confidence in the information you use to make important decisions in your daily life.

    And part of the way NIST does that is through Standard Reference Materials (SRMs).

    Standard Reference … what?

    The government has acronyms for seemingly everything. At NIST, one even has a registered trademark: SRM® is the “brand name” of our certified reference materials, the generic term for these vital tools. Many other organizations measure and distribute certified reference materials, but only NIST has SRMs.

    So what exactly is an SRM or certified reference material?


    NIST chemist Bob Watters provides an overview of how NIST’s standard reference materials, ranging from metal alloys to cholesterol samples, have helped industry make reliable measurements since the earliest days of the agency.

    It can be difficult to explain, because SRMs are actually a lot of different things. In fact, NIST sells more than 1,000 different types of SRMs, from gold nanoparticles to peanut butter.

    NIST has very carefully studied each of its SRMs, and it’s these characterizations, rather than the materials themselves, that customers pay for. SRMs serve a variety of purposes but are mostly used by other labs and members of industry to check their analytical measurements and to perform other kinds of quality-control tests.

    Steve Choquette, director of NIST’s Office of Reference Materials, says SRMs are like widgets, tools that provide a service or help you complete a task. In this case, SRMs give manufacturers access to a level of measurement accuracy they wouldn’t otherwise be able to obtain.

    “What an SRM really does is give our customers the highest quality measurements in a form they can easily use,” Choquette says.

    Peanut butter—SRM 2387—is an excellent example. NIST scientists know exactly how much fat, salt, sugar and other nutrients are in the peanut butter, and they’ve recorded those amounts on a certificate that’s sold with the SRM. When an SRM user measures the NIST peanut butter with his or her own instrument, he or she should get amounts that match the certificate. If not, the manufacturer knows the machine must be adjusted.
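
    The quality-control loop described above (measure the SRM, compare against the certificate, adjust if the two disagree) is simple enough to express directly. The Python sketch below uses invented certificate values and uncertainties (not the real SRM 2387 certificate) to show one common acceptance test: flag an analyte when the measured value differs from the certified value by more than the expanded combined uncertainty.

    ```python
    # Illustrative instrument check against certified reference values.
    # The certificate numbers below are invented, not the real SRM 2387 values.
    from math import sqrt

    # analyte: (certified value, standard uncertainty), in g per 100 g
    certificate = {
        "fat":   (50.0, 1.0),
        "sugar": (9.0,  0.4),
        "salt":  (1.2,  0.1),
    }

    # analyte: (measured value, standard uncertainty) from the lab's own instrument
    measured = {
        "fat":   (51.5, 0.8),
        "sugar": (9.1,  0.3),
        "salt":  (1.6,  0.1),
    }

    def in_control(cert, meas, k=2.0):
        """Pass if |measured - certified| <= k times the combined standard uncertainty."""
        (v_c, u_c), (v_m, u_m) = cert, meas
        return abs(v_m - v_c) <= k * sqrt(u_c ** 2 + u_m ** 2)

    for analyte in certificate:
        verdict = "OK" if in_control(certificate[analyte], measured[analyte]) else "ADJUST INSTRUMENT"
        print(f"{analyte:5s}: {verdict}")
    ```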

    NIST is a nonregulatory agency, which means it doesn’t set the rules for things like food and water safety. However, manufacturers frequently use NIST standards such as SRMs because they are a reliable, science-based means of demonstrating compliance with the rules set by regulatory agencies.

    Does your food measure up?

    Like the peanut butter SRM, many NIST SRMs are food products. These SRMs help the food industry comply with various U.S. food regulations such as those requiring nutrition facts labels. Regulators can be sure those labels are accurate when producers use SRMs to ensure their measurement instruments are properly calibrated.

    In the lab, Joe Katzenmeyer, senior scientist and strategic analytical manager at Land O’Lakes, uses the SRMs for nonfat milk powder, infant formula and meat homogenate (a canned pork and chicken mix).

    “We most often use NIST SRMs when developing a new testing procedure, and we need to know that a result is the ‘correct’ result,” Katzenmeyer said. “NIST values are established through a very thorough process and by labs across the country. This gives a high credibility to their established values.”

    And that’s how you can be confident in the nutrition facts labels, too, so you can make healthy decisions about what to eat.

    NIST SRM 2385, spinach. Credit: K. Irvine/NIST

    But NIST food SRMs don’t just help you accurately count your carbs.

    Remember the concern about arsenic in apple juice and rice? NIST already had a rice flour SRM, but NIST researchers recently added measurements for different types of arsenic. And, NIST is in the process of making an SRM for apple juice that will include levels for various forms of arsenic as well. Government agencies, like the Food and Drug Administration, can use these SRMs to ensure that arsenic levels in the foods we eat are safe.

    And both health and safety are driving forces behind another type of NIST SRMs—those for dietary supplements.

    Marketers can make some pretty strong claims about their products. But do so-called “superfoods” like green tea or blueberries live up to the hype? The first step in finding out is to carefully measure the properties of these foods.

    That’s why NIST makes SRMs for green tea and blueberries, as well as multivitamins, St. John’s Wort and Ginkgo biloba, among others.

    A medical measurement marvel

    Nearly 74 million Americans have high levels of LDL cholesterol—that’s the bad kind. Those with high cholesterol have twice the risk of heart disease as those with normal levels.

    Keeping tabs on your cholesterol can be a matter of life and death. So, when you or your loved one goes to the doctor’s office to give a blood sample, how do you know the result you get is right?

    If you’re thinking it’s because of NIST SRMs, you’d be right! NIST sells a number of SRMs that lab techs use to calibrate clinical laboratory equipment.

    But SRMs don’t just help maintain the status quo. They also help drive innovation.

    A new SRM for monoclonal antibodies—a large class of drugs for treating cancer and autoimmune diseases, among other things—could make these life-saving treatments more widely available.

    Monoclonal antibodies are large protein molecules designed to bind to disease-causing cells or proteins, triggering a patient’s immune system to attack and clear them from the body. Sales of these drugs in the U.S. reached $50 billion in 2015.


    NIST’s monoclonal antibody reference material, NIST RM 8671, is shipped in cryovials packaged in dry ice. It should be stored in a frozen state at -80 °C (-112 °F). Shown is a sample that underwent extensive round-robin testing by more than 100 collaborators before the biological material, donated by MedImmune, was certified as a NIST RM. Credit: NIST

    Manufacturing a monoclonal antibody drug on a large scale is complex and involves the use of genetically engineered cells that churn out large quantities of the molecule. Testing to make sure that the molecules are being made correctly happens at many points in the manufacturing process. The NIST SRM is an important tool for assuring the quality of these test methods and of the final product.

    And, since patents on many monoclonal antibodies are set to expire in the next several years, many anticipate a growing market for biosimilar—or generic—versions of the drugs. Generics could save patients billions of dollars by 2020.

    But, this will mean a lot of testing and measurements to determine whether these generic versions are nearly identical to the branded versions. The NIST monoclonal antibody SRM could help with measurement challenges faced by researchers tasked with testing these drugs.

    Taking measurements to court

    In 1978, Michael Hanline was found guilty of murder in California. But Hanline always said he was innocent. Eventually, the California Innocence Project at California Western School of Law took up his case, and through DNA analysis, showed that Hanline was not the source of DNA found on key evidence.

    Hanline spent 36 years in prison. He is the longest-serving wrongfully convicted person in California history.

    When Hanline was convicted, the ability to evaluate DNA evidence didn’t yet exist. But today, it’s not uncommon to hear of cases where DNA evidence makes or breaks the case. And not just to exonerate the innocent. Far more often, DNA evidence helps law enforcement put away the right people the first time.

    NIST forensic DNA SRMs are crucial to this process. They help make sure that labs conducting forensic DNA analysis obtain accurate results. The Federal Bureau of Investigation requires that forensic DNA testing laboratories meet certain quality assurance standards. Labs must check their processes with a NIST SRM (or a reference material that traces back to NIST) every year or anytime they make substantial changes to their protocol.

    “The NIST DNA SRM we use in our lab is essential to ensure our analyses are reliable,” said Todd Bille, DNA technical leader at the Bureau of Alcohol, Tobacco, Firearms and Explosives. “With all the advances in the forensic community, NIST SRM 2391c is the only set of DNA samples that has what we need to make sure the analyses function properly in our hands. Our lab is also constantly evaluating new methods to handle DNA. Having this set of standard DNA samples allows us to be sure new methods don’t adversely affect the results.”

    Cementing quality control

    First of all, John Sieber wants you to know: There’s a difference between cement and concrete.

    “People get the two mixed up,” says Sieber, a NIST research chemist. “Cement is what you have before, and then you mix it with water and sand and gravel—aggregate, they call it—and you pour it into your sidewalk and it hardens through a chemical reaction and becomes concrete.”

    NIST researcher John Sieber, concrete SRM development. Credit: copyright Earl Zubkoff

    Though you may have never given it a second thought, you no doubt interact with concrete on a daily basis as you drive to work, park your car in a garage, walk along the sidewalk to your office and sit at your desk in a high-rise building.

    “The human race is trying to cover the planet in concrete,” Sieber jokes.

    To make sure their product can withstand the tests of time, wear and weather, cement makers conform to certain quality standards. During the manufacturing process, cement makers test their products hourly. NIST SRMs are crucial to letting manufacturers know the results of their tests are accurate—and that they’re creating a high-quality product.

    NIST sells 15 cement—not concrete—SRMs that help manufacturers ensure their products meet certain quality standards and help buyers know they’re getting what they paid for.
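
    As a rough illustration of how an SRM anchors this kind of quality control, the sketch below checks a lab's measured oxide mass fractions against certified values within their stated uncertainties. The analytes, certified values, and acceptance rule are made-up placeholders under a simple assumed tolerance, not values from any actual NIST cement SRM certificate.

```python
# Hypothetical SRM-based check of a cement analysis method. The certified
# values and uncertainties are made-up placeholders, not data from an
# actual NIST cement SRM certificate.

CERTIFICATE = {
    # analyte: (certified mass fraction in %, expanded uncertainty in %)
    "SiO2": (20.6, 0.2),
    "CaO": (63.5, 0.4),
    "Al2O3": (4.9, 0.1),
}

def within_tolerance(analyte, measured, k=2.0):
    """Accept a measurement if it lies within k times the certified
    uncertainty of the certified value (a simple assumed acceptance rule)."""
    certified, uncertainty = CERTIFICATE[analyte]
    return abs(measured - certified) <= k * uncertainty

if __name__ == "__main__":
    lab_result = {"SiO2": 20.7, "CaO": 62.4, "Al2O3": 4.95}
    for analyte, value in lab_result.items():
        status = "OK" if within_tolerance(analyte, value) else "CHECK METHOD"
        print(f"{analyte}: measured {value}% -> {status}")
```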

    5
    NIST researchers in CAVE 3D Visualization lab exploring the movement of concrete particles. Credit: copyright Earl Zubkoff

    Standards of excellence

    To tell the story of SRMs is to tell the story of industry in America—its breakthroughs and its setbacks. From the turn of the 20th century onward, NIST stood with American makers as they erected skyscrapers, laid railways and took to the skies in airplanes. NIST helped manufacturers overcome technical challenges they faced in bringing innovative technology to the American people.

    In 1905, NIST—then known as the National Bureau of Standards—began preparing and distributing the first SRMs, standardized samples of iron, which manufacturers used as a check on their lab analyses. From those early standard samples, the program grew.

    Today, NIST still sells versions of these original SRMs, but it has come a long way. The diverse array of SRMs currently available reflects the complexity and technological advancement of a 21st-century society—and the new challenges it faces.

    NIST constantly works to improve its existing SRMs to adapt to changing needs, such as the arsenic levels added to the rice flour SRM, or the anthocyanin measurements now being added to the blueberry SRM (anthocyanins are a type of flavonoid, or pigment, that contributes to blueberries’ antioxidant properties). And NIST is always looking for opportunities to create new SRMs to drive innovation in emerging markets, like the monoclonal antibody SRM for biopharmaceutical manufacturers.

    “Good science is our carrot,” Choquette says.

    Speaking of carrots, we’ve got an SRM for that.

    To learn more about NIST’s Standard Reference Materials, visit http://www.nist.gov.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 2:00 pm on June 24, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Improved understanding of a widely used 'thermometer' for Earth's ancient oceans

    From EMSL: “Improved understanding of a widely used ‘thermometer’ for Earth’s ancient oceans” 

    EMSL

    June 16, 2017
    Tom Rickey
    tom.rickey@pnnl.gov
    (509) 375-3732

    1
    Foraminifera – a key to understanding ancient Earth. Credit: Jennifer Fehrenbacher/Oregon State University

    Scientists have improved our ability to interpret one of the most common measures of the temperature of Earth’s oceans in the distant past.

    The measurement is based on the ancient remains of tiny marine organisms called foraminifera, a type of plankton that lives and feeds in water.

    The organisms use calcium and magnesium from seawater to help form their shells – more magnesium when ocean temperatures are warmer and less when the temperatures are cooler. But magnesium levels can vary significantly within individual shells, and scientists have been exploring why.

    In a paper published recently in Nature Communications, scientists explain that changes in light levels from daytime to nighttime can cause the organisms to vary how they build their shells, which plays a direct role in determining the levels of magnesium in the shells. The information gives scientists a better understanding of the biological processes involved when using this plankton-based temperature gauge to assess past ocean conditions.
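
    For context, paleoceanographers commonly convert shell Mg/Ca ratios to temperature with an exponential calibration of the general form Mg/Ca = B * exp(A * T). The sketch below inverts such a relationship to estimate temperature; the calibration constants are placeholders of roughly the magnitude reported for planktic foraminifera, not values taken from this study.

```python
import math

# Generic exponential Mg/Ca paleotemperature calibration:
#   Mg/Ca = B * exp(A * T)   =>   T = ln((Mg/Ca) / B) / A
# The constants below are placeholders of roughly the magnitude published
# for planktic foraminifera, not values from the paper described here.
A = 0.09   # per degree Celsius
B = 0.38   # mmol/mol

def temperature_from_mg_ca(mg_ca):
    """Estimate calcification temperature (deg C) from a shell Mg/Ca ratio
    given in mmol/mol."""
    return math.log(mg_ca / B) / A

if __name__ == "__main__":
    for ratio in (1.5, 2.5, 4.0):   # example shell Mg/Ca values, mmol/mol
        print(f"Mg/Ca = {ratio:.1f} mmol/mol -> ~{temperature_from_mg_ca(ratio):.1f} deg C")
```

    Intrashell banding matters here because the ratio fed into a relation like this is an average over a shell whose Mg content varies band by band; understanding what drives the banding helps constrain how that average should be interpreted.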

    The project was led by Jennifer Fehrenbacher of Oregon State University and also included scientists from UC Davis, the University of Washington, and EMSL, the Environmental Molecular Sciences Laboratory, a Department of Energy Office of Science User Facility at the Pacific Northwest National Laboratory. The team included John B. Cliff III and Zihua Zhu from EMSL and PNNL.

    Earlier from EMSL:

    Daily Light/Dark Cycle Controls Patterns within Marine Protist Shells

    The trace element composition of the calcite shells of foraminifera, sand grain-sized marine protists, is commonly used to reconstruct the history of ocean conditions in Earth’s past. A recent study explored environmental and biological factors that control the compositional variability of the element magnesium (Mg), which is used to reconstruct past ocean temperature.

    The Impact

    These findings suggest the same light-triggered mechanism is responsible for Mg banding in two species that occupy different ecological niches in the ocean, and that Mg variability is an integral component of shell-building processes in planktic foraminifera. The experimental results will be used to update a 70-year-old model of foraminifera shell development and could be used to develop more accurate methods for assessing past ocean conditions.

    Summary

    The relationship between seawater temperature and the average magnesium-to-calcium (Mg/Ca) ratio in planktic foraminifera is well established, providing an essential tool for reconstructing past ocean temperatures. However, the mechanism responsible for variability in the trace element composition within individual shells is poorly understood. In particular, many species display alternating high and low Mg bands within their shell walls that cannot be explained by temperature alone. Recent experiments demonstrate that intrashell Mg variability in Orbulina universa, which forms a spherical terminal shell, is paced by the daily light and dark cycle. Whether Mg heterogeneity is also controlled by the light and dark cycle in species with more complex shell structures was previously unknown.

    To address this knowledge gap, a team of researchers from Oregon State University; University of California, Davis; University of Washington; and EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science User Facility, combined culture techniques and high-resolution NanoSIMS imaging to show that high-Mg/Ca calcite forms at night (in dark conditions) in cultured specimens of the multi-chambered species Neogloboquadrina dutertrei. The results also demonstrate that N. dutertrei adds a significant amount of calcite, as well as nearly all Mg bands, after the final chamber forms.

    These results have implications for interpreting patterns of calcification in N. dutertrei, and suggest daily Mg banding is an intrinsic component of biomineralization in planktic foraminifera, likely modified by growth conditions. Moreover, the findings suggest the overall Mg content of the shell is primarily controlled by temperature, while the amplitude of the intrashell banding, which is triggered by a light response, is modulated by pH. By shedding light on mechanisms that control Mg variability in the shells of diverse planktic foraminifera, the findings could lead to improved methods for reconstructing past ocean conditions.

    PI Contact

    Jennifer S. Fehrenbacher
    Oregon State University
    fehrenje@coas.oregonstate.edu

    EMSL Contacts

    Zihua Zhu
    EMSL
    zihua.zhu@pnnl.gov

    John Cliff
    EMSL
    john.cliff@pnnl.gov

    Funding

    This work was supported by the U.S. Department of Energy’s Office of Science (Office of Biological and Environmental Research), including support of the Environmental Molecular Sciences Laboratory (EMSL), a DOE Office of Science User Facility; and the U.S. National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    EMSL campus

    Welcome to EMSL. EMSL is a national scientific user facility that is funded and sponsored by DOE’s Office of Biological & Environmental Research. As a user facility, our scientific capabilities – people, instruments and facilities – are available for use by the global research community. We support BER’s mission to provide innovative solutions to the nation’s environmental and energy production challenges in areas such as atmospheric aerosols, feedstocks, global carbon cycling, biogeochemistry, subsurface science and energy materials.

    A deep understanding of molecular-level processes is critical to gaining a predictive, systems-level understanding of the impacts of aerosols and terrestrial systems on climate change; making clean, affordable, abundant energy; and cleaning up our legacy wastes. Visit our Science page to learn how EMSL leads in these areas, through our Science Themes.

    Team’s in Our DNA. We approach science differently than many institutions. We believe in – and have proven – the value of drawing together members of the scientific community and assembling the people, resources and facilities to solve problems. It’s in our DNA, since our founder Dr. Wiley’s initial call to create a user facility that would facilitate “synergism between the physical, mathematical, and life sciences.” We integrate experts across disciplines; experiment with theory; and our user program proposal calls with other user facilities.

    We proudly provide an enriched, customized experience that allows users to connect with our people and capabilities in an environment where we focus on solving problems. We collaborate with researchers from academia, government labs and industry, and from nearly all 50 states and from other countries.

     
  • richardmitnick 3:51 pm on June 23, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Biomass, Capturing methane gas, Living together a methane-loving methanotroph and a photosynthetic cyanobacterium, Methylomicrobium alcaliphilum 20Z, the cyanobacteria absorb light and use carbon dioxide as fuel to produce oxygen fueling the methane-munching bacteria, The cyanobacteria Synechococcus species 7002

    From EMSL: “From moo to goo: Cooperating microbes convert methane to alternative fuel source” 

    EMSL

    April 12, 2017 [Now in social media, better late than never.]
    Tom Rickey

    1
    Hans Bernstein holds a test tube full of liquid biomass created from methane-rich gases, including biogas from dairy farms and natural gas from oil wells. No image credit.

    Oil and gas wells and even cattle release methane gas into the atmosphere, and researchers are working on ways to not only capture this gas but also convert it into something useful and less-polluting.

    Now scientists at the Department of Energy’s Pacific Northwest National Laboratory have developed a new system to convert methane into a deep green, energy-rich, gelatin-like substance that can be used as the basis for biofuels and other bioproducts, specialty chemicals — and even feed for cows that create the gas in the first place.

    “We take a waste product that is normally an expense and upgrade it to microbial biomass which can be used to make fuel, fertilizer, animal feed, chemicals and other products,” said Hans Bernstein, corresponding author of a recent paper in Bioresource Technology.

    Methane is an unavoidable byproduct of our lifestyle. Manure from dairy cows, cattle and other livestock that provide us food often breaks down into methane. Drilling processes used to obtain the oil and natural gas we use to drive our cars and trucks or heat our homes often vent or burn off excess methane to the atmosphere, wasting an important energy resource.

    A tale of two microbes

    PNNL scientists approached the problem by getting two very different micro-organisms to live together in harmony.

    One is a methane-loving methanotroph, found underground near rice paddies and landfills — where natural methane production typically occurs. The other is a photosynthetic cyanobacterium that resembles algae. Originally cultured from a lake in Siberia, it uses light along with carbon dioxide to produce oxygen.

    The two aren’t usually found together, but they co-exist in harmony in a bioreactor at PNNL—thanks to a co-culture system created by Leo Kucek, Grigoriy E. Pinchuk, and Sergey Stolyar, as well as Eric Hill and Alex Beliaev, two authors of the current paper.

    PNNL scientist Hans Bernstein collected methane gas from a Washington dairy farm and Colorado oil fields and fed it to the microbes in the bioreactor.

    One bacterium, Methylomicrobium alcaliphilum 20Z, ate the methane and produced carbon dioxide and energy-rich biomass made up largely of a form of carbon that can be used to produce energy.

    But Methylomicrobium alcaliphilum 20Z can’t do it alone. It needs the other micro-organism, Synechococcus species 7002, which uses light to produce the steady stream of oxygen its counterpart needs to carry out the methane-consuming reaction.

    Each one accomplishes an important task while supplying the other with a substance it needs to survive. They keep each other happy and well fed — as Bernstein puts it, they’re engaging in a “productive metabolic coupling.”

    “The two organisms complement each other, support each other,” said Bernstein. “We have created an adaptable biotechnology platform with microbes that are genetically tractable for the synthesis of biofuels and biochemicals.”

    Flick of a switch

    Agricultural and industrial biogas is typically used to generate electricity, and engineers have developed ways of upgrading biogas to compressed or liquefied natural gas. But biogas is often laden with corrosive impurities, like hydrogen sulfide, that must be removed before it can be used.

    The PNNL process produces a much cleaner product, either liquid or solid, with simply the flick of a light switch or exposure to sunlight. When there’s methane to convert, the cyanobacteria absorb light and use carbon dioxide as fuel to produce oxygen, fueling the methane-munching bacteria. When there is not much methane, researchers dim the lights, reducing the oxygen, which slows the action of the methanotrophs. In recent tests the PNNL team ran the system continuously for about two months.
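
    To make the control idea tangible, here is a toy, discrete-time simulation in which light drives cyanobacterial oxygen production and the available oxygen limits how fast the methanotroph consumes methane. It is not PNNL's bioreactor model; every rate constant and unit below is invented purely for illustration.

```python
# Toy model of the light-controlled co-culture: light drives O2 production
# by the cyanobacterium, and available O2 limits methane consumption by the
# methanotroph. Every rate constant and unit is invented for illustration.

def simulate(hours=48, light_on_until=24, dt=1.0):
    o2, ch4 = 0.0, 100.0          # arbitrary units
    history = []
    for step in range(int(hours / dt)):
        t = step * dt
        light = 1.0 if t < light_on_until else 0.1   # "dim the lights"
        o2_production = 2.0 * light                   # cyanobacterium
        uptake = 0.05 * min(o2, ch4)                  # methanotroph, O2- or CH4-limited
        o2 = max(o2 + (o2_production - uptake) * dt, 0.0)
        ch4 = max(ch4 - uptake * dt, 0.0)
        history.append((t, light, o2, ch4))
    return history

if __name__ == "__main__":
    for t, light, o2, ch4 in simulate()[::12]:
        print(f"t={t:4.0f} h  light={light:.1f}  O2={o2:6.2f}  CH4={ch4:6.2f}")
```

    In this toy run, dimming the light starves the system of fresh oxygen and methane consumption tapers off, mirroring the on/off control behavior described above.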

    “The beauty of this system is that it doesn’t matter where the methane comes from,” said Ron Thomas, deputy director of technology deployment and outreach at PNNL. “It could be agricultural waste; it could be methane from oil wells. The system can take waste from multiple waste streams and create a useful product.”

    The research was funded by DOE’s Office of Science (BER) and the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL. Cell imaging and sorting were performed at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science User Facility at PNNL.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    EMSL campus

    Welcome to EMSL. EMSL is a national scientific user facility that is funded and sponsored by DOE’s Office of Biological & Environmental Research. As a user facility, our scientific capabilities – people, instruments and facilities – are available for use by the global research community. We support BER’s mission to provide innovative solutions to the nation’s environmental and energy production challenges in areas such as atmospheric aerosols, feedstocks, global carbon cycling, biogeochemistry, subsurface science and energy materials.

    A deep understanding of molecular-level processes is critical to gaining a predictive, systems-level understanding of the impacts of aerosols and terrestrial systems on climate change; making clean, affordable, abundant energy; and cleaning up our legacy wastes. Visit our Science page to learn how EMSL leads in these areas, through our Science Themes.

    Team’s in Our DNA. We approach science differently than many institutions. We believe in – and have proven – the value of drawing together members of the scientific community and assembling the people, resources and facilities to solve problems. It’s in our DNA, since our founder Dr. Wiley’s initial call to create a user facility that would facilitate “synergism between the physical, mathematical, and life sciences.” We integrate experts across disciplines; experiment with theory; and our user program proposal calls with other user facilities.

    We proudly provide an enriched, customized experience that allows users to connect with our people and capabilities in an environment where we focus on solving problems. We collaborate with researchers from academia, government labs and industry, and from nearly all 50 states and from other countries.

     