Dedicated to spreading the Good News of Basic and Applied Science at great research institutions worldwide. Good science is a collaborative process. The rule here: Science Never Sleeps.
I am telling the reader this story in the hope of impelling them to find their own story and start a WordPress blog. We all have a story. Find yours.
The oldest post I can find for this blog is "From Fermilab Today: Tevatron is Done," from the end of 2011 (though I am not sure that was the first post, just the oldest I could find).
But the origin goes back to 1985 and Timothy Ferris' "The Creation of the Universe" (PBS, November 20, 1985), available in various videos on YouTube; then "The Atom Smashers" (PBS Frontline, November 25, 2008), centered on Fermilab, not available on YouTube; and "The Big Bang Machine," with Brian Cox of the University of Manchester and the ATLAS experiment at the LHC at CERN.
In 1993, our idiot Congress pulled the plug on the Superconducting Super Collider, a particle accelerator complex under construction near Waxahachie, Texas. Its planned ring circumference was 87.1 kilometers (54.1 mi), with an energy of 20 TeV per proton, and it was set to be the world's largest and most energetic collider. It would have greatly surpassed the current record holder, the Large Hadron Collider, which has a ring circumference of 27 km (17 mi) and reaches 6.5 TeV per proton (13 TeV per collision).
Had this project been built, the Higgs boson most probably would have been found there, not in Europe, to which the USA had ceded High Energy Physics.
(We have not really left High Energy Physics. Many of the superconducting magnets used in the LHC were built in three U.S. DOE labs: Lawrence Berkeley National Laboratory, Fermi National Accelerator Laboratory, and Brookhaven National Laboratory. Also, as noted below, the LHC experiments based many U.S. scientists at Fermilab and Brookhaven Lab.)
I have recently been told that the loss of support in Congress was caused by California pulling out, followed by several other states, because California wanted the collider built there.
The project's director was Roy Schwitters, a physicist at the University of Texas at Austin. Dr. Louis Ianniello had served as its first project director for 15 months. The project was cancelled in 1993 due to budget problems, being cited as having no immediate economic value.
Somewhere I learned that fully 30% of the scientists working at CERN were U.S. citizens. The ATLAS project had 600 people at Brookhaven Lab. The CMS project had 1,000 people at Fermilab. There were many scientists who had "gigs" at both sites.
I started digging around in CERN web sites and found Quantum Diaries, a "blog" from before there were blogs, where different scientists could post articles. I commented on a few posts, expressing my dismay about the lack of U.S. recognition in the press.
Those guys at Quantum Diaries gave me access to the Greybook, the list of every institution in the world, in several tiers, processing data for CERN. I collected all of their social media and was off to the races for CERN and other great basic and applied science.
Since then I have expanded the list of sites that I cover from all over the world. I build .html templates for each institution I cover, plop their articles, complete with all attributions and graphics, into the template, and post it to the blog. I am not a scientist, and I am not qualified to write anything or answer scientific questions. The only thing I might add is graphics where the original graphics are weak; I have a monster graphics library. Any science questions are referred back to the writer, who is told to seek answers from the real scientists in the project.
To date the blog has 900 followers across the blog itself, its Facebook fan page, and Twitter. I get my material from email lists and RSS feeds. I do not use Facebook or Twitter, which are both loaded with garbage in the physical sciences.
Scientists can now control the rate of breaking and re-forming the dihydrogen molecule.
This molecule was generated in situ by hydride abstraction in fluorobenzene. No image credit.
Hydrogen is the most abundant element in the universe. The dihydrogen molecule, with its H-H bond, is one of the simplest and most flexible in chemistry. Cleaving a dihydrogen bond to produce or store energy requires a catalyst with the perfect balance of properties to achieve the desired reactivity. In addition, the ability to get the molecule to reassemble, and to control the rate of assembly and disassembly, is important in the production of clean fuels. Morris Bullock and his colleagues at Pacific Northwest National Laboratory have achieved control over the rate of cleavage and reassembly of a dihydrogen molecule.
Why It Matters: In the continual search for clean fuel production, scientists have been investigating simple ways to heterolytically cleave the hydrogen molecule into two uneven products. Understanding the properties of heterolytic dihydrogen bond cleavage and controlling the location and energy of the resulting proton and negatively charged hydride is important for the design of new catalysts for fuel cells and other clean energy sources.
Methods: The dihydrogen bond is the simplest in chemistry, but it offers flexibility in how it is ruptured. It can be broken in two different ways: homolytically, into two identical fragments, or heterolytically, into two differently charged fragments, a proton and a hydride. Heterolytic cleavage splits the bonding electron pair unevenly between the two products. It is a common process in the use of hydrogen in fuel cells and in biological processes in nature in which enzymes oxidize hydrogen. Reverse heterolytic cleavage takes these uneven fragments and reconstructs the original structure; that is, it combines the proton and hydride to create dihydrogen.
Before this study, Bullock and his colleagues investigated how dihydrogen bonds are broken and are reformed into a dihydrogen molecule. “What we’re trying to do is find the right electronic characteristics so that the energy needed for cleavage is low,” says Bullock, a catalysis scientist.
Designing this molecule is a balancing act. In earlier iterations, the cleavage products either bonded too strongly to the catalyst or bonded too weakly. In response, PNNL scientists created a series of molybdenum-based catalysts for which the rate of H-H cleavage and reassembly could be systematically varied.
In addition, Bullock and his colleagues proved that a mechanism exists to control the rate of reversible heterolytic cleavage. Using nuclear magnetic resonance spectroscopy at PNNL, they observed the reaction as it occurred. Further, they controlled the rate of cleavage by systematically changing the electronic characteristics of the metal complexes. Some of these bonds cleave and reassemble close to 10 million times a second at room temperature. By changing the acidity of these complexes, the reversible heterolytic cleavage rate can be varied by a factor of 10,000.
What’s Next? Understanding the thermodynamic and kinetic properties of heterolytic dihydrogen bond cleavage and controlling the transfer of the proton and hydride are critically important for the design of new catalysts. The next step is determining how to achieve cleavage of the H-H bonds and control delivery of protons and hydrides after the H-H bond is broken.
PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.
richardmitnick, 8:20 pm on November 16, 2017 (Permalink)
PNNL scientists and their collaborators have identified molecules in the blood that indicate which patients with Ebola virus are most likely to have a poor outcome. (Credit: Photo courtesy of PNNL)
Scientists have identified a set of biomarkers that indicate which patients infected with the Ebola virus are most at risk of dying from the disease.
The results come from scientists at the Department of Energy’s Pacific Northwest National Laboratory and their colleagues at the University of Wisconsin-Madison, Icahn School of Medicine at Mount Sinai, the University of Tokyo and the University of Sierra Leone. The results were published online Nov. 16 in the journal Cell Host & Microbe.
The findings could allow clinicians to prioritize the scarce treatment resources available and provide them to the sickest patients, said the senior author of the study, Yoshihiro Kawaoka, a virology professor at the UW-Madison School of Veterinary Medicine.
The focus of the study was blood samples from Ebola patients obtained during the outbreak in Sierra Leone in 2014. The Wisconsin team obtained 29 blood samples from 11 patients who ultimately survived and nine blood samples from nine patients who died from the virus. The team inactivated the virus according to approved protocols, developed in part at PNNL, and then shipped the samples to PNNL and other institutions for analysis.
The team looked at activity levels of genes and proteins as well as the amounts of lipids and byproducts of metabolism. The team found 11 biomarkers that distinguish fatal infections from non-fatal ones and two that, when screened for early upon symptom onset, accurately predict which patients are likely to die.
“Our team studied thousands of molecular clues in each of these samples, sifting through extensive data on the activity of genes, proteins, and other molecules to identify those of most interest,” said Katrina Waters, the leader of the PNNL team and a corresponding author of the paper. “This may be the most thorough analysis yet of blood samples of patients infected with the Ebola virus.”
The team found that survivors had higher levels of some immune-related molecules and lower levels of others compared to those who died. Plasma cytokines, which are involved in immunity and stress response, were higher in the blood of people who perished. Fatal cases had unique metabolic responses compared to survivors, higher levels of virus, changes to plasma lipids involved in processes like blood coagulation, and more pronounced activation of some types of immune cells.
Pancreatic enzymes also leaked into the blood of patients who died, suggesting that these enzymes contribute to the tissue damage characteristic of fatal Ebola virus disease.
The scientists found that levels of two biomarkers, known as L-threonine (an amino acid) and vitamin-D-binding-protein, may accurately predict which patients live and which die. Both were present at lower levels at the time of admission in the patients who ultimately perished.
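To make the idea of an admission-time prediction rule concrete, here is a toy sketch in Python. Everything in it — the cutoff values, the patient records, and the simple "both biomarkers low" rule — is invented for illustration; the study's actual statistical models are not described here.

```python
# Illustrative sketch only: a toy two-biomarker rule on synthetic data.
# Cutoffs and patient values are invented, NOT from the study.

def predict_outcome(threonine, vdbp, threonine_cutoff=1.0, vdbp_cutoff=1.0):
    """Flag a patient as high risk when both admission-time biomarkers
    fall below their cutoffs (lower levels were seen in fatal cases)."""
    if threonine < threonine_cutoff and vdbp < vdbp_cutoff:
        return "high risk"
    return "lower risk"

# Hypothetical admission samples: (L-threonine, vitamin-D-binding-protein)
patients = {
    "patient_A": (0.4, 0.6),   # both biomarkers low
    "patient_B": (1.8, 1.5),   # both biomarkers high
    "patient_C": (0.5, 1.4),   # mixed
}

for name, (thr, vdbp) in patients.items():
    print(name, predict_outcome(thr, vdbp))
```

A real clinical rule would be validated against held-out patient data; this sketch only shows the shape such a screen could take.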
The team found that many of the molecular signals present in the blood of sick, infected patients overlap with sepsis, a condition in which the body – in response to infection by bacteria or other pathogens – mounts a damaging inflammatory reaction.
Fifteen PNNL scientists contributed to the study. Among the corresponding authors of the study are three PNNL scientists: Waters, Thomas Metz and Richard D. Smith. Three additional PNNL scientists – Jason P. Wendler, Jennifer E. Kyle and Kristin E. Burnum-Johnson – are among six scientists who share “first author” honors.
Other PNNL authors include Jon Jacobs, Young-Mo Kim, Cameron Casey, Kelly Stratton, Bobbie-Jo Webb-Robertson, Marina Gritsenko, Matthew Monroe, Karl Weitz, and Anil Shukla.
Analyses of proteins, lipids and metabolites in the blood samples were performed at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science User Facility at PNNL.
The study was funded by a Japanese Health and Labor Sciences Research Grant; by grants for Scientific Research on Innovative Areas from the Ministry of Education, Cultures, Sports, Science and Technology of Japan; by Emerging/Re-emerging Infectious Diseases Project of Japan; and by the National Institute of Allergy and Infectious Diseases, part of the National Institutes of Health. Support was also provided by the Department of Scientific Computing at the Icahn School of Medicine at Mount Sinai and by a grant from the National Institute of General Medicine.
Algae was used as the feedstock for the hydrothermal liquefaction process at PNNL. No image credit.
After conversion at PNNL, algae are separated into biocrude (top black layer) and aqueous wastewater. No image credit.
Researchers at the Department of Energy’s Pacific Northwest National Laboratory have created a continuous thermo-chemical process that produces useful biocrude from algae. The process takes just minutes, and PNNL is working with a company that has licensed the technology to build a pilot plant.
The first part of the conversion process, hydrothermal liquefaction, creates biocrude that can be upgraded to produce fuels such as gasoline, diesel, and jet fuel. It also produces a by-product wastewater stream that carries carbon and nutrients from the algae. A partnership with Washington State University researchers at the Tri-Cities campus led to a means of converting that wastewater stream to a bio-based natural gas. In this process, any remaining solid material can be recycled back into the hydrothermal liquefaction process or converted to an agricultural fertilizer.
WSU researchers are using anaerobic microbes — those that don’t need oxygen — to break down the residue in the wastewater. Having a viable way of dealing with the wastewater enhances the commercial viability of creating biocrude from algae, or even sewage sludge, as demonstrated by PNNL.
The new metal-organic framework, NU-1301, is made up of uranium oxide nodes and tricarboxylate organic linkers. Image courtesy of Northwestern University.
Two firsts in science came about because of a near-dare. According to Nigel Browning at Pacific Northwest National Laboratory, “Omar Farha was giving a presentation on MOFs [metal-organic frameworks] and someone said ‘I bet you couldn’t make one out of uranium.’” Farha took the challenge and proved them wrong. In designing the uranium-laden frameworks, PNNL scientists Dr. Nigel Browning and Dr. Layla Mehdi helped Farha and his colleagues at Northwestern University overcome a troubling bottleneck in imaging the material. Before this study, scientists used X-ray analysis and modeling to map out MOF structures, approaches that come with sharp drawbacks. Browning and Mehdi showed that low-dose imaging is a viable option for MOFs, allowing the structure to be resolved at the near-atomic level.
This collaborative effort produced two notable milestones: it was the first MOF made out of uranium, and the first time low-dose electron microscopy was used to map a MOF structure.
Recently, a PNNL-led proposal, “ExaGraph: Combinatorial Methods for Enabling Exascale Applications,” was selected as the fifth Exascale Computing Project (ECP) Co-Design Center. The center will focus on graph analytics, primarily combinatorial (graph) kernels. These kernels can access computing system resources to enhance data analytic computing applications but are among the most difficult to implement on parallel systems. Mahantesh Halappanavar, the Analytics and Algorithms Team Lead with ACMD Division’s Data Sciences group, will lead the center with Aydin Buluç, from Lawrence Berkeley National Laboratory; Erik Boman, of Sandia National Laboratories; and Alex Pothen, from Purdue University, serving as co-principal investigators.
PNNL is leading the fifth Exascale Computing Project Co-Design Center, which according to ECP leadership puts the program in “a better position to ready current and evolving data analytic computing applications for efficient use of capable exascale platforms.”
According to Halappanavar, the center will tackle developing key combinatorial algorithms arising from several exascale application domains, such as the power grid, computational chemistry, computational biology, and climate science. These applications, with their growing data volumes, create an unprecedented need for larger computational resources. This complexity will drive the selection of kernels and their integration among software tools. To start, the intent is to work with the scientists involved in related ECP projects, such as NWChemEx, to fine-tune software tools that will perform on current and future extreme-scale systems, as well as enhance scientific discovery by providing more computation and the flexibility to handle large volumes of data.
“In the end, the applications will need to benefit from the tools that incorporate the algorithms targeted for exascale architectures,” Halappanavar explained.
As part of the four-year project, the ExaGraph Co-Design Center will investigate a diversity of data analytic computational motifs, including graph traversals, graph matching and coloring, graph clustering and partitioning, parallel mixed-integer programs, and ordering and scheduling algorithms.
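Graph coloring is one of the motifs listed above. As a concrete illustration, here is a minimal serial greedy coloring in Python; this is a textbook sketch only, not ExaGraph's code, whose challenge is making such kernels scale on parallel exascale hardware.

```python
# Minimal greedy graph coloring, one of the combinatorial motifs
# targeted by ExaGraph. Serial and illustrative only.

def greedy_coloring(adjacency):
    """Assign each vertex the smallest color not used by its neighbors."""
    colors = {}
    for vertex in adjacency:
        taken = {colors[n] for n in adjacency[vertex] if n in colors}
        color = 0
        while color in taken:
            color += 1
        colors[vertex] = color
    return colors

# A 4-cycle: adjacent vertices must differ, and two colors suffice.
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(greedy_coloring(graph))
```

The parallel versions studied for exascale must resolve coloring conflicts between vertices owned by different processors, which is where the research difficulty lies.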
“The ExaGraph Co-Design Center’s aim is to highlight the value of graph kernels via co-design of key algorithmic motifs and science applications along with the classical hardware-software co-design of algorithmic kernels,” Halappanavar said. “These graph algorithms will augment how data analytics are performed for applications and scientific computing.”
Beyond its initial launch, Halappanavar noted the ExaGraph Co-Design Center aims to deliver a software library. The library will feature a set of frameworks that implement combinatorial kernels that can communicate with each other to enable scientific computing, which further empowers basic science research.
In addition, Adolfy Hoisie, PNNL’s Chief Scientist for Computing and Laboratory Fellow, explained that having a PNNL-led ECP Co-Design Center that takes advantage of Halappanavar’s considerable expertise and unites some key collaborators is a welcome and synergistic addition to the laboratory’s research landscape and capabilities.
“The ExaGraph Co-Design Center is technically important to ECP and will provide significant contributions that benefit its overall exascale program in a way that can be accessible and useful across many scientific application areas,” Hoisie said. “I look forward to seeing this center grow.”
About ECP
The U.S. Department of Energy’s Exascale Computing Project is responsible for developing the strategy, aligning the resources, and conducting the R&D necessary to achieve the nation’s imperative of delivering exascale computing by 2021. ECP’s mission is to ensure all the necessary pieces are in place for the first exascale systems—an ecosystem that includes mission critical applications, software stack, hardware architecture, advanced system engineering and hardware components to enable fully functional, capable exascale computing environments critical to national security, scientific discovery, and a strong U.S. economy.
The ECP is a collaborative project of two U.S. Department of Energy organizations, the Office of Science and the National Nuclear Security Administration.
Global Arrays Gets an Update from PNNL and Intel Corp.
Scientists Jeff Daily, Abhinav Vishnu, and Bruce Palmer, all from the ACMD Division High Performance Computing group at PNNL, served as the core team for a new release of the Global Arrays (GA) toolkit, known as Version 5.5. GA 5.5 provides additional support and bug fixes for the parallel Partitioned Global Address Space (PGAS) programing model.
GA 5.5 incorporates support for libfabric (https://ofiwg.github.io/libfabric/), which helps meet performance and scalability requirements of high-performance applications, such as PGAS programming models (like GA), Message Passing Interface (MPI) libraries, and enterprise applications running in tightly coupled network environments. The updates to GA 5.5 resulted from a coordinated effort between the GA team and Intel Corp. Along with incorporating support for libfabric, the update added native support for the Intel Omni-Path high-performance communication architecture and applied numerous bug fixes since the previous GA 5.4 release to both Version 5.5 and the ga-5-4 release branch of GA’s subversion repository.
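For readers unfamiliar with the PGAS model GA implements, here is a conceptual sketch in plain Python: a single logically global array whose storage is partitioned across "ranks," with one-sided put/get by global index. The class and method names are invented for illustration; this is not the actual Global Arrays or ComEx C API.

```python
# Toy illustration of the PGAS idea behind Global Arrays: one global
# index space, physically partitioned across ranks, with one-sided
# access to any rank's data. Conceptual sketch only, not the GA API.

class ToyGlobalArray:
    def __init__(self, length, nranks):
        self.chunk = (length + nranks - 1) // nranks
        # Each rank owns a contiguous partition of the global indices.
        self.partitions = [
            [0.0] * min(self.chunk, length - r * self.chunk)
            for r in range(nranks)
        ]

    def _locate(self, i):
        return i // self.chunk, i % self.chunk

    def put(self, i, value):
        """One-sided write: works regardless of which rank owns index i."""
        rank, offset = self._locate(i)
        self.partitions[rank][offset] = value

    def get(self, i):
        """One-sided read: works regardless of which rank owns index i."""
        rank, offset = self._locate(i)
        return self.partitions[rank][offset]

ga = ToyGlobalArray(length=10, nranks=4)
ga.put(7, 3.14)   # with chunk size 3, index 7 lives on rank 2
print(ga.get(7))
```

In the real library the partitions live in separate process memories, and transports like libfabric or Omni-Path carry the one-sided traffic that this toy models with a local list lookup.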
Originally developed in the late 1990s at PNNL, the GA toolkit offers diverse libraries employed within many applications, including quantum chemistry and molecular dynamics codes (notably, NWChem), as well as those used for computational fluid dynamics, atmospheric sciences, astrophysics, and bioinformatics.
“This was a significant effort from Intel to work with us on the libfabric, and eventual Intel Omni-Path, support,” Daily explained. “Had we not refactored our Global Arrays one-sided communication library, ComEx, a few years ago to make it easier to port to new systems, this would not have been possible. Now that our code is much easier to integrate with, we envision more collaborations like this in the future.”
Download information for GA 5.5 and the GA subversion repository is available here.
In what is believed to be the largest study of its kind, scientists at the Pacific Northwest National Laboratory, Johns Hopkins University and their collaborators from institutions across the nation have examined the collections of proteins in the tumors of 169 ovarian cancer patients to identify critical proteins present in their tumors.
By integrating their findings about the collection of proteins (the proteome) with information already known about the tumors’ genetic data (the genome), the investigators report the potential for new insights into the progress of the most malignant form of the disease. The work was published June 29, 2016, in the advance online edition of Cell [science paper not made available].
The researchers say their achievement illustrates the power of combining genomic and proteomic data — an approach known as proteogenomics — to yield a more complete picture of the biology of a cancer that accounts for three percent of all cancers in women and is the fifth leading cause of cancer deaths among women in the United States.
“Historically, cancer’s been looked at as a disease of the genome,” said Karin Rodland, a senior author of the study and chief scientist for biomedical research at PNNL, a U.S. Department of Energy laboratory.
Karin Rodland
“But that genome has to express itself in functional outcomes, and that’s what the proteomic data adds, because proteins do the actual work of the genome.”
Daniel W. Chan, the study’s other senior author, who led the team at the Johns Hopkins University School of Medicine, said, “Correlating our data with clinical outcomes is the first step toward the eventual ability to predict outcomes that reflect patient survival, with potential applications for precision medicine and new targets for pharmaceutical interventions. But just like anything in medicine, clinical validation will be a long and rigorous process.”
The authors say that with the findings, researchers expect to be better able to identify the biological factors defining the 70% of ovarian cancer patients who suffer from the most malignant form of ovarian cancer, called high-grade serous carcinoma. Currently, only one in six such patients lives five or more years beyond diagnosis.
The work draws on the efforts of physicians, scientists and patients who have worked together to understand ovarian cancer. The investigators say the effort requires collaboration among physicians as well as patients willing to take part in research to benefit others with the disease or even to prevent others from ever developing cancer.
Under the leadership of the National Cancer Institute, scientists around the nation have worked together to create The Cancer Genome Atlas (TCGA), a collaborative effort to map cancer’s genetic mutations. The task for ovarian cancer was completed in 2011. In the current study, the PNNL and JHU teams each studied subsets of 169 high-grade serous carcinoma tissue samples, with accompanying genomic and clinical data, drawn from that effort.
The Johns Hopkins team initially selected 122 of the samples based on deficiencies in those tumors’ ability to repair damaged DNA (known as homologous recombination deficiency), characterized by changes in genes including BRCA1, BRCA2 and PTEN, mutations long linked to increased cancer risk and severity.
“We chose to examine these samples because patients with changes in these genes already are benefiting from a specific drug regimen for breast cancer, so if we could find similar changes in ovarian cancer genomes and proteomes, those patients would likely benefit from the same regimen,” said Chan, a professor of pathology and oncology at JHU. Chan is one of the inventors of the OVA1 ovarian cancer detection test, which is licensed to Vermillion Inc. of Austin, Texas.
The PNNL team initially selected 84 samples based on overall patient survival times. “We examined the data for the shortest-surviving patients and the longest-surviving patients hoping to pinpoint biological factors associated with extremely short survival or better-than-average, longer survival,” said Rodland.
Then, through their participation in the Clinical Proteomic Tumor Analysis Consortium (CPTAC), another program of the National Cancer Institute which funded both teams, the two groups combined their efforts.
Using protein measurement and identification techniques based on mass spectrometry, the teams identified 9,600 proteins in all the tumors, and pursued study on 3,586 proteins common to all 169 tumor samples.
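The filtering step described above — narrowing roughly 9,600 detected proteins down to those seen in every sample — is, at heart, a set intersection. A small sketch, with invented placeholder protein and sample names:

```python
# Sketch of the "common to all samples" filter. The sample ids and
# protein names below are invented placeholders, not study data.

def proteins_common_to_all(samples):
    """samples: dict mapping sample id -> set of detected protein ids."""
    sets = iter(samples.values())
    common = set(next(sets))
    for s in sets:
        common &= s          # keep only proteins seen in this sample too
    return common

samples = {
    "tumor_01": {"TP53", "BRCA1", "PDGFRB", "ALB"},
    "tumor_02": {"TP53", "PDGFRB", "ALB", "EGFR"},
    "tumor_03": {"TP53", "PDGFRB", "ALB"},
}
print(sorted(proteins_common_to_all(samples)))  # ['ALB', 'PDGFRB', 'TP53']
```

Restricting analysis to the shared set is what makes protein levels directly comparable across all 169 tumors.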
Beyond the genome
While many people are familiar with the role our genes play in the development of cancer, the genes are often just a starting point, for patients and researchers alike. Genes are transcribed into RNA, the genetic material that makes proteins, which are the workhorses of cells. The activity of the proteins varies dramatically, with many undergoing changes that affect their impact and interactions with other proteins.
A detailed look at the activity of proteins in cancer biology gives researchers insight into specific molecular events that would otherwise remain unknown.
This figure illustrates the difference in survival between two small groups of women with ovarian cancer differentiated by a complex pattern of proteins in their tumors. These proteins are all affected by changes in the patient’s DNA. The group with greater expression of the signature proteins has an overall survival more than three times better than those in whom this particular set of proteins was less active. Scientists hope it will be possible to tailor a woman’s treatment for ovarian cancer based on the levels of such proteins once these findings are validated.
Courtesy of Zhang et al./Cell, 2016
The illustration hints at the complexity of the development of ovarian cancer. Each of the purple boxes signifies a particular protein, a cellular workhorse which affects other proteins and ultimately the behavior of a cell. Here, more than two dozen proteins are affected by a protein known as PDGFR, or platelet-derived growth factor receptor, which plays an important role in the formation of new blood vessels. The team showed that this molecule and its pathways are much more active in the tumors of patients with ovarian cancer who had short survival compared to other patients who lived longer than five years.
Courtesy of Zhang et al./Cell, 2016
A hallmark of cancer, and particularly of high-grade serous carcinoma, is genetic instructions gone awry. One form this takes is extra copies of certain regions of the genome. These so-called copy number alterations can lead to changes in protein abundance. When the researchers examined known regions of copy number alterations, they found that parts of chromosomes 2, 7, 20 and 22 drove changes in the abundance of more than 200 proteins. A closer study of those 200 proteins revealed that many are involved in cell movement and immune system function, both processes implicated in cancer progression, the researchers said.
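Linking a copy number alteration to protein abundance amounts to asking whether the two quantities rise and fall together across samples. A toy Pearson correlation sketch makes this concrete; the copy-number and protein values below are invented for illustration, not data from the study.

```python
# Toy Pearson correlation between gene copy number and protein
# abundance across samples. All values are invented placeholders.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-sample copy number and matching protein level:
copy_number   = [2, 2, 3, 4, 4, 5]
protein_level = [1.0, 1.2, 1.9, 2.4, 2.6, 3.1]
r = pearson(copy_number, protein_level)
print(round(r, 3))
```

A correlation near 1 would suggest that extra genomic copies are being faithfully translated into extra protein, the kind of association the chromosome 2, 7, 20 and 22 regions showed.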
“Adding the information about the proteome on top of the genome provides an entirely new dimension of information that has enabled the discovery of new biological insights to ovarian cancer, while creating a valuable resource that the scientific community can use to generate new hypotheses about the disease, and how to treat it,” said Rodland.
“High grade serous carcinoma is such a challenging disease, requiring complex clinical care to achieve long-term survival. This new knowledge gives us new directions to test in the lab and clinic,” said study author Douglas A. Levine, director of gynecologic oncology at the Laura and Isaac Perlmutter Cancer Center of NYU Langone Medical Center. “This proteogenomic analysis will help us improve patient outcomes and quality of life.”
In addition to large teams of scientists from PNNL and Johns Hopkins, contributors included colleagues from Stanford University School of Medicine, Vanderbilt University School of Medicine, University of California at San Diego, New York University School of Medicine, Virginia Tech, the National Cancer Institute’s Office of Cancer Clinical Proteomics Research, as well as CPTAC investigators.
The proteomic analyses performed by the PNNL team were done at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science user facility.
Reference: Hui Zhang, Tao Liu, Zhen Zhang, Samuel H. Payne, Bai Zhang, Jason E. McDermott, Jian-Ying Zhou, Vladislav A. Petyuk, Li Chen, Debjit Ray, Shisheng Sun, Feng Yang, Lijun Chen, Jing Wang, Punit Shah, Seong Won Cha, Paul Aiyetan, Sunghee Woo, Yuan Tian, Marina A. Gritsenko, Therese R. Clauss, Caitlin Choi, Matthew E. Monroe, Stefani Thomas, Song Nie, Chaochao Wu, Ronald J. Moore, Kun-Hsing Yu, David L. Tabb, David Fenyö, Vineet Bafna, Yue Wang, Henry Rodriguez, Emily S. Boja, Tara Hiltke, Robert C. Rivers, Lori Sokoll, Heng Zhu, Ie-Ming Shih, Leslie Cope, Akhilesh Pandey, Bing Zhang, Michael P. Snyder, Douglas A. Levine, Richard D. Smith, Daniel W. Chan, Karin D. Rodland, and the CPTAC investigators, Integrated Proteogenomic Characterization of Human High-Grade Serous Ovarian Cancer, Cell, June 29, 2016, http://www.cell.com/cell/pdf/S0092-8674%2816%2930673-0.pdf
PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.
The study examined two models of very large ions (macroions) that carry an electrical charge, and compared the results to experimental studies. On the left is a model of a cylinder with uniform axial charge density; on the right is the more complex (and more useful) discrete charge model.
April 2016
Refined insights into critical ionic interactions with nature’s building blocks
Nucleic acids, large biomolecules essential to life, include the familiar double-stranded deoxyribonucleic acid (DNA), a very stable long molecule that stores genetic information.
In nature, DNA exists within a solution rife with electrostatically charged atoms or molecules called ions. A recent study by researchers at Pacific Northwest National Laboratory (PNNL) proposed a new model of how B-DNA, the form of DNA that predominates in cells, is influenced by the water-and-ions “atmosphere” around it.
Understanding the ionic atmosphere around nucleic acids, and being able to simulate its dynamics, is important. After all, this atmosphere stabilizes DNA’s structure; it impacts how DNA is folded and “packed” in cells, which triggers important biological functions; and it strongly influences how proteins and drugs bind to DNA.
The research combines theoretical modeling and experiments in a study of ion distribution around DNA. It was led by PNNL physical scientist Maria Sushko, computational biologist Dennis Thomas, and applied mathematician Nathan Baker, in concert with colleagues from Cornell University and Virginia Tech.
Earlier approaches have been used to simulate the distribution of ions around biomolecules like DNA, but only roughly. The PNNL-led study goes beyond commonplace electrostatics to propose a more refined, yet still computationally efficient, model of what happens in these critical ionic atmospheres.
“The main idea was to dissect the complex interplay of interactions, and to understand the main forces driving ions deep inside the DNA helix and the forces keeping them on its surface,” said Sushko, the paper’s first author. That interplay includes the correlation of ions within the solution, how they move, how they interact with one another, and how they interact with the DNA.
The new model has two key advantages over older simulations. First, it allows researchers to turn ion-water and ion-ion interactions on and off at will: "We can calculate important interactions independently," she said, a flexibility not present in previous simulations. Second, it is computationally efficient, allowing researchers to cheaply simulate a large-scale molecular event over a long time scale.
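The switchable-interactions idea can be pictured with a toy energy function in which each physical contribution is an independent term that can be toggled off at will. This is a schematic sketch, not the PNNL model; the functional forms, units, and values below are invented for illustration:

```python
import numpy as np

def total_energy(positions, charges, include_ion_ion=True, include_ion_water=True):
    """Toy energy of an ion configuration with switchable terms.

    positions: (N, 3) ion coordinates; charges: (N,) ion charges.
    Each term can be toggled off to isolate its contribution, mimicking
    the model's ability to study each interaction independently.
    """
    energy = 0.0
    if include_ion_ion:
        # Pairwise Coulomb-like term (arbitrary units).
        for i in range(len(charges)):
            for j in range(i + 1, len(charges)):
                r = np.linalg.norm(positions[i] - positions[j])
                energy += charges[i] * charges[j] / r
    if include_ion_water:
        # Crude Born-like solvation penalty, proportional to squared charge.
        energy += np.sum(0.5 * charges ** 2)
    return energy

pos = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
q = np.array([1.0, 1.0])
full = total_energy(pos, q)
no_solv = total_energy(pos, q, include_ion_water=False)
print(full - no_solv)  # contribution of the solvation term alone: 1.0
```

Subtracting the energy with a term switched off from the full energy isolates that term's contribution, which is the kind of dissection of interactions Sushko describes.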
Results: Importantly, both previous and new experiments by the Cornell colleagues identified the number of bound ions around DNA. Previous simplified models were also able to reproduce this number. But the new model “is richer than that,” said Sushko, because it gives more details on how ions are distributed along the surface of DNA and within DNA’s critical grooves. “DNA interaction will strongly depend on where those ions sit,” she said. For one, the presence of ions in the grooves relates to how compact DNA will be. “The more ions within the grooves,” said Sushko, “the more compact the structure.”
The researchers confirmed that biological “correlation,” a measure of ion affinity, allowed DNA to pack more tightly by effectively neutralizing DNA’s electrostatic charge. Researchers also observed how ions get distributed through a solution, a water-ion interaction called solvation. The stronger the water-ion interaction, the larger the effective ion size, and therefore the less likely the ion was to settle in the DNA’s grooves. More strongly solvated ions, therefore, create a different environment for DNA folding.
Researchers observed results regarding the activity of three types of salts within the simulated ionic environment. Small, singly charged ions did not interact strongly with water; about 50 percent of these bound ions could penetrate into DNA grooves. Large, triply charged ions were not strongly hydrated, but their size prevented penetration into the grooves. ("They just decorate the surface," said Sushko.)
Only 15 to 20 percent of ions with double charges, which were strongly hydrated and strongly correlated, settled in DNA grooves. That showed a “very delicate interplay” of ion-to-ion and ion-to-water interactions, according to Sushko.
Why It Matters: These results highlight how the properties of electrolyte solutions shape the ionic atmosphere that governs DNA condensation. This "packing" of DNA, one of the longest molecules in nature, is essential to DNA's role in gene regulation. DNA condensation is also key to protein binding and drug binding, pointing to practical applications in medicine and biotechnology.
This research also highlights the impact of the ionic atmosphere on the interaction between biomolecules and a ligand: that is, the molecule, ion, or protein that binds with a protein or the DNA double helix for some biological purpose.
But it is the “methodology itself,” not the designed simulations of DNA, that is most important, said Sushko, in part because it provides a new computational model of how to see into complex molecular systems. “We get a better fundamental understanding of the important forces.”
Methods: Researchers employed two coarse-grained models to simulate the DNA macroion, a large colloidal ion carrying a charge. The goal was to capture, at two levels of detail, how ions spread out in a solvent and how they interact with the DNA's topology.
One DNA model posited an infinitely long cylinder with a uniform charge density along one axis. Sushko called it “a very crude model used a lot in the past. It explains quite a lot about ion interactions, but it is deficient in some ways.” The second, more complex “discrete charge” model posited three types of spheres in a helical array that mimics B-form DNA. It had a 3D-like character that allowed ions to penetrate into DNA grooves.
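The difference between the two charge layouts can be sketched numerically: a uniformly charged line on the axis produces a potential that cannot vary with angle around the molecule, while discrete charges arranged on a helix produce one that does, which is what lets ions "see" the grooves. This is a schematic comparison using textbook B-DNA geometry, not the paper's actual models:

```python
import numpy as np

# Two coarse-grained charge layouts for a DNA-like macroion.
# Geometry numbers are textbook B-DNA values; everything else is schematic.
n_charges = 100
rise = 3.4               # axial spacing between charges, angstroms
radius = 10.0            # helix radius, angstroms
twist = 2 * np.pi / 10   # about 10 charges per helical turn

z = np.arange(n_charges) * rise
theta = twist * np.arange(n_charges)

# Model 1: uniform line charge approximated by points on the axis.
line = np.column_stack([np.zeros(n_charges), np.zeros(n_charges), z])

# Model 2: discrete charges on a helix (one strand, for simplicity).
helix = np.column_stack([radius * np.cos(theta), radius * np.sin(theta), z])

def potential(sites, probe):
    """Coulomb potential (unit charges, arbitrary units) at a probe point."""
    r = np.linalg.norm(sites - probe, axis=1)
    return float(np.sum(1.0 / r))

# Two probes at the same distance from the axis and the same height,
# rotated half a charge spacing apart around the axis.
d = radius + 2.0
ang = twist / 2
probe_a = np.array([d, 0.0, z.mean()])
probe_b = np.array([d * np.cos(ang), d * np.sin(ang), z.mean()])

# The line model cannot distinguish the two probe positions;
# the discrete helical model can.
print(potential(line, probe_a) - potential(line, probe_b))
print(potential(helix, probe_a) - potential(helix, probe_b))
```

The angular structure in the second model's potential is what allows simulated ions to settle preferentially into groove-like positions rather than "just sit somewhere on the surface."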
The DNA simulations were run through four computational models based on classical density functional theory to assess the energetics of different ion-DNA interactions. Results were also compared to data from what Sushko called "state-of-the-art experiments" that used anomalous small-angle x-ray scattering. This technique, used to investigate the spatial dimensions of structures in the nanometer range, yields a great deal of detail about how ions are distributed around a biomolecule.
The uniformly charged cylinder model was not good at simulating the ionic atmosphere around DNA. “This model is a very common simplification,” said Sushko. “You get the same number of ions attached to DNA, but the distribution is completely wrong. In this model, ions will just sit somewhere on the surface.”
But their more complex discrete charge model provided a much more naturalistic portrait of ion distribution in an ionic atmosphere. Its simulations showed ions both clinging to the helical DNA surface and also penetrating into the DNA’s grooves. “The small details of ion penetration are very important for the way DNA will package the chromosome,” she said.
What’s Next? Researchers plan to study the role of the ionic atmosphere in mediating interactions between DNA molecules. They also plan to extend their DNA model to include DNA sequence-specific effects, which often influence ion binding, and DNA sequence-dependent structural variations.
While glass might call to mind a wine bottle or a windowpane, the stability of glass affects areas as diverse as nuclear waste storage, pharmaceuticals, and ice cream. Recently, chemical physicists at Pacific Northwest National Laboratory made a key discovery about how glass forms.
They discovered that the temperature at which glass-forming materials are deposited on a substrate affects the resulting glass's stability. Their findings, published in The Journal of Physical Chemistry Letters, show that a technique called inert gas permeation can reveal at what temperature a solid "melts." Their work adds to the understanding of the fundamental properties of glass.
“Glasses are metastable materials with the mechanical properties of a solid; you can touch and hold them, versus a gas,” said Dr. Scott Smith, a co-author on the paper. “But they are not like crystalline materials, which are in a perfect array. The molecules in glasses are arranged in a disordered pattern. In liquids the molecules are constantly moving; if you suddenly freeze a liquid, the molecules are randomly oriented and unstructured. In some sense, a glass can be thought of as a frozen liquid.”
Why It Matters: No matter how glass is made, understanding its properties is important. For example, the reason some medications have expiration dates is that their physical state changes from amorphous to crystalline. Once that happens, the medication doesn’t dissolve as readily when taken and is thus less effective. Finding ways to increase a medication’s stability would extend its shelf life. Similarly, when nuclear waste is put into a glass matrix, the glass must remain stable to keep the radionuclides from being released. And as most ice cream lovers know, when you open a carton and see crystals have formed on the surface, it has lost much of its flavor.
Methods: “Our research is fundamental work that could be important for stable glass manufacture by adding to understanding of liquids and liquid behavior,” Smith said. Glasses depend on temperature for stability. At the correct temperature, a glass remains stable because its molecules stay put. At warmer temperatures, it transforms into a supercooled liquid and then crystallizes.
To create a glass, the materials must be cooled rapidly to a temperature low enough that the molecules don’t have enough time or energy to find the lowest energy configuration (a crystal). That temperature is called the glass transition temperature, or Tg, and it varies depending on the experimental conditions and the cooling rate.
Smith and colleagues Dr. Alan May and Dr. Bruce Kay took the glass-forming materials toluene and ethylbenzene and supercooled them by depositing them onto a surface at 30 K. When the materials hit the surface, they formed an amorphous solid, a glass. The researchers then heated the sample. A layer of krypton deposited between two layers of glassy material (a sandwich) remained trapped until the glass transformed into a supercooled liquid (see Figure). The onset of gas release revealed the temperature at which the glass transformed into a supercooled liquid.
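The onset-of-release readout can be sketched as finding the temperature at which the desorption signal first climbs above the noise floor. The data below are synthetic and invented purely for illustration, not the actual measurements:

```python
import numpy as np

# Synthetic temperature-programmed desorption trace: krypton release is
# negligible until the glass transforms into a supercooled liquid, then
# rises sharply. The onset value and noise level are invented.
temps = np.linspace(100.0, 140.0, 401)   # kelvin
true_onset = 126.0
signal = np.where(temps < true_onset, 0.0, (temps - true_onset) ** 2)
signal = signal + np.abs(np.random.default_rng(1).normal(0, 0.01, temps.size))

def onset_temperature(t, s, threshold=0.5):
    """Temperature at which the release signal first exceeds threshold."""
    idx = np.argmax(s > threshold)
    return float(t[idx])

print(onset_temperature(temps, signal))
```

Repeating this readout for samples deposited at different temperatures is, in outline, how a deposition-temperature-dependent transformation temperature would show up in such data.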
The researchers varied the material deposition temperature from 40 to 130 K. They observed that the stability of the glass depended on the deposition temperature. They found that for both toluene and ethylbenzene, deposition at a temperature a few degrees below Tg created the most stable glass, one that was the most resistant to turning into a supercooled liquid. These results are consistent with the calorimetric studies of Prof. Mark Ediger at the University of Wisconsin-Madison.
“We found we can control one variable: deposition temperature. Even a difference of one Kelvin can result in years of difference in material lifetime and stability,” said Smith.
Acknowledgments:
Sponsors: This work was supported by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The research was performed using EMSL, a DOE Office of Science User Facility sponsored by the Office of Biological and Environmental Research and located at PNNL.
Using microbial consortia may boost success of biotechnologies
Results: Around the world, researchers are studying microbes to see if these tiny organisms can be used to solve a host of problems, from cleaning up toxic waste to providing renewable energy. Unfortunately, attempts to develop biotechnologies often fall short because they focus on a limited set of single, highly engineered organisms. Such organisms frequently do not perform as efficiently or stably in an application as they do in the laboratory.
Now, an internationally recognized group of scientists, organized by Pacific Northwest National Laboratory microbiologists Dr. Stephen Lindemann and Dr. Alexander Beliaev, has reviewed the state of the science to determine how biotechnological use of communities of multiple microbes, or microbial consortia, might transcend the limitations of single organisms.
They posit that the time is ripe for the design and control of microbial communities, and that engineering microbial ecosystems will require an understanding of the mechanisms driving microbial community function that is possible only by combining recent advances in systems biology, computational modeling, and synthetic biology.
These new perspectives stemmed from a panel at the 15th International Symposium on Microbial Ecology in Seoul and appear in the International Society for Microbial Ecology’s (ISME) official publication, The ISME Journal.
Why It Matters: Agriculture has long known that monocultures, or growing only one type of crop, can be susceptible to changes in the environment. For example, relatively small or poorly timed changes in rainfall can cause major losses in production for some crops. In contrast, growing several crops with different tolerances to drought might more stably provide food, no matter the weather in a given year. The same principle applies to microbes, which are drivers of global geochemical cycles and catalysts for renewable fuels and chemicals. Microbial communities can prove more reliable than engineered “superbugs” and more robust against unpredictable environments than individual microbes. This reliability is the key to using them for industrial purposes.
“The promise that this field has to offer is great,” said Beliaev. “Transformative biotechnologies will help overcome the energy, health, and environmental problems of the future, and the process of learning to design and control ecological phenomena has and will undoubtedly continue to yield new insights on the fundamentals of life.”
Methods: Seven scientists from PNNL, Montana State University, Fred Hutchinson Cancer Research Center, and the Swiss Federal Institute of Technology brought perspectives from different scientific approaches, research programs, and countries to analyze the state of the science. They used questions posed by experts who attended the ISME symposium to outline key issues.
Drawing on their years of experience and amassed knowledge, the group determined that successful biosystems design is contingent both on the understanding of microbial physiology and accuracy of computational models that describe how organisms interact. An iterative design-build-test approach that can predict interspecies dynamics and analyze energy and material flows in a community will help scientists better understand how these consortia can be used for biotechnologies.
What’s Next? PNNL’s microbial research program continues to expand the foundation of biological systems design. Ideally, advances in this field will allow scientists to control safety, productivity, and stability of natural and designed microbial ecosystems.
Acknowledgments
Sponsors: The U.S. Department of Energy’s Office of Science, Office of Biological and Environmental Research, supported this work via the Genomic Science Program under the PNNL Foundational Scientific Focus Area. MWF is supported by the Scientific Focus Area Program at Lawrence Berkeley National Laboratory. HCB participated with support from the Linus Pauling Distinguished Postdoctoral Fellowship, a Laboratory Directed Research and Development Program at PNNL.
Research Team: Alexander S. Beliaev, Hans C. Bernstein, Jim K. Fredrickson, Stephen R. Lindemann, and Hyun-Seob Song, Pacific Northwest National Laboratory; Matthew W. Fields, Montana State University; Wenying Shou, Fred Hutchinson Cancer Research Center; and David R. Johnson, Swiss Federal Institute of Technology.
Reference: Lindemann SR, HC Bernstein, H-S Song, JK Fredrickson, MW Fields, W Shou, DR Johnson, and AS Beliaev. 2016. “Engineering Microbial Consortia for Controllable Outputs.” The ISME Journal: Multidisciplinary Journal of Microbial Ecology. Advance online publication 11 March 2016. DOI: 10.1038/ismej.2016.26.