Tagged: Lawrence Berkeley National Laboratory

  • richardmitnick 2:58 pm on October 30, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “Lord of the Microrings” 

    October 30, 2014
    Lynn Yarris (510) 486-5375

    A significant breakthrough in laser technology has been reported by the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley. Scientists led by Xiang Zhang, a physicist with joint appointments at Berkeley Lab and UC Berkeley, have developed a unique microring laser cavity that can produce single-mode lasing even from a conventional multi-mode laser cavity. This ability to provide single-mode lasing on demand has ramifications for a wide range of applications, including optical metrology and interferometry, optical data storage, high-resolution spectroscopy and optical communications.

    “Losses are typically undesirable in optics but, by deliberately exploiting the interplay between optical loss and gain based on the concept of parity-time symmetry, we have designed a microring laser cavity that exhibits intrinsic single-mode lasing regardless of the gain spectral bandwidth,” says Zhang, who directs Berkeley Lab’s Materials Sciences Division and is UC Berkeley’s Ernest S. Kuh Endowed Chair Professor. “This approach also provides an experimental platform to study parity-time symmetry and phase transition phenomena that originated from quantum field theory yet have been inaccessible so far in experiments. It can fundamentally broaden optical science at both semi-classical and quantum levels.”

    Xiang Zhang, director of Berkeley Lab’s Materials Sciences Division. (Photo by Roy Kaltschmidt)

    Zhang, who also directs the National Science Foundation’s Nano-scale Science and Engineering Center, and is a member of the Kavli Energy NanoSciences Institute at Berkeley, is the corresponding author of a paper in Science that describes this work. The paper is titled Single-Mode Laser by Parity-time Symmetry Breaking. Co-authors are Liang Feng, Zi Jing Wong, Ren-Min Ma and Yuan Wang.

    A laser cavity or resonator is the mirrored component of a laser in which light reflected multiple times yields a standing wave at certain resonance frequencies called modes. Laser cavities typically support multiple modes because their dimensions are much larger than optical wavelengths. Competition between modes limits the optical gain in amplitude and results in random fluctuations and instabilities in the emitted laser beams.

    “For many applications, single-mode lasing is desirable for its stable operation, better beam quality, and easier manipulation,” Zhang says. “Light emission from a single-mode laser is monochromatic with low phase and intensity noises, but creating sufficiently modulated optical gain and loss to obtain single-mode lasing has been a challenge.”

    Scanning electron microscope image of the fabricated PT symmetry microring laser cavity.

    While mode manipulation and selection strategies have been developed to achieve single-mode lasing, each of these strategies has only been applicable to specific configurations. The microring laser cavity developed by Zhang’s group is the first successful concept for a general design. The key to their success is using the concept of the breaking of parity-time (PT) symmetry. The law of parity-time symmetry dictates that the properties of a system, like a beam of light, remain the same even if the system’s spatial configuration is reversed, like a mirror image, or the direction of time runs backward. Zhang and his group discovered a phenomenon called “thresholdless parity-time symmetry breaking” that provides them with unprecedented control over the resonant modes of their microring laser cavity, a critical requirement for emission control in laser physics and applications.

    Liang Feng

    “Thresholdless PT symmetry breaking means that our light beam undergoes symmetry breaking once the gain/loss contrast is introduced no matter how large this contrast is,” says Liang Feng, lead author of the Science paper, a former postdoc in Zhang’s group who is now an assistant professor at the University at Buffalo. “In other words, the threshold for PT symmetry breaking is zero gain/loss contrast.”
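
    For orientation, here is a minimal numerical sketch of the textbook two-mode PT-symmetric system (this is not the authors’ microring model, and all parameter values are arbitrary illustrations); it shows the conventional, non-zero threshold that this work eliminates:

        import numpy as np

        # Two coupled optical modes with balanced gain (+g) and loss (-g):
        # the standard PT-symmetric Hamiltonian. Its eigenfrequencies are
        # 1 +/- sqrt(k^2 - g^2): real below the threshold g = k (PT symmetric),
        # a complex-conjugate pair above it (PT broken).
        k = 1.0  # coupling strength (arbitrary units)
        for g in (0.5, 1.0, 1.5):
            H = np.array([[1.0 + 1j * g, k],
                          [k, 1.0 - 1j * g]])
            w = np.round(np.linalg.eigvals(H), 3)
            print(f"g/k = {g / k:.1f} -> eigenfrequencies {w}")

    In this conventional two-mode picture the symmetry only breaks once the gain/loss contrast g exceeds the coupling k; the microring’s continuous rotational symmetry is what drives that threshold to zero.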

    Zhang, Feng and the other members of the team were able to exploit the phenomenon of thresholdless PT symmetry breaking through the fabrication of a unique microring laser cavity. This cavity consists of bilayered structures of chromium/germanium arranged periodically in the azimuthal direction on top of a microring resonator made from an indium-gallium-arsenide-phosphide compound on a substrate of indium phosphide. The diameter of the microring is 9 micrometers.

    “The introduced rotational symmetry in our microring resonator is continuous, mimicking an infinite system,” says Feng. “The counterintuitive discovery we made is that PT symmetry does not hold even at an infinitesimal gain/loss modulation when a system is rotationally symmetric. This was not observed in previous one-dimensional PT modulation systems because those finite systems did not support any continuous symmetry operations.”

    Using the continuous rotational symmetry of their microring laser cavity to facilitate thresholdless PT symmetry breaking, Zhang, Feng and their collaborators are able to delicately manipulate optical gain and loss in such a manner as to ultimately yield single-mode lasing.

    “PT symmetry breaking means an optical mode can be gain-dominant for lasing, whereas PT symmetry means all the modes remain passive,” says Zi-Jing Wong, co-lead author and a graduate student in Zhang’s group. “With our microring laser cavity, we facilitate a desired mode in PT symmetry breaking, while keeping all other modes PT symmetric. Although PT symmetry breaking by itself cannot guarantee single-mode lasing, when acting together with PT symmetry for all other modes, it facilitates single-mode lasing.”

    In their Science paper, the researchers suggest that single-mode lasing through PT-symmetry breaking could pave the way to next generation optoelectronic devices for communications and computing as it enables the independent manipulation of multiple laser beams without the “crosstalk” problems that plague today’s systems. Their microring laser cavity concept might also be used to engineer optical modes in a typical multi-mode laser cavity to create a desired lasing mode and emission pattern.

    “Our microring laser cavities could also replace the large laser boxes that are routinely used in labs and industry today,” Feng says. “Moreover, the demonstrated single-mode operation regardless of gain spectral bandwidth may create a laser chip carrying trillions of informational signals at different frequencies. This would make it possible to shrink a huge datacenter onto a tiny photonic chip.”

    This research was supported by the Office of Naval Research MURI program.

    See the full article here.

  • richardmitnick 3:20 pm on October 29, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “New Lab Startup Afingen Uses Precision Method to Enhance Plants” 

    October 29, 2014
    Julie Chao (510) 486-6491

    Imagine being able to precisely control specific tissues of a plant to enhance desired traits without affecting the plant’s overall function. Thus a rubber tree could be manipulated to produce more natural latex. Trees grown for wood could be made with higher lignin content, making for stronger yet lighter-weight lumber. Crops could be altered so that only the leaves and certain other tissues had more wax, thus enhancing the plant’s drought tolerance, while its roots and other functions were unaffected.

    By manipulating a plant’s metabolic pathways, two scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), Henrik Scheller and Dominique Loqué, have figured out a way to genetically rewire plants to allow for an exceptionally high level of control over the spatial pattern of gene expression, while at the same time boosting expression to very high levels. Now they have launched a startup company called Afingen to apply this technology for developing low-cost biofuels that could be cost-competitive with gasoline and corn ethanol.

    Henrik Scheller (left) and Dominique Loqué hold a tray of Arabidopsis thaliana plants, which they used in their research. (Berkeley Lab photo)

    “With this tool we seem to have found a way to control very specifically what tissue or cell type expresses whatever we want to express,” said Scheller. “It’s a new way that people haven’t thought about to increase metabolic pathways. It could be for making more cell wall, for increasing the stress tolerance response in a specific tissue. We think there are many different applications.”

    Cost-competitive biofuels

    Afingen was awarded a Small Business Innovation Research (SBIR) grant earlier this year for $1.72 million to engineer switchgrass plants that will contain 20 percent more fermentable sugar and 40 percent less lignin in selected structures. The grant was provided under a new SBIR program at DOE that combines an SBIR grant with an option to license a specific technology produced at a national laboratory or university through DOE-supported research.

    “Techno-economic modeling done at (the Joint BioEnergy Institute, or JBEI) has shown that you would get a 23 percent reduction in the price of the biofuel with just a 20 percent reduction in lignin,” said Loqué. “If we could also increase the sugar content and make it easier to extract, that would reduce the price even further. But of course it also depends on the downstream efficiency.”

    Scheller and Loqué are plant biologists with the Department of Energy’s Joint BioEnergy Institute (JBEI), a Berkeley Lab-led research center established in 2007 to pursue breakthroughs in the production of cellulosic biofuels. Scheller heads the Feedstocks Division and Loqué leads the cell wall engineering group.

    The problem with too much lignin in biofuel feedstocks is that it is difficult and expensive to break down; reducing lignin content would allow the carbohydrates to be released and converted into fuels much more cost-effectively. Although low-lignin plants have been engineered, they grow poorly because important tissues lack the strength and structural integrity provided by the lignin. With Afingen’s technique, the plant can be manipulated to retain high lignin levels only in its water-carrying vascular cells, where cell-wall strength is needed for survival, but low levels throughout the rest of the plant.

    The centerpiece of Afingen’s technology is an “artificial positive feedback loop,” or APFL. The concept targets master transcription factors, which are molecules that regulate the expression of genes involved in certain biosynthetic processes, that is, whether certain genes are turned “on” or “off.” The APFL technology is a breakthrough in plant biotechnology, and Loqué and Scheller recently received an R&D 100 Award for the invention.

    An APFL is a segment of artificially produced DNA coded with instructions to make additional copies of a master transcription factor; when it is inserted at the start of a chosen biosynthetic pathway—such as the pathway that produces cellulose in fiber tissues—the plant cell will synthesize the cellulose and also make a copy of the master transcription factor that launched the cycle in the first place. Thus the cycle starts all over again, boosting cellulose production.
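
    To see why such a loop boosts output, consider a deliberately simple toy model (not Afingen’s actual construct; every rate constant below is hypothetical): a master transcription factor T that drives both a pathway product P and, through the inserted APFL, its own production.

        # Toy APFL dynamics, integrated with explicit Euler steps:
        #   dT/dt = basal + feedback * T/(1+T) - decay * T
        #   dP/dt = production * T - decay * P
        def simulate(feedback, steps=20000, dt=0.01):
            T, P = 0.1, 0.0
            for _ in range(steps):
                dT = 0.05 + feedback * T / (1.0 + T) - 0.1 * T
                dP = 0.5 * T - 0.1 * P
                T += dt * dT
                P += dt * dP
            return T, P

        for fb in (0.0, 0.3):
            T, P = simulate(fb)
            print(f"feedback={fb}: transcription factor ~{T:.2f}, pathway product ~{P:.2f}")

    With the feedback term switched on, the transcription factor settles at a several-fold higher steady state and the pathway product rises with it; that is the qualitative behavior the APFL is designed to exploit.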

    The process differs from classical genetic engineering. “Some people distinguish between ‘transgenic’ and ‘cisgenic.’ We’re using only pieces of DNA that are already in that plant and just rearranging them in a new way,” said Scheller. “We’re not bringing in foreign DNA.”

    Other licensees and applications

    This breakthrough technique can also be used in fungi and for a wide variety of uses in plants, for example, to increase food crop yields or to boost production of highly specialized molecules used by the pharmaceutical and chemical industries. “It could also increase the quality of forage crops, such as hay fed to cows, by increasing the sugar content or improving the digestibility,” Loqué said.

    Another intriguing application is for biomanufacturing. By engineering plants to grow entirely new pharmaceuticals, specialty chemicals, or polymer materials, the plant essentially becomes a “factory.” “We’re interested in using the plant itself as a host for production,” Scheller said. “Just like you can upregulate pathways in plants that make cell walls or oil, you can also upregulate pathways that make other compounds or properties of interest.”

    Separately, two other companies are using the APFL technology. Tire manufacturer Bridgestone has a cooperative research and development agreement (CRADA) with JBEI to develop more productive rubber-producing plants. FuturaGene, a Brazilian paper and biomass company, has licensed the technology for exclusive use with eucalyptus trees and several other crops; APFL can enhance or develop traits to optimize wood quality for pulping and bioenergy applications.

    “The inventors/founders of Afingen made the decision to not compete for a license in fields of use that were of interest to other companies that had approached JBEI. This allowed JBEI to move the technology forward more quickly on several fronts,” said Robin Johnston, Berkeley Lab’s Acting Deputy Chief Technology Transfer Officer. “APFL is a very insightful platform technology, and I think only a fraction of the applications have even been considered yet.”

    Afingen currently has one employee—Ai Oikawa, a former postdoctoral researcher and now the director of plant engineering—and will be hiring three more in November. It is the third startup company to spin out of JBEI. The first two were Lygos, which uses synthetic biology tools to produce chemical compounds, and TeselaGen, which makes tools for DNA synthesis and cloning.

    See the full article here.

  • richardmitnick 4:05 pm on October 28, 2014 Permalink | Reply
    Tags: CUORE collaboration, Lawrence Berkeley National Laboratory

    From LBL: “Creating the Coldest Cubic Meter in the Universe” 

    October 28, 2014
    Kate Greene 510-486-4404

    In an underground laboratory in Italy, an international team of scientists has created the coldest cubic meter in the universe. The cooled chamber—roughly the size of a vending machine—was chilled to 6 millikelvin, or -273.144 degrees Celsius, in preparation for a forthcoming experiment that will study neutrinos, ghostlike particles that could hold the key to the existence of matter around us.
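
    The Celsius figure follows directly from the definition of the kelvin scale (0 K corresponds to -273.15 degrees Celsius):

        T_{\mathrm{C}} = T_{\mathrm{K}} - 273.15 = 0.006 - 273.15 = -273.144\ ^{\circ}\mathrm{C}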

    Scientists inspect the cryostat of the Cryogenic Underground Observatory for Rare Events. Credit: CUORE collaboration

    The collaboration responsible for the record-setting refrigeration is called the Cryogenic Underground Observatory for Rare Events (CUORE), supported jointly by the Istituto Nazionale di Fisica Nucleare (INFN) in Italy, and the Department of Energy’s Office of Science and National Science Foundation in the US. Lawrence Berkeley National Lab (Berkeley Lab) manages the CUORE project in the US. The CUORE collaboration is made up of 157 scientists from the U.S., Italy, China, Spain, and France, and is based at the underground Italian facility called Laboratori Nazionali del Gran Sasso (LNGS) of the INFN.

    “We’ve been building this experiment for almost ten years,” says Yury Kolomensky, senior faculty scientist in the Physics Division of Berkeley Lab, professor of physics at UC Berkeley, and U.S. spokesperson for the CUORE collaboration. “This is a tremendous feat of cryogenics. We’ve exceeded our goal of 10 millikelvin. Nothing in the universe this large has ever been as cold.”

    The chamber, technically called a cryostat, was designed and built in Italy, and maintained the ultra-cold temperature for more than two weeks. An international team of physicists, including students and postdoctoral scholars from Italy and the US, worked for over two years to assemble the cryostat, iron out the kinks, and demonstrate its record-breaking performance. The claim that no other object of similar size and temperature – either natural or man-made – exists in the universe was detailed in a recent paper by Jonathan Ouellet, a member of Berkeley Lab’s Nuclear Science Division and a UC Berkeley graduate student.

    To achieve such a low temperature, the team used a multi-chamber cryostat design that looks something like Russian nesting dolls: six chambers in total, each becoming progressively smaller and colder.

    An illustration of the cross-section of the cryostat with a human figure for scale. Credit: CUORE collaboration

    The chambers are evacuated, isolating the insides from room temperature, as in a thermos. The outer chambers are cooled to the temperature of liquid helium with mechanical coolers called pulse tubes, which do not require expensive cryogenic liquids. The innermost chamber is cooled using a process similar to traditional refrigeration, in which a fluid evaporates and takes heat along with it. The only fluid that operates at such cold temperatures, however, is liquid helium. The researchers use a mixture of helium-3 and helium-4 that continuously circulates in a specialized cryogenic unit called a dilution refrigerator, removing any remnant heat energy from the smallest chamber. The CUORE dilution refrigerator, built by Leiden Cryogenics in the Netherlands, is one of the most powerful in the world. “It’s a Mack truck of dilution refrigerators,” Kolomensky says.

    The ultimate purpose for the coldest cubic meter in the universe is to house a new ultra-sensitive detector. The goal of CUORE is to observe a hypothesized rare process called neutrinoless double-beta decay. Detection of this process would allow researchers to demonstrate, for the first time, that neutrinos are their own antiparticles, thereby offering a possible explanation for the abundance of matter over anti-matter in our universe —in other words, why the galaxies, stars, and ultimately people exist in the universe at all.

    To detect neutrinoless double-beta decay, the team is using a detector made of 19 independent towers of tellurium dioxide (TeO2) crystals. Fifty-two crystals, each a little smaller than a Rubik’s cube, make up each tower. The team expects that they would be able to see evidence of the rare radioactive process within these cube-shaped crystals because the phenomenon would produce a barely detectable temperature rise, picked up by highly sensitive temperature sensors.
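
    Taken together, those numbers fix the size of the full detector array:

        19 \text{ towers} \times 52 \text{ crystals/tower} = 988 \ \mathrm{TeO_2} \text{ crystals}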

    Berkeley Lab, with Lawrence Livermore National Lab, has supplied roughly half the crystals for the CUORE project. In addition, Berkeley Lab designed and fabricated the highly sensitive temperature sensors – Neutron Transmutation Doped thermistors invented by Eugene Haller, a UC Berkeley faculty member and senior faculty scientist in the Materials Sciences Division.

    UC Berkeley postdocs Tom Banks and Tommy O’Donnell, who also hold joint appointments with the Nuclear Science Division at Berkeley Lab, led the international team of physicists, engineers, and technicians that assembled over ten thousand parts into towers in nitrogen-filled glove boxes, including bonding almost 8,000 25-micron gold wires to 100-micron-sized pads on the temperature sensors and on copper pads connected to the detector wiring.

    The last of the 19 towers has recently been completed; all towers are now safely stored underground at LNGS, waiting to occupy the record-breaking vessel. The coldest cubic meter in the known universe is not just a feat of engineering; it will become a premier science instrument next year.

    The US CUORE team was led by the late Prof. Stuart Freedman until his untimely passing in 2012. Other current and former Berkeley Lab members of the CUORE collaboration not previously mentioned include US Contractor Project Manager Sergio Zimmermann (Engineering Division), former US Contractor Project Manager Richard Kadel (Physics Division, retired), staff scientists Jeffrey Beeman (Materials Sciences Division), Brian Fujikawa (Nuclear Science Division), Sarah Morgan (Engineering), Alan Smith (EH&S), postdocs Raul Hennings-Yeomans (UCB and NSD), Ke Han (NSD, now Yale), and Yuan Mei (NSD), graduate students Alexey Drobizhev and Sachi Wagaarachchi (UCB and NSD), and engineers David Biare, Lucio di Paolo (NSD and LNGS), and Joseph Wallig (Engineering).

    For more information: CUORE collaboration news release here.

    See the full article here.

  • richardmitnick 8:43 pm on October 3, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “RCas9: A Programmable RNA Editing Tool” 

    October 3, 2014
    Lynn Yarris (510) 486-5375

    A powerful scientific tool for editing the DNA instructions in a genome can now also be applied to RNA, the molecule that translates DNA’s genetic instructions into the production of proteins. A team of researchers with Berkeley Lab and the University of California (UC) Berkeley has demonstrated a means by which the CRISPR/Cas9 protein complex can be programmed to recognize and cleave RNA at sequence-specific target sites. This finding has the potential to transform the study of RNA function by paving the way for direct RNA transcript detection, analysis and manipulation.

    Schematic shows how RNA-guided Cas9 working with PAMmer can target ssRNA for programmable, sequence-specific cleavage.

    Led by Jennifer Doudna, biochemist and leading authority on the CRISPR/Cas9 complex, the Berkeley team showed how the Cas9 enzyme can work with short DNA sequences known as “PAM,” for protospacer adjacent motif, to identify and bind to specific sites of single-stranded RNA (ssRNA). The team is designating this RNA-targeting CRISPR/Cas9 complex as RCas9.

    “Using specially designed PAM-presenting oligonucleotides, or PAMmers, RCas9 can be specifically directed to bind or cut RNA targets while avoiding corresponding DNA sequences, or it can be used to isolate specific endogenous messenger RNA from cells,” says Doudna, who holds joint appointments with Berkeley Lab’s Physical Biosciences Division and UC Berkeley’s Department of Molecular and Cell Biology and Department of Chemistry, and is also an investigator with the Howard Hughes Medical Institute (HHMI). “Our results reveal a fundamental connection between PAM binding and substrate selection by RCas9, and highlight the utility of RCas9 for programmable RNA transcript recognition without the need for genetically introduced tags.”

    Biochemist Jennifer Doudna is a leading authority on the CRISPR/Cas9 complex. (Photo by Roy Kaltschmidt)

    From safer, more effective medicines and clean, green, renewable fuels, to the clean-up and restoration of our air, water and land, the potential is there for genetically engineered bacteria and other microbes to produce valuable goods and perform critical services. To exploit the vast potential of microbes, scientists must be able to precisely edit their genetic information.

    In recent years, the CRISPR/Cas complex has emerged as one of the most effective tools for doing this. CRISPR, which stands for Clustered Regularly Interspaced Short Palindromic Repeats, is a central part of the bacterial immune system and handles sequence recognition. Cas9 – Cas stands for CRISPR-associated – is an RNA-guided enzyme that handles the snipping of DNA strands at the specified sequence site.

    Together, CRISPR and Cas9 can be used to precisely edit the DNA instructions in a targeted genome for making desired types of proteins. The DNA is cut at a specific location so that old DNA instructions can be removed and/or new instructions inserted.

    Until now, it was thought that Cas9 could not be used on the RNA molecules that transcribe those DNA instructions into the desired proteins.

    “Just as Cas9 can be used to cut or bind DNA in a sequence-specific manner, RCas9 can cut or bind RNA in a sequence-specific manner,” says Mitchell O’Connell, a member of Doudna’s research group and the lead author of a paper in Nature, titled Programmable RNA recognition and cleavage by CRISPR/Cas9, that describes this research. Doudna is the corresponding author. Other co-authors are Benjamin Oakes, Samuel Sternberg, Alexandra East-Seletsky and Matias Kaplan.

    Benjamin Oakes and Mitch O’Connell are part of the collaboration led by Jennifer Doudna that showed how the CRISPR/Cas9 complex can serve as a programmable RNA editor. (Photo by Roy Kaltschmidt)

    In an earlier study, Doudna and her group showed that the genome editing ability of Cas9 is made possible by the presence of PAM, which marks where cutting is to commence and activates the enzyme’s strand-cleaving activity. In this latest study, Doudna, O’Connell and their collaborators show that PAMmers, in a similar manner, can also stimulate site-specific endonucleolytic cleavage of ssRNA targets. They used Cas9 enzymes from the bacterium Streptococcus pyogenes to perform a variety of in vitro cleavage experiments using a panel of RNA and DNA targets.

    “While RNA interference has proven useful for manipulating gene regulation in certain organisms, there has been a strong motivation to develop an orthogonal nucleic-acid-based RNA-recognition system such as RCas9,” Doudna says. “The molecular basis for RNA recognition by RCas9 is now clear and requires only the design and synthesis of a matching guide RNA and complementary PAMmer.”
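
    As a toy illustration of what programmable, sequence-specific recognition means computationally (this sketches only the base-pairing logic, not the biochemistry, and the sequences are invented):

        # Find where a guide RNA would base-pair with a target transcript.
        # RNA-RNA pairing used here: A-U, U-A, G-C, C-G (G-U wobble ignored).
        PAIR = {"A": "U", "U": "A", "G": "C", "C": "G"}

        def binding_site(guide: str) -> str:
            """Return the sequence the guide base-pairs with, read 5'->3'."""
            return "".join(PAIR[b] for b in reversed(guide))

        transcript = "AUGGCUAGCUAGGGAUCCGAUCGAUUACGCUAGC"  # hypothetical ssRNA
        guide = "UCGAUCGGAUCC"                             # hypothetical guide RNA

        site = binding_site(guide)
        print(f"guide recognizes {site!r} at position {transcript.find(site)}")

    Designing an RCas9 experiment then amounts to choosing a guide complementary to the desired transcript site, plus the matching PAMmer oligonucleotide described above.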

    The researchers envision a wide range of potential applications for RCas9. For example, an RCas9 tethered to a protein translation initiation factor and targeted to a specific mRNA could essentially act as a designer translation factor to “up-” or “down-” regulate protein synthesis from that mRNA.

    “Tethering RCas9 to beads could be used to isolate RNA or native RNA–protein complexes of interest from cells for downstream analysis or assays,” O’Connell says. “RCas9 fused to select protein domains could promote or exclude specific introns or exons, and RCas9 tethered to a fluorescent protein could be used to observe RNA localization and transport in living cells.”

    This research was primarily supported by the NIH-funded Center for RNA Systems Biology.

    See the full article here.

  • richardmitnick 8:21 pm on October 2, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “A Closer Look at the Perfect Fluid” 

    October 2, 2014
    Kate Greene 510-486-4404

    Researchers at Berkeley Lab and their collaborators have honed a way to probe the quark-gluon plasma, the kind of matter that dominated the universe immediately after the big bang.

    A simulated collision of lead ions, courtesy of the ALICE experiment at CERN.

    By combining data from two high-energy accelerators, nuclear scientists have refined the measurement of a remarkable property of exotic matter known as quark-gluon plasma. The findings reveal new aspects of the ultra-hot, “perfect fluid” that give clues to the state of the young universe just microseconds after the big bang.

    The multi-institutional team known as the JET Collaboration, led by researchers at the U.S. Department of Energy’s Lawrence Berkeley National Lab (Berkeley Lab), published their results in a recent issue of Physical Review C. The JET Collaboration is one of the Topical Collaborations in nuclear theory established by the DOE Office of Science in 2010. JET, which stands for Quantitative Jet and Electromagnetic Tomography, aims to study the probes used to investigate high-energy, heavy-ion collisions. The Collaboration currently has 12 participating institutions with Berkeley Lab as the leading institute.

    “We have made, by far, the most precise extraction to date of a key property of the quark-gluon plasma, which reveals the microscopic structure of this almost perfect liquid,” says Xin-Nian Wang, physicist in the Nuclear Science Division at Berkeley Lab and managing principal investigator of the JET Collaboration. Perfect liquids, Wang explains, have the lowest viscosity-to-density ratio allowed by quantum mechanics, which means they essentially flow without friction.

    Hot Plasma Soup

    To create and study the quark-gluon plasma, nuclear scientists used particle accelerators called the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory in New York and the Large Hadron Collider (LHC) at CERN in Switzerland. By accelerating heavy atomic nuclei to high energies and blasting them into each other, scientists are able to recreate the hot temperature conditions of the early universe.

    RHIC at BNL

    LHC at CERN

    Inside protons and neutrons that make up the colliding atomic nuclei are elementary particles called quarks, which are bound together tightly by other elementary particles called gluons. Only under extreme conditions, such as collisions in which temperatures exceed by a million times those at the center of the sun, do quarks and gluons pull apart to become the ultra-hot, frictionless perfect fluid known as quark-gluon plasma.

    “The temperature is so high that the boundaries between different nuclei disappear so everything becomes a hot-plasma soup of quarks and gluons,” says Wang. This ultra-hot soup is contained within a chamber in the particle accelerator, but it is short-lived—quickly cooling and expanding—making it a challenge to measure. Experimentalists have developed sophisticated tools to overcome the challenge, but translating experimental observations into precise quantitative understanding of the quark-gluon plasma has been difficult to achieve until now, he says.

    Looking Inside

    In this new work, Wang’s team refined a probe that makes use of a phenomenon researchers at Berkeley Lab first theoretically outlined 20 years ago: energy loss of a high-energy particle, called a jet, inside the quark-gluon plasma.

    “When a hot quark-gluon plasma is generated, sometimes you also produce these very energetic particles with an energy a thousand times larger than that of the rest of the matter,” says Wang. This jet propagates through the plasma, scatters, and loses energy on its way out.

    Since the researchers know the energy of the jet when it is produced, and can measure its energy coming out, they can calculate its energy loss, which provides clues to the density of the plasma and the strength of its interaction with the jet. “It’s like an x-ray going through a body so you can see inside,” says Wang.

    Xin-Nian Wang, physicist in the Nuclear Science Division at Berkeley Lab and managing principal investigator of the JET Collaboration.

    One difficulty in using a jet as an x-ray of the quark-gluon plasma is the fact that a quark-gluon plasma is a rapidly expanding ball of fire—it doesn’t sit still. “You create this hot fireball that expands very fast as it cools down quickly to ordinary matter,” Wang says. So it’s important to develop a model to accurately describe the expansion of plasma, he says. The model must rely on a branch of theory called relativistic hydrodynamics in which the motion of fluids is described by equations from Einstein’s theory of special relativity.

    Over the past few years, researchers from the JET Collaboration have developed such a model that can describe the process of expansion and the observed phenomena of an ultra-hot perfect fluid. “This allows us to understand how a jet propagates through this dynamic fireball,” says Wang.

    Employing this model for the quark-gluon plasma expansion and jet propagation, the researchers analyzed combined data from the PHENIX and STAR experiments at RHIC and the ALICE and CMS experiments at LHC, since each accelerator created quark-gluon plasma at different initial temperatures. The team determined one particular property of the quark-gluon plasma, called the jet transport coefficient, which characterizes the strength of interaction between the jet and the ultra-hot matter. “The determined values of the jet transport coefficient can help to shed light on why the ultra-hot matter is the most ideal liquid the universe has ever seen,” Wang says.
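
    For readers who want the quantity in symbols: in the jet-quenching literature the transport coefficient, written \hat{q}, is conventionally defined as the mean squared transverse momentum kick the medium imparts to the propagating jet per unit path length,

        \hat{q} = \frac{\langle p_{\perp}^{2} \rangle}{L},

    so a larger \hat{q} means the plasma deflects and degrades the jet more strongly.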

    PHENIX at BNL

    STAR at BNL

    ALICE at CERN

    CMS at CERN

    Peter Jacobs, head of the experimental group at Berkeley Lab that carried out the first jet and flow measurements with the STAR Collaboration at RHIC, says the new result is “very valuable as a window into the precise nature of the quark gluon plasma. The approach taken by the JET Collaboration to achieve it, by combining efforts of several groups of theorists and experimentalists, shows how to make other precise measurements of properties of the quark gluon plasma in the future.”

    The team’s next steps are to analyze future data at lower RHIC energies and higher LHC energies to see how these temperatures might affect the plasma’s behavior, especially near the phase transition between ordinary matter and the exotic matter of the quark-gluon plasma.

    This work was supported by the DOE Office of Science, Office of Nuclear Physics and used the facilities of the National Energy Research Scientific Computing Center (NERSC) located at Berkeley Lab.

    See the full article here.

  • richardmitnick 4:04 pm on September 29, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “MaxBin: Automated Sorting Through Metagenomes” 

    September 29, 2014
    Lynn Yarris (510) 486-5375

    Microbes – the single-celled organisms that dominate every ecosystem on Earth – have an amazing ability to feed on plant biomass and convert it into other chemical products. Tapping into this talent has the potential to revolutionize energy, medicine, environmental remediation and many other fields. The success of this effort hinges in part on metagenomics, the emerging technology that enables researchers to read all the individual genomes of a sample microbial community at once. However, given that even a teaspoon of soil can contain billions of microbes, there is a great need to be able to cull the genomes of individual microbial species from a metagenomic sequence.

    Enter MaxBin, an automated software program for binning (sorting) the genomes of individual microbial species from metagenomic sequences. Developed at the U.S. Department of Energy (DOE)’s Joint BioEnergy Institute (JBEI), under the leadership of Steve Singer, who directs JBEI’s Microbial Communities Group, MaxBin facilitates the genomic analysis of uncultivated microbial populations that can hold the key to the production of new chemical materials, such as advanced biofuels or pharmaceutical drugs.

    MaxBin, an automated software program for binning the genomes of individual microbial species from metagenomic sequences, is available online through JBEI.

    “MaxBin automates the binning of assembled metagenomic scaffolds using an expectation-maximization algorithm after the assembly of metagenomic sequencing reads,” says Singer, a chemist who also holds an appointment with Berkeley Lab’s Earth Sciences Division. “Previous binning methods either required a significant amount of work by the user, or required a large number of samples for comparison. MaxBin requires only a single sample and is a push-button operation for users.”

    JBEI researchers Yu-Wei Wu, Steve Singer and Danny Tang developed MaxBin to automatically recover individual genomes from metagenomes using an expectation-maximization algorithm. (Photo by Roy Kaltschmidt)

    The key to the success of MaxBin is its expectation-maximization algorithm, which was developed by Yu-Wei Wu, a post-doctoral researcher in Singer’s group. This algorithm enables the classification of metagenomic sequences into discrete bins that represent the genomes of individual microbial populations within a sample community.

    “Using our expectation-maximization algorithm, MaxBin combines information from tetranucleotide frequencies and scaffold coverage levels to organize metagenomic sequences into the individual bins, which are predicted from an initial identification of marker genes in assembled sequences,” Wu says.
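
    A heavily simplified sketch of that idea (illustrative only, not MaxBin’s actual implementation): represent each contig by a feature vector, such as its 136 canonical tetranucleotide frequencies stacked with its coverage, model each bin as a spherical Gaussian, and alternate E- and M-steps.

        import numpy as np

        def em_bin(features, n_bins, n_iter=50, seed=0):
            """Toy EM clustering of contig feature vectors into n_bins."""
            rng = np.random.default_rng(seed)
            n, d = features.shape
            means = features[rng.choice(n, n_bins, replace=False)]
            var = np.full(n_bins, features.var() + 1e-6)
            weights = np.full(n_bins, 1.0 / n_bins)
            for _ in range(n_iter):
                # E-step: responsibility of each bin for each contig.
                sq = ((features[:, None, :] - means[None]) ** 2).sum(-1)
                logp = np.log(weights) - 0.5 * (d * np.log(2 * np.pi * var) + sq / var)
                logp -= logp.max(axis=1, keepdims=True)
                resp = np.exp(logp)
                resp /= resp.sum(axis=1, keepdims=True)
                # M-step: re-estimate bin weights, means and variances.
                nk = resp.sum(axis=0) + 1e-12
                weights = nk / n
                means = (resp.T @ features) / nk[:, None]
                sq = ((features[:, None, :] - means[None]) ** 2).sum(-1)
                var = (sq * resp).sum(axis=0) / (d * nk) + 1e-6
            return resp.argmax(axis=1)  # hard bin assignment per contig

    MaxBin’s published method additionally seeds bins from single-copy marker genes identified in the assembly; the sketch above only conveys the expectation-maximization skeleton.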

    MaxBin was successfully tested on samples from the Human Microbiome Project and from green waste compost. In these tests, which were carried out by Yung-Tsu Tang, a student intern from the City College of San Francisco, MaxBin proved to be highly accurate in its ability to recover individual genomes from metagenomic datasets with variable sequencing coverages.

    “Applying MaxBin to an enriched cellulolytic consortium enabled us to identify a number of uncultivated cellulolytic bacteria, including a myxobacterium that possesses a remarkably reduced genome and expanded set of genes for biomass deconstruction compared to its closest sequenced relatives,” Singer says. “This demonstrates that the processes required for recovering genomes from metagenomic datasets can be applied to understanding biomass breakdown in the environment.”

    MaxBin is now being used at JBEI in its efforts to use microbes for the production of advanced biofuels – gasoline, diesel and jet fuel – from plant biomass. MaxBin is also available for downloading. To date, more than 150 researchers have accessed it.

    A paper describing MaxBin in detail has been published in the journal Microbiome. The paper is titled MaxBin: an automated binning method to recover individual genomes from metagenomes using an expectation-maximization algorithm. Co-authoring this paper in addition to Singer, Wu and Tang, were Susannah Tringe of the Joint Genome Institute, and Blake Simmons of JBEI.

    See the full article here.

  • richardmitnick 3:41 pm on September 27, 2014 Permalink | Reply
    Tags: CO2 studies, Lawrence Berkeley National Laboratory

    From LBL: “Pore models track reactions in underground carbon capture” 

    September 25, 2014

    Using tailor-made software running on top-tier supercomputers, a Lawrence Berkeley National Laboratory team is creating microscopic pore-scale simulations that complement or push beyond laboratory findings.

    Computed pH on calcite grains at 1 micron resolution. The iridescent grains mimic crushed material geoscientists extract from saline aquifers deep underground to study with microscopes. Researchers want to model what happens to the crystals’ geochemistry when the greenhouse gas carbon dioxide is injected underground for sequestration. Image courtesy of David Trebotich, Lawrence Berkeley National Laboratory.

    The models of microscopic underground pores could help scientists evaluate ways to store carbon dioxide produced by power plants, keeping it from contributing to global climate change.

    The models could be a first, says David Trebotich, the project’s principal investigator. “I’m not aware of any other group that can do this, not at the scale at which we are doing it, both in size and computational resources, as well as the geochemistry.” His evidence is a colorful portrayal of jumbled calcite crystals derived solely from mathematical equations.

    The iridescent menagerie is intended to act just like the real thing: minerals geoscientists extract from saline aquifers deep underground. The goal is to learn what will happen when fluids pass through the material should power plants inject carbon dioxide underground.

    Lab experiments can only measure what enters and exits the model system. Now modelers would like to identify more of what happens within the tiny pores that exist in underground materials, as chemicals are dissolved in some places but precipitate in others, potentially resulting in preferential flow paths or even clogs.

    Geoscientists give Trebotich’s group of modelers microscopic computerized tomography (CT, similar to the scans done in hospitals) images of their field samples. That lets both camps probe an anomaly: reactions in the tiny pores happen much more slowly in real aquifers than they do in laboratories.

    Going deep

    Deep saline aquifers are underground formations of salty water found in sedimentary basins all over the planet. Scientists think they’re the best deep geological feature to store carbon dioxide from power plants.

    But experts need to know whether the greenhouse gas will stay bottled up as more and more of it is injected, spreading a fluid plume and building up pressure. “If it’s not going to stay there (geoscientists) will want to know where it is going to go and how long that is going to take,” says Trebotich, who is a computational scientist in Berkeley Lab’s Applied Numerical Algorithms Group.

    He hopes their simulation results ultimately will translate to field scale, where “you’re going to be able to model a CO2 plume over a hundred years’ time and kilometers in distance.” But for now his group’s focus is at the microscale, with attention toward the even smaller nanoscale.

    At such tiny dimensions, flow, chemical transport, mineral dissolution and mineral precipitation occur within the pores where individual grains and fluids commingle, says a 2013 paper Trebotich coauthored with geoscientists Carl Steefel (also of Berkeley Lab) and Sergi Molins in the journal Reviews in Mineralogy and Geochemistry.

    These dynamics, the paper added, create uneven conditions that can produce new structures and self-organized materials – nonlinear behavior that can be hard to describe mathematically.

    Modeling at 1 micron resolution, his group has achieved “the largest pore-scale reactive flow simulation ever attempted” as well as “the first-ever large-scale simulation of pore-scale reactive transport processes on real-pore-space geometry as obtained from experimental data,” says the 2012 annual report of the lab’s National Energy Research Scientific Computing Center (NERSC).

    The simulation required about 20 million processor hours using 49,152 of the 153,216 computing cores in Hopper, a Cray XE6 that at the time was NERSC’s flagship supercomputer.

    Cray Hopper at NERSC

    “As CO2 is pumped underground, it can react chemically with underground minerals and brine in various ways, sometimes resulting in mineral dissolution and precipitation, which can change the porous structure of the aquifer,” the NERSC report says. “But predicting these changes is difficult because these processes take place at the pore scale and cannot be calculated using macroscopic models.

    “The dissolution rates of many minerals have been found to be slower in the field than those measured in the laboratory. Understanding this discrepancy requires modeling the pore-scale interactions between reaction and transport processes, then scaling them up to reservoir dimensions. The new high-resolution model demonstrated that the mineral dissolution rate depends on the pore structure of the aquifer.”
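
    The flavor of the coupled physics can be seen in a deliberately minimal one-dimensional cartoon, far below the fidelity of the pore-scale codes described here (all values are invented for illustration): advection and diffusion of a dissolved species, plus a first-order mineral dissolution source that pushes the fluid toward equilibrium.

        import numpy as np

        nx, length = 200, 1e-3        # 200 cells across 1 mm of pore space
        dx = length / nx
        u, D = 1e-4, 1e-9             # flow velocity (m/s), diffusivity (m^2/s)
        k, c_eq = 1e-3, 1.0           # dissolution rate (1/s), equilibrium conc.
        dt = 0.25 * dx * dx / D       # stable explicit time step
        c = np.zeros(nx)              # dissolved concentration, initially zero

        for _ in range(5000):
            grad = np.gradient(c, dx)        # dc/dx for the advection term
            lap = np.gradient(grad, dx)      # crude Laplacian for diffusion
            c += dt * (-u * grad + D * lap + k * (c_eq - c))
            c[0] = 0.0                       # undersaturated fluid at the inlet
        print(f"outlet concentration after {5000 * dt:.2f} s: {c[-1]:.3f}")

    Even this cartoon shows the competition the text describes: fast flow keeps sweeping in undersaturated fluid that dissolves mineral near the inlet, while slower flow lets the fluid equilibrate, one reason effective dissolution rates depend on pore structure and flow field.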

    Trebotich says “it was the hardest problem that we could do for the first run.” But the group redid the simulation about 2½ times faster in an early trial of Edison, a Cray XC-30 that succeeded Hopper. Edison, Trebotich says, has larger memory bandwidth.

    Cray Edison at NERSC

    Rapid changes

    Generating 1-terabyte data sets for each microsecond time step, the Edison run demonstrated how quickly conditions can change inside each pore. It also provided a good workout for the combination of interrelated software packages the Trebotich team uses.

    The first, Chombo, takes its name from a Swahili word meaning “toolbox” or “container” and was developed by a different Applied Numerical Algorithms Group team. Chombo is a supercomputer-friendly platform that’s scalable: “You can run it on multiple processor cores, and scale it up to do high-resolution, large-scale simulations,” he says.

    Trebotich modified Chombo to add flow and reactive transport solvers. The group also incorporated the geochemistry components of CrunchFlow, a package Steefel developed, to create Chombo-Crunch, the code used for their modeling work. The simulations produce resolutions “very close to imaging experiments,” the NERSC report said, combining simulation and experiment to achieve a key goal of the Department of Energy’s Energy Frontier Research Center for Nanoscale Control of Geologic CO2.

    Now Trebotich’s team has three huge allocations on DOE supercomputers to make their simulations even more detailed. The Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program is providing 80 million processor hours on Mira, an IBM Blue Gene/Q at Argonne National Laboratory. Through the Advanced Scientific Computing Research Leadership Computing Challenge (ALCC), the group has another 50 million hours on NERSC computers and 50 million on Titan, a Cray XK7 at the Oak Ridge Leadership Computing Facility. The team also held an ALCC award last year for 80 million hours at Argonne and 25 million at NERSC.

    Mira at Argonne

    Titan at Oak Ridge

    With the computer time, the group wants to refine their image resolutions to half a micron (half of a millionth of a meter). “This is what’s known as the mesoscale: an intermediate scale that could make it possible to incorporate atomistic-scale processes involving mineral growth at precipitation sites into the pore scale flow and transport dynamics,” Trebotich says.

    Meanwhile, he thinks their micron-scale simulations already are good enough to provide “ground-truthing” in themselves for the lab experiments geoscientists do.

    See the full article here.

  • richardmitnick 1:49 pm on September 3, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National Laboratory, Peptoids

    From LBL: “Peptoid Nanosheets at the Oil/Water Interface” 

    September 3, 2014
    Lynn Yarris (510) 486-5375

    From the people who brought us peptoid nanosheets that form at the interface between air and water, now come peptoid nanosheets that form at the interface between oil and water. Scientists at the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have developed peptoid nanosheets – two-dimensional biomimetic materials with customizable properties – that self-assemble at an oil-water interface. This new development opens the door to designing peptoid nanosheets of increasing structural complexity and chemical functionality for a broad range of applications, including improved chemical sensors and separators, and safer, more effective drug delivery vehicles.

    “Supramolecular assembly at an oil-water interface is an effective way to produce 2D nanomaterials from peptoids because that interface helps pre-organize the peptoid chains to facilitate their self-interaction,” says Ron Zuckermann, a senior scientist at the Molecular Foundry, a DOE nanoscience center hosted at Berkeley Lab. “This increased understanding of the peptoid assembly mechanism should enable us to scale up to produce large quantities, or scale down to screen many different nanosheets for novel functions.”

    Peptoid nanosheets are among the largest and thinnest free-floating organic crystals ever made, with an area-to-thickness equivalent of a plastic sheet covering a football field. Peptoid nanosheets can be engineered to carry out a wide variety of functions.

    Ron Zuckermann and Geraldine Richmond led the development of peptoid nanosheets that form at the interface between oil and water, opening the door to increased structural complexity and chemical functionality for a broad range of applications.

    Zuckermann, who directs the Molecular Foundry’s Biological Nanostructures Facility, and Geraldine Richmond of the University of Oregon are the corresponding authors of a paper reporting these results in the Proceedings of the National Academy of Sciences (PNAS). The paper is titled Assembly and molecular order of two-dimensional peptoid nanosheets at the oil-water interface. Co-authors are Ellen Robertson, Gloria Olivier, Menglu Qian and Caroline Proulx.

    Peptoids are synthetic versions of proteins. Like their natural counterparts, peptoids fold and twist into distinct conformations that enable them to carry out a wide variety of specific functions. In 2010, Zuckermann and his group at the Molecular Foundry discovered a technique to synthesize peptoids into sheets that were just a few nanometers thick but up to 100 micrometers in length. These were among the largest and thinnest free-floating organic crystals ever made, with an area-to-thickness equivalent of a plastic sheet covering a football field. Just as the properties of peptoids can be chemically customized through robotic synthesis, the properties of peptoid nanosheets can also be engineered for specific functions.

    “Peptoid nanosheet properties can be tailored with great precision,” Zuckermann says, “and since peptoids are less vulnerable to chemical or metabolic breakdown than proteins, they are a highly promising platform for self-assembling bio-inspired nanomaterials.”

    In this latest effort, Zuckermann, Richmond and their co-authors used vibrational sum frequency spectroscopy to probe the molecular interactions between the peptoids as they assembled at the oil-water interface. These measurements revealed that peptoid polymers adsorbed to the interface are highly ordered, and that this order is greatly influenced by interactions between neighboring molecules.

    “We can literally see the polymer chains become more organized the closer they get to one another,” Zuckermann says.

    Peptoid polymers adsorbed to the oil-water interface are highly ordered thanks to interactions between neighboring molecules.

    The substitution of oil in place of air creates a raft of new opportunities for the engineering and production of peptoid nanosheets. For example, the oil phase could contain chemical reagents, serve to minimize evaporation of the aqueous phase, or enable microfluidic production.

    “The production of peptoid nanosheets in microfluidic devices means that we should soon be able to make combinatorial libraries of different functionalized nanosheets and screen them on a very small scale,” Zuckermann says. “This would be advantageous in the search for peptoid nanosheets with the molecular recognition and catalytic functions of proteins.”

    Zuckermann and his group at the Molecular Foundry are now investigating the addition of chemical reagents or cargo to the oil phase, and exploring their interactions with the peptoid monolayers that form during the nanosheet assembly process.

    “In the future we may be able to produce nanosheets with drugs, dyes, nanoparticles or other solutes trapped in the interior,” he says. “These new nanosheets could have a host of interesting biomedical, mechanical and optical properties.”

    This work was primarily funded by the DOE Office of Science and the Defense Threat Reduction Agency. Part of the research was performed at the Molecular Foundry and the Advanced Light Source, which are DOE Office of Science User Facilities.

    See the full article here.

  • richardmitnick 1:41 pm on August 29, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “Going to Extremes for Enzymes” 

    August 29, 2014
    Lynn Yarris (510) 486-5375

    In the age-old nature versus nurture debate, Douglas Clark, a faculty scientist with Berkeley Lab and the University of California (UC) Berkeley, is not taking sides. In the search for enzymes that can break lignocellulose down into biofuel sugars under the extreme conditions of a refinery, he has prospected for extremophilic microbes and engineered his own cellulases.

    Extremophiles thriving in thermal springs where the water temperature can be close to boiling can be a rich source of enzymes for the deconstruction of lignocellulose.

    Speaking at the national meeting of the American Chemical Society (ACS) in San Francisco, Clark discussed research for the Energy Biosciences Institute (EBI) in which he and his collaborators are investigating ways to release plant sugars from lignin for the production of liquid transportation fuels. Sugars can be fermented into fuels once the woody matter comprised of cellulose, hemicellulose, and lignin is broken down, but lignocellulose is naturally recalcitrant.

    “Lignocellulose is designed by nature to stand tall and resist being broken down, and lignin in particular acts like a molecular glue to help hold it together,” said Clark, who holds appointments with Berkeley Lab’s Physical Biosciences Division and UC Berkeley’s Chemical and Biomolecular Engineering Department, where he currently serves as dean of the College of Chemistry. “Consequently, lignocellulosic biomass must undergo either chemical or enzymatic deconstruction to release the sugars that can be fermented to biofuels.”

    Douglas Clark holds joint appointments with Berkeley Lab and UC Berkeley and is a principal investigator with the Energy Biosciences Institute. (Photo by Roy Kaltschmidt)

    For various chemical reasons, all of which add up to cost-competitiveness, biorefineries could benefit if the production of biofuels from lignocellulosic biomass is carried out at temperatures between 65 and 70 degrees Celsius. The search by Clark and his EBI colleagues for cellulases that can tolerate these and even harsher conditions led them to thermal springs near Gerlach, Nevada, where the water temperature can be close to boiling. There they discovered a consortium of three hyperthermophilic Archaea that could grow on crystalline cellulose at 90 degrees Celsius.

    “This consortium represents the first instance of Archaea able to deconstruct lignocellulose optimally above 90°C,” Clark said.

    Metagenomic studies of the consortium identified its most active high-temperature cellulase, which was named EBI-244.

    “The EBI-244 cellulase is active at temperatures as high as 108 degrees Celsius, making it the most heat-tolerant enzyme ever found in a cellulose-digesting microbe,” Clark said.

    The most recent expedition by Clark and his colleagues was to hot springs in Lassen Volcanic National Park, where they found an enzyme active on cellulose at temperatures up to 100°C under highly acidic conditions (approximately pH 2.2).

    “The Lassen enzyme is the most acidothermophilic cellulase yet discovered,” Clark said. “The final products that it forms are similar to those produced by EBI-244.”

    A consortium of three hyperthermophilic Archaea that could grow on crystalline cellulose at 90 degrees Celsius yielded EBI-244, the most active high-temperature cellulase ever identified.
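
    To make the screening criteria concrete, here is a minimal sketch in Python of how candidate enzymes might be filtered against a refinery’s target operating window. Only the temperature ceilings for EBI-244 and the Lassen enzyme come from the figures quoted above; the pH values and the third enzyme are hypothetical placeholders, not measured data.

    ```python
    # Minimal sketch: screening candidate cellulases against a target
    # process window. Only the temperature ceilings for EBI-244 and the
    # Lassen enzyme come from the article; all other values are assumed.

    candidates = {
        "EBI-244":     {"max_temp_c": 108, "min_ph": 5.0},  # min_ph assumed
        "Lassen":      {"max_temp_c": 100, "min_ph": 2.2},
        "mesophile-X": {"max_temp_c": 55,  "min_ph": 4.5},  # hypothetical enzyme
    }

    def tolerates(props, process_temp_c, process_ph):
        """True if the enzyme survives the refinery's operating conditions."""
        return (props["max_temp_c"] >= process_temp_c
                and props["min_ph"] <= process_ph)

    # Target window from the article: 65-70 degrees Celsius; pH 5.0 is an
    # assumed value for the pretreated biomass slurry.
    hits = [name for name, props in candidates.items() if tolerates(props, 70, 5.0)]
    print(hits)  # ['EBI-244', 'Lassen']
    ```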

    In addition to bioprospecting for heat-tolerant enzymes, Clark and his colleagues have developed a simple and effective mutagenesis method to enhance the properties of natural enzymes. Most recently they used this technique to increase the optimal temperature and enhance the thermostability of Cel7A, a fungal cellulase present in high concentrations in commercial cellulase cocktails. They engineered yeast to produce this enzyme, with encouraging results.

    “The yeast Saccharomyces cerevisiae has often been used both in the engineering and basic study of Cel7A; however, Cel7A enzymes recombinantly expressed in yeast are often less active and less stable than their native counterparts,” Clark said. “We discovered that an important post-translational modification that was sometimes absent in the yeast-expressed enzyme was the underlying cause of this disparity and successfully carried out the post-translational modification in vitro. After this treatment, the properties of Cel7A recombinantly expressed in yeast were improved to match those of the native enzyme.”

    Collaborators in this research include Harvey Blanch, who also holds joint appointments with Berkeley Lab and UC Berkeley, and Frank Robb from the University of Maryland.

    EBI, which provided the funding for this research, is a collaborative partnership among BP (the institute’s funding agency), UC Berkeley, Berkeley Lab and the University of Illinois at Urbana-Champaign.

    See the full article here.

  • richardmitnick 8:06 pm on August 27, 2014 Permalink | Reply
    Tags: , , , , Lawrence Berkeley National laboratory   

    From Berkeley Lab: “Encyclopedia of How Genomes Function Gets Much Bigger” 

    Berkeley Logo

    Berkeley Lab

    August 27, 2014
    Dan Krotz 510-486-4019

    A big step in understanding the mysteries of the human genome was unveiled today in the form of three analyses that provide the most detailed comparison yet of how the genomes of the fruit fly, roundworm, and human function.

    The research, appearing August 28 in the journal Nature, compares how the information encoded in the three species’ genomes is “read out,” and how their DNA and proteins are organized into chromosomes.

    The results add billions of entries to a publicly available archive of functional genomic data. Scientists can use this resource to discover common features that apply to all organisms. These fundamental principles will likely offer insights into how the information in the human genome regulates development, and how its disruption contributes to disease.

    Berkeley Lab scientists contributed to an NHGRI effort that provides the most detailed comparison yet of how the genomes of the fruit fly, roundworm, and human function. (Credit: Darryl Leja, NHGRI)

    The analyses were conducted by two consortia of scientists that include researchers from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab). Both efforts were funded by the National Institutes of Health’s National Human Genome Research Institute.

    One of the consortia, the “model organism Encyclopedia of DNA Elements” (modENCODE) project, catalogued the functional genomic elements in the fruit fly and roundworm. Susan Celniker and Gary Karpen of Berkeley Lab’s Life Sciences Division led two fruit fly research groups in this consortium. Ben Brown, also with the Life Sciences Division, participated in another consortium, ENCODE, to identify the functional elements in the human genome.

    The consortia are addressing one of the big questions in biology today: now that the human genome and many other genomes have been sequenced, how does the information encoded in an organism’s genome make an organism what it is? To find out, scientists have for the past several years studied the genomes of model organisms such as the fruit fly and roundworm, which are smaller than the human genome yet share many genes and biological pathways with it. This research has led to a better understanding of human gene function, development, and disease.

    Comparing Transcriptomes

    In all organisms, the information encoded in genomes is transcribed into RNA molecules that are either translated into proteins or used directly to perform functions in the cell. The collection of RNA molecules expressed in a cell is known as its transcriptome, which can be thought of as the “read out” of the genome.

    In the research announced today, dozens of scientists from several institutions looked for similarities and differences in the transcriptomes of human, roundworm, and fruit fly. They used deep sequencing technology and bioinformatics to generate large amounts of matched RNA-sequencing data for the three species. This involved 575 experiments that produced more than 67 billion sequence reads.
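
    As a toy illustration of the kind of normalization such matched RNA-sequencing comparisons depend on, the sketch below converts raw read counts into transcripts per million (TPM), a standard length- and depth-corrected expression measure. The gene names, lengths and counts are invented, and the consortia’s actual pipelines are far more elaborate.

    ```python
    # Toy TPM (transcripts per million) normalization of raw RNA-seq counts.
    # Gene names, lengths and read counts are invented for illustration.

    genes = {  # gene: (transcript length in base pairs, raw read count)
        "geneA": (1_500, 300),
        "geneB": (4_000, 400),
        "geneC": (900, 90),
    }

    # Step 1: reads per kilobase, correcting for the fact that longer
    # transcripts accumulate more reads at the same expression level.
    rpk = {g: count / (length / 1_000) for g, (length, count) in genes.items()}

    # Step 2: rescale so each sample sums to one million, making values
    # comparable across sequencing runs of different depths.
    per_million = sum(rpk.values()) / 1_000_000
    tpm = {g: v / per_million for g, v in rpk.items()}

    for g, v in sorted(tpm.items()):
        print(f"{g}: {v:,.0f} TPM")  # geneA: 500,000 TPM, etc.
    ```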

    A team led by Celniker, with help from Brown and scientists from several other labs, conducted the fruit fly portion of this research. They mapped the organism’s transcriptome at 30 time points of its development. They also explored how environmental perturbations such as heavy metals, herbicides, caffeine, alcohol and temperature affect the fly’s transcriptome. The result is the finest time-resolution analysis of the fly genome’s “read out” to date—and a mountain of new data.

    “We went from two billion reads in research we published in 2011, to 20 billion reads today,” says Celniker. “As a result, we found that the transcriptome is much more extensive and complex than previously thought. It has more long non-coding RNAs and more promoters.”

    When the scientists compared transcriptome data from all three species, they discovered 16 gene-expression modules corresponding to processes such as transcription and cell division that are conserved in the three animals. They also found a similar pattern of gene expression at an early stage of embryonic development in all three organisms.
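
    For a flavor of how shared expression modules can fall out of such a comparison, here is a minimal clustering sketch on synthetic ortholog-expression data. The matrix, the use of k-means, and the module count of three (standing in for the paper’s 16) are illustrative assumptions, not the consortia’s actual method.

    ```python
    # Minimal sketch: clustering orthologous genes by expression profile
    # across matched conditions to recover co-expression "modules".
    # Synthetic data; the published analysis is far more sophisticated.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Rows: orthologous genes; columns: matched conditions (e.g., stages
    # across species). Three synthetic modules with distinct profiles.
    profiles = np.array([[5, 1, 1, 5], [1, 5, 5, 1], [1, 1, 5, 5]], dtype=float)
    expr = np.vstack([p + rng.normal(0, 0.3, size=(20, 4)) for p in profiles])

    # Z-score each gene so clustering reflects profile shape, not magnitude.
    z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

    # Recover the modules; n_clusters=3 stands in for the paper's 16.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
    print(np.bincount(labels))  # roughly 20 genes per recovered module
    ```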

    This work is described in a Nature article entitled “Comparative analysis of the transcriptome across distant species.”

    Comparing Chromatin

    Another group, also consisting of dozens of scientists from several institutions, analyzed chromatin, the combination of DNA and proteins that organizes an organism’s genome into chromosomes. Chromatin influences nearly every aspect of genome function.

    Karpen led the fruit fly portion of this work, with Harvard Medical School’s Peter Park contributing on the bioinformatics side, and scientists from several other labs also participating. The team mapped the distribution of chromatin proteins in the fruit fly genome. They also learned how chemical modifications to chromatin proteins impact genome functions.

    Their results were compared with results from human and roundworm chromatin research. In all, the group generated 800 new chromatin datasets from different cell lines and developmental stages of the three species, bringing the total number of datasets to more than 1400. These datasets are presented in a Nature article entitled “Comparative analysis of metazoan chromatin organization.”

    Here again, the scientists found many conserved chromatin features among the three organisms. They also found significant differences, such as in the composition and locations of repressive chromatin.
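
    As a toy version of one such comparison, the sketch below measures how much two chromatin annotations (say, repressive domains mapped in two cell lines) occupy the same ground, using a base-pair Jaccard overlap. The coordinates are invented, and real analyses operate genome-wide on far richer data.

    ```python
    # Toy comparison of two chromatin annotations by base-pair Jaccard
    # overlap. All coordinates are invented for illustration.

    def merge(intervals):
        """Merge overlapping (start, end) intervals on one chromosome."""
        merged = []
        for start, end in sorted(intervals):
            if merged and start <= merged[-1][1]:
                merged[-1] = (merged[-1][0], max(merged[-1][1], end))
            else:
                merged.append((start, end))
        return merged

    def covered(intervals):
        """Total base pairs covered by an interval set."""
        return sum(end - start for start, end in merge(intervals))

    def intersection(a, b):
        """Total base pairs covered by both interval sets."""
        a, b = merge(a), merge(b)
        total = i = j = 0
        while i < len(a) and j < len(b):
            lo, hi = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
            total += max(0, hi - lo)
            # Advance whichever interval ends first.
            if a[i][1] < b[j][1]:
                i += 1
            else:
                j += 1
        return total

    domains_1 = [(100, 500), (800, 1_200)]
    domains_2 = [(300, 900), (1_100, 1_300)]
    inter = intersection(domains_1, domains_2)
    union = covered(domains_1) + covered(domains_2) - inter
    print(f"Jaccard overlap: {inter / union:.2f}")  # 0.33
    ```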

    But perhaps the biggest scientific dividend is the data itself.

    “We found many insights that need follow-up,” says Karpen. “And we’ve also greatly increased the amount of data that others can access. These datasets and analyses will provide a rich resource for comparative and species-specific investigations of how genomes, including the human genome, function.”

    See the full article here.
