Tagged: Lawrence Berkeley National laboratory

  • richardmitnick 8:43 pm on October 3, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory

    From LBL: “RCas9: A Programmable RNA Editing Tool” 


    October 3, 2014
    Lynn Yarris (510) 486-5375

    A powerful scientific tool for editing the DNA instructions in a genome can now also be applied to RNA, the molecule that translates DNA’s genetic instructions into the production of proteins. A team of researchers with Berkeley Lab and the University of California (UC) Berkeley has demonstrated a means by which the CRISPR/Cas9 protein complex can be programmed to recognize and cleave RNA at sequence-specific target sites. This finding has the potential to transform the study of RNA function by paving the way for direct RNA transcript detection, analysis and manipulation.

    Schematic shows how RNA-guided Cas9 working with PAMmer can target ssRNA for programmable, sequence-specific cleavage.

    Led by Jennifer Doudna, a biochemist and leading authority on the CRISPR/Cas9 complex, the Berkeley team showed how the Cas9 enzyme can work with short DNA sequences known as “PAM,” for protospacer adjacent motif, to identify and bind to specific sites on single-stranded RNA (ssRNA). The team is designating this RNA-targeting CRISPR/Cas9 complex as RCas9.

    “Using specially designed PAM-presenting oligonucleotides, or PAMmers, RCas9 can be specifically directed to bind or cut RNA targets while avoiding corresponding DNA sequences, or it can be used to isolate specific endogenous messenger RNA from cells,” says Doudna, who holds joint appointments with Berkeley Lab’s Physical Biosciences Division and UC Berkeley’s Department of Molecular and Cell Biology and Department of Chemistry, and is also an investigator with the Howard Hughes Medical Institute (HHMI). “Our results reveal a fundamental connection between PAM binding and substrate selection by RCas9, and highlight the utility of RCas9 for programmable RNA transcript recognition without the need for genetically introduced tags.”

    Biochemist Jennifer Doudna is a leading authority on the CRISPR/Cas9 complex. (Photo by Roy Kaltschmidt)

    From safer, more effective medicines and clean, green, renewable fuels, to the clean-up and restoration of our air, water and land, the potential is there for genetically engineered bacteria and other microbes to produce valuable goods and perform critical services. To exploit the vast potential of microbes, scientists must be able to precisely edit their genetic information.

    In recent years, the CRISPR/Cas complex has emerged as one of the most effective tools for doing this. CRISPR, which stands for Clustered Regularly Interspaced Short Palindromic Repeats, is a central part of the bacterial immune system and handles sequence recognition. Cas9 – Cas stands for CRISPR-associated – is an RNA-guided enzyme that handles the snipping of DNA strands at the specified sequence site.

    Together, CRISPR and Cas9 can be used to precisely edit the DNA instructions in a targeted genome for making desired types of proteins. The DNA is cut at a specific location so that old DNA instructions can be removed and/or new instructions inserted.

    Until now, it was thought that Cas9 could not be used on the RNA molecules that transcribe those DNA instructions into the desired proteins.

    “Just as Cas9 can be used to cut or bind DNA in a sequence-specific manner, RCas9 can cut or bind RNA in a sequence-specific manner,” says Mitchell O’Connell, a member of Doudna’s research group and the lead author of a paper describing this research in Nature, titled Programmable RNA recognition and cleavage by CRISPR/Cas9. Doudna is the corresponding author. Other co-authors are Benjamin Oakes, Samuel Sternberg, Alexandra East-Seletsky and Matias Kaplan.

    Benjamin Oakes and Mitch O’Connell are part of the collaboration led by Jennifer Doudna that showed how the CRISPR/Cas9 complex can serve as a programmable RNA editor. (Photo by Roy Kaltschmidt)

    In an earlier study, Doudna and her group showed that the genome editing ability of Cas9 is made possible by the presence of PAM, which marks where cutting is to commence and activates the enzyme’s strand-cleaving activity. In this latest study, Doudna, O’Connell and their collaborators show that PAMmers, in a similar manner, can also stimulate site-specific endonucleolytic cleavage of ssRNA targets. They used Cas9 enzymes from the bacterium Streptococcus pyogenes to perform a variety of in vitro cleavage experiments using a panel of RNA and DNA targets.

    “While RNA interference has proven useful for manipulating gene regulation in certain organisms, there has been a strong motivation to develop an orthogonal nucleic-acid-based RNA-recognition system such as RCas9,” Doudna says. “The molecular basis for RNA recognition by RCas9 is now clear and requires only the design and synthesis of a matching guide RNA and complementary PAMmer.”
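    To make that last point concrete, here is a minimal Python sketch of the kind of bookkeeping involved in choosing a matching guide-RNA spacer and a complementary DNA PAMmer for a chosen ssRNA target. The 20-nucleotide spacer length, the PAMmer length and placement, and the function names are illustrative assumptions, not the published design rules.

    # Illustrative only: derive a guide-RNA spacer and a DNA PAMmer for an
    # RCas9-style ssRNA target.  Spacer length, PAM placement and orientations
    # are simplifying assumptions, not the rules reported in the paper.
    DNA_COMP = str.maketrans("ACGT", "TGCA")
    RNA_COMP = str.maketrans("ACGU", "UGCA")

    def revcomp_dna(seq):
        """Reverse complement of a DNA sequence (5'->3' in, 5'->3' out)."""
        return seq.translate(DNA_COMP)[::-1]

    def revcomp_rna(seq):
        """Reverse complement of an RNA sequence, returned as RNA."""
        return seq.translate(RNA_COMP)[::-1]

    def design_rcas9_oligos(target_rna, start, spacer_len=20):
        """Pick a spacer_len window of the ssRNA target beginning at `start` and
        return (guide_spacer_rna, pammer_dna).  The PAMmer here is simply the DNA
        complement of the window's last bases plus the next 8 nt, so that it
        hybridizes next to the guide:target duplex and presents a PAM-like motif."""
        window = target_rna[start:start + spacer_len]
        flank = target_rna[start + spacer_len:start + spacer_len + 8]
        guide_spacer = revcomp_rna(window)              # base-pairs with the target
        pammer = revcomp_dna((window[-4:] + flank).replace("U", "T"))
        return guide_spacer, pammer

    mrna = "AUGGCUAGCUUCGGAUCCGGAAGCUUGCGGCCGCAUAGUAAGCC"
    guide, pammer = design_rcas9_oligos(mrna, start=2)
    print("guide spacer (RNA):", guide)
    print("PAMmer       (DNA):", pammer)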

    The researchers envision a wide range of potential applications for RCas9. For example, an RCas9 tethered to a protein translation initiation factor and targeted to a specific mRNA could essentially act as a designer translation factor to “up-” or “down-” regulate protein synthesis from that mRNA.

    “Tethering RCas9 to beads could be used to isolate RNA or native RNA–protein complexes of interest from cells for downstream analysis or assays,” O’Connell says. “RCas9 fused to select protein domains could promote or exclude specific introns or exons, and RCas9 tethered to a fluorescent protein could be used to observe RNA localization and transport in living cells.”

    This research was primarily supported by the NIH-funded Center for RNA Systems Biology.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 8:21 pm on October 2, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory

    From LBL: “A Closer Look at the Perfect Fluid” 


    October 2, 2014
    Kate Greene 510-486-4404

    Researchers at Berkeley Lab and their collaborators have honed a way to probe the quark-gluon plasma, the kind of matter that dominated the universe immediately after the big bang.

    A simulated collision of lead ions, courtesy of the ALICE experiment at CERN.

    By combining data from two high-energy accelerators, nuclear scientists have refined the measurement of a remarkable property of exotic matter known as quark-gluon plasma. The findings reveal new aspects of the ultra-hot, “perfect fluid” that give clues to the state of the young universe just microseconds after the big bang.

    The multi-institutional team known as the JET Collaboration, led by researchers at the U.S. Department of Energy’s Lawrence Berkeley National Lab (Berkeley Lab), published their results in a recent issue of Physical Review C. The JET Collaboration is one of the Topical Collaborations in nuclear theory established by the DOE Office of Science in 2010. JET, which stands for Quantitative Jet and Electromagnetic Tomography, aims to study the probes used to investigate high-energy, heavy-ion collisions. The Collaboration currently has 12 participating institutions with Berkeley Lab as the leading institute.

    “We have made, by far, the most precise extraction to date of a key property of the quark-gluon plasma, which reveals the microscopic structure of this almost perfect liquid,” says Xin-Nian Wang, physicist in the Nuclear Science Division at Berkeley Lab and managing principal investigator of the JET Collaboration. Perfect liquids, Wang explains, have the lowest viscosity-to-density ratio allowed by quantum mechanics, which means they essentially flow without friction.

    Hot Plasma Soup

    To create and study the quark-gluon plasma, nuclear scientists used particle accelerators called the Relativistic Heavy-ion Collider (RHIC) at the Brookhaven National Laboratory in New York and the Large Hadron Collider (LHC) at CERN in Switzerland. By accelerating heavy atomic nuclei to high energies and blasting them into each other, scientists are able to recreate the hot temperature conditions of the early universe.

    RHIC at BNL

    LHC at CERN

    Inside the protons and neutrons that make up the colliding atomic nuclei are elementary particles called quarks, which are bound together tightly by other elementary particles called gluons. Only under extreme conditions, such as collisions in which temperatures exceed those at the center of the sun by a million times, do quarks and gluons pull apart to become the ultra-hot, frictionless perfect fluid known as quark-gluon plasma.

    “The temperature is so high that the boundaries between different nuclei disappear so everything becomes a hot-plasma soup of quarks and gluons,” says Wang. This ultra-hot soup is contained within a chamber in the particle accelerator, but it is short-lived—quickly cooling and expanding—making it a challenge to measure. Experimentalists have developed sophisticated tools to overcome the challenge, but translating experimental observations into precise quantitative understanding of the quark-gluon plasma has been difficult to achieve until now, he says.

    Looking Inside

    In this new work, Wang’s team refined a probe that makes use of a phenomenon researchers at Berkeley Lab first theoretically outlined 20 years ago: energy loss of a high-energy particle, called a jet, inside the quark-gluon plasma.

    “When a hot quark-gluon plasma is generated, sometimes you also produce these very energetic particles with an energy a thousand times larger than that of the rest of the matter,” says Wang. This jet propagates through the plasma, scatters, and loses energy on its way out.

    Since the researchers know the energy of the jet when it is produced, and can measure its energy coming out, they can calculate its energy loss, which provides clues to the density of the plasma and the strength of its interaction with the jet. “It’s like an x-ray going through a body so you can see inside,” says Wang.
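    As a back-of-the-envelope illustration of that bookkeeping (not the JET Collaboration’s actual analysis, which folds jet energy loss into the full hydrodynamic model described below), the Python sketch here converts an assumed energy loss into a jet transport coefficient using a simple quadratic path-length scaling; the prefactor, energies and path length are placeholder values.

    # Toy jet-tomography estimate: assume a radiative-loss scaling
    #   delta_E ~ prefactor * q_hat * L^2
    # and invert it for the jet transport coefficient q_hat.  All numbers are
    # placeholders chosen only to show the bookkeeping, not measured values.
    HBARC_GEV_FM = 0.1973                    # hbar*c = 0.1973 GeV*fm
    INV_GEV_PER_FM = 1.0 / HBARC_GEV_FM      # 1 fm expressed in GeV^-1

    def energy_loss(e_initial_gev, e_final_gev):
        """Energy the jet deposited in the medium, in GeV."""
        return e_initial_gev - e_final_gev

    def q_hat_from_loss(delta_e_gev, path_length_fm, prefactor=0.25):
        """Invert delta_E = prefactor * q_hat * L^2 for q_hat (GeV^2/fm).
        The dimensionless prefactor stands in for coupling and color factors."""
        return delta_e_gev / (prefactor * path_length_fm**2 * INV_GEV_PER_FM)

    e_in, e_out = 100.0, 80.0    # GeV: assumed produced vs. measured jet energy
    length_fm = 5.0              # fm: assumed path length through the plasma
    d_e = energy_loss(e_in, e_out)
    print(f"energy loss: {d_e:.0f} GeV")
    print(f"q_hat estimate: {q_hat_from_loss(d_e, length_fm):.2f} GeV^2/fm")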

    Xin-Nian Wang, physicist in the Nuclear Science Division at Berkeley Lab and managing principal investigator of the JET Collaboration.

    One difficulty in using a jet as an x-ray of the quark-gluon plasma is the fact that a quark-gluon plasma is a rapidly expanding ball of fire—it doesn’t sit still. “You create this hot fireball that expands very fast as it cools down quickly to ordinary matter,” Wang says. So it’s important to develop a model to accurately describe the expansion of plasma, he says. The model must rely on a branch of theory called relativistic hydrodynamics in which the motion of fluids is described by equations from Einstein’s theory of special relativity.

    Over the past few years, researchers from the JET Collaboration have developed such a model that can describe the process of expansion and the observed phenomena of an ultra-hot perfect fluid. “This allows us to understand how a jet propagates through this dynamic fireball,” says Wang.

    Employing this model for the quark-gluon plasma expansion and jet propagation, the researchers analyzed combined data from the PHENIX and STAR experiments at RHIC and the ALICE and CMS experiments at the LHC, since each accelerator created quark-gluon plasma at different initial temperatures. The team determined one particular property of the quark-gluon plasma, called the jet transport coefficient, which characterizes the strength of interaction between the jet and the ultra-hot matter. “The determined values of the jet transport coefficient can help to shed light on why the ultra-hot matter is the most ideal liquid the universe has ever seen,” Wang says.

    PHENIX at BNL

    STAR at BNL

    ALICE at CERN

    CMS at CERN

    Peter Jacobs, head of the experimental group at Berkeley Lab that carried out the first jet and flow measurements with the STAR Collaboration at RHIC, says the new result is “very valuable as a window into the precise nature of the quark gluon plasma. The approach taken by the JET Collaboration to achieve it, by combining efforts of several groups of theorists and experimentalists, shows how to make other precise measurements of properties of the quark gluon plasma in the future.”

    The team’s next steps are to analyze future data at lower RHIC energies and higher LHC energies to see how these temperatures might affect the plasma’s behavior, especially near the phase transition between ordinary matter and the exotic matter of the quark-gluon plasma.

    This work was supported by the DOE Office of Science, Office of Nuclear Physics and used the facilities of the National Energy Research Scientific Computing Center (NERSC) located at Berkeley Lab.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 4:04 pm on September 29, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory

    From LBL: “MaxBin: Automated Sorting Through Metagenomes” 


    September 29, 2014
    Lynn Yarris (510) 486-5375

    Microbes – the single-celled organisms that dominate every ecosystem on Earth – have an amazing ability to feed on plant biomass and convert it into other chemical products. Tapping into this talent has the potential to revolutionize energy, medicine, environmental remediation and many other fields. The success of this effort hinges in part on metagenomics, the emerging technology that enables researchers to read all the individual genomes of a sample microbial community at once. However, given that even a teaspoon of soil can contain billions of microbes, there is a great need to be able to cull the genomes of individual microbial species from a metagenomic sequence.

    Enter MaxBin, an automated software program for binning (sorting) the genomes of individual microbial species from metagenomic sequences. Developed at the U.S. Department of Energy (DOE)’s Joint BioEnergy Institute (JBEI), under the leadership of Steve Singer, who directs JBEI’s Microbial Communities Group, MaxBin facilitates the genomic analysis of uncultivated microbial populations that can hold the key to the production of new chemical materials, such as advanced biofuels or pharmaceutical drugs.

    MaxBin, an automated software program for binning the genomes of individual microbial species from metagenomic sequences, is available on-line through JBEI.

    “MaxBin automates the binning of assembled metagenomic scaffolds using an expectation-maximization algorithm after the assembly of metagenomic sequencing reads,” says Singer, a chemist who also holds an appointment with Berkeley Lab’s Earth Sciences Division. “Previous binning methods either required a significant amount of work by the user, or required a large number of samples for comparison. MaxBin requires only a single sample and is a push-button operation for users.”

    JBEI researchers Yu-Wei Wu, Steve Singer and Danny Tang developed MaxBin to automatically recover individual genomes from metagenomes using an expectation-maximization algorithm. (Photo by Roy Kaltschmidt)

    The key to the success of MaxBin is its expectation-maximization algorithm, which was developed by Yu-Wei Wu, a post-doctoral researcher in Singer’s group. This algorithm enables the classification of metagenomic sequences into discrete bins that represent the genomes of individual microbial populations within a sample community.

    “Using our expectation-maximization algorithm, MaxBin combines information from tetranucleotide frequencies and scaffold coverage levels to organize metagenomic sequences into the individual bins, which are predicted from an initial identification of marker genes in assembled sequences,” Wu says.
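    For readers curious what an expectation-maximization binning step looks like in practice, here is a small self-contained Python sketch in the same spirit (it is not MaxBin’s code): each scaffold is summarized by its tetranucleotide frequencies plus its coverage, and a Gaussian mixture model, which is fitted by EM, assigns scaffolds to bins. The synthetic scaffolds, the two-bin setting and the feature scaling are assumptions for illustration.

    # EM-style binning sketch in the spirit of MaxBin (not its implementation):
    # cluster scaffolds on tetranucleotide frequencies plus coverage with a
    # Gaussian mixture model, whose fitting procedure is expectation-maximization.
    from itertools import product
    import numpy as np
    from sklearn.mixture import GaussianMixture

    TETRAMERS = ["".join(p) for p in product("ACGT", repeat=4)]   # 256 features
    TETRA_INDEX = {t: i for i, t in enumerate(TETRAMERS)}

    def tetra_freqs(seq):
        """Normalized tetranucleotide frequency vector for one scaffold."""
        counts = np.zeros(len(TETRAMERS))
        for i in range(len(seq) - 3):
            idx = TETRA_INDEX.get(seq[i:i + 4])
            if idx is not None:                 # skip windows containing N, etc.
                counts[idx] += 1
        total = counts.sum()
        return counts / total if total else counts

    def bin_scaffolds(scaffolds, coverages, n_bins=2, seed=0):
        """Return a bin label for each (sequence, coverage) pair via EM."""
        features = np.array([np.append(tetra_freqs(seq), np.log1p(cov))
                             for seq, cov in zip(scaffolds, coverages)])
        gmm = GaussianMixture(n_components=n_bins, covariance_type="diag",
                              random_state=seed)
        return gmm.fit_predict(features)

    # Two synthetic "species" that differ in GC content and coverage.
    rng = np.random.default_rng(0)
    def fake_scaffold(gc, length=2000):
        probs = [(1 - gc) / 2, gc / 2, gc / 2, (1 - gc) / 2]      # A, C, G, T
        return "".join(rng.choice(list("ACGT"), size=length, p=probs))
    scaffolds = [fake_scaffold(0.35) for _ in range(10)] + [fake_scaffold(0.65) for _ in range(10)]
    coverages = [20.0] * 10 + [80.0] * 10
    print(bin_scaffolds(scaffolds, coverages))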

    MaxBin was successfully tested on samples from the Human Microbiome Project and from green waste compost. In these tests, which were carried out by Yung-Tsu Tang, a student intern from the City College of San Francisco, MaxBin proved to be highly accurate in its ability to recover individual genomes from metagenomic datasets with variable sequencing coverages.

    “Applying MaxBin to an enriched cellulolytic consortia enabled us to identify a number of uncultivated cellulolytic bacteria, including a myxobacterium that possesses a remarkably reduced genome and expanded set of genes for biomass deconstruction compared to its closest sequenced relatives,” Singer says. “This demonstrates that the processes required for recovering genomes from metagenomic datasets can be applied to understanding biomass breakdown in the environment.”

    MaxBin is now being used at JBEI in its efforts to use microbes for the production of advanced biofuels – gasoline, diesel and jet fuel – from plant biomass. MaxBin is also available for downloading. To date, more than 150 researchers have accessed it.

    A paper describing MaxBin in detail has been published in the journal Microbiome. The paper is titled MaxBin: an automated binning method to recover individual genomes from metagenomes using an expectation-maximization algorithm. Co-authoring this paper, in addition to Singer, Wu and Tang, were Susannah Tringe of the Joint Genome Institute and Blake Simmons of JBEI.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 3:41 pm on September 27, 2014 Permalink | Reply
    Tags: CO2 studies, Lawrence Berkeley National laboratory

    From LBL: “Pore models track reactions in underground carbon capture” 


    September 25, 2014

    Using tailor-made software running on top-tier supercomputers, a Lawrence Berkeley National Laboratory team is creating microscopic pore-scale simulations that complement or push beyond laboratory findings.

    Computed pH on calcite grains at 1 micron resolution. The iridescent grains mimic crushed material geoscientists extract from saline aquifers deep underground to study with microscopes. Researchers want to model what happens to the crystals’ geochemistry when the greenhouse gas carbon dioxide is injected underground for sequestration. Image courtesy of David Trebotich, Lawrence Berkeley National Laboratory.

    The models of microscopic underground pores could help scientists evaluate ways to store carbon dioxide produced by power plants, keeping it from contributing to global climate change.

    The models could be a first, says David Trebotich, the project’s principal investigator. “I’m not aware of any other group that can do this, not at the scale at which we are doing it, both in size and computational resources, as well as the geochemistry.” His evidence is a colorful portrayal of jumbled calcite crystals derived solely from mathematical equations.

    The iridescent menagerie is intended to act just like the real thing: minerals geoscientists extract from saline aquifers deep underground. The goal is to learn what will happen when fluids pass through the material should power plants inject carbon dioxide underground.

    Lab experiments can only measure what enters and exits the model system. Now modelers would like to identify more of what happens within the tiny pores that exist in underground materials, as chemicals are dissolved in some places but precipitate in others, potentially resulting in preferential flow paths or even clogs.

    Geoscientists give Trebotich’s group of modelers microscopic computerized tomography (CT, similar to the scans done in hospitals) images of their field samples. That lets both camps probe an anomaly: reactions in the tiny pores happen much more slowly in real aquifers than they do in laboratories.

    Going deep

    Deep saline aquifers are underground formations of salty water found in sedimentary basins all over the planet. Scientists think they’re the best deep geological feature to store carbon dioxide from power plants.

    But experts need to know whether the greenhouse gas will stay bottled up as more and more of it is injected, spreading a fluid plume and building up pressure. “If it’s not going to stay there, (geoscientists) will want to know where it is going to go and how long that is going to take,” says Trebotich, who is a computational scientist in Berkeley Lab’s Applied Numerical Algorithms Group.

    He hopes their simulation results ultimately will translate to field scale, where “you’re going to be able to model a CO2 plume over a hundred years’ time and kilometers in distance.” But for now his group’s focus is at the microscale, with attention toward the even smaller nanoscale.

    At such tiny dimensions, flow, chemical transport, mineral dissolution and mineral precipitation occur within the pores where individual grains and fluids commingle, says a 2013 paper Trebotich coauthored with geoscientists Carl Steefel (also of Berkeley Lab) and Sergi Molins in the journal Reviews in Mineralogy and Geochemistry.

    These dynamics, the paper added, create uneven conditions that can produce new structures and self-organized materials – nonlinear behavior that can be hard to describe mathematically.

    Modeling at 1 micron resolution, his group has achieved “the largest pore-scale reactive flow simulation ever attempted” as well as “the first-ever large-scale simulation of pore-scale reactive transport processes on real-pore-space geometry as obtained from experimental data,” says the 2012 annual report of the lab’s National Energy Research Scientific Computing Center (NERSC).

    The simulation required about 20 million processor hours using 49,152 of the 153,216 computing cores in Hopper, a Cray XE6 that at the time was NERSC’s flagship supercomputer.

    Cray Hopper at NERSC

    “As CO2 is pumped underground, it can react chemically with underground minerals and brine in various ways, sometimes resulting in mineral dissolution and precipitation, which can change the porous structure of the aquifer,” the NERSC report says. “But predicting these changes is difficult because these processes take place at the pore scale and cannot be calculated using macroscopic models.

    “The dissolution rates of many minerals have been found to be slower in the field than those measured in the laboratory. Understanding this discrepancy requires modeling the pore-scale interactions between reaction and transport processes, then scaling them up to reservoir dimensions. The new high-resolution model demonstrated that the mineral dissolution rate depends on the pore structure of the aquifer.”

    Trebotich says “it was the hardest problem that we could do for the first run.” But the group redid the simulation about 2½ times faster in an early trial of Edison, a Cray XC-30 that succeeded Hopper. Edison, Trebotich says, has larger memory bandwidth.

    Cray Edison at NERSC

    Rapid changes

    Generating 1-terabyte data sets for each microsecond time step, the Edison run demonstrated how quickly conditions can change inside each pore. It also provided a good workout for the combination of interrelated software packages the Trebotich team uses.

    The first, Chombo, takes its name from a Swahili word meaning “toolbox” or “container” and was developed by a different Applied Numerical Algorithms Group team. Chombo is a supercomputer-friendly platform that’s scalable: “You can run it on multiple processor cores, and scale it up to do high-resolution, large-scale simulations,” he says.

    Trebotich modified Chombo to add flow and reactive transport solvers. The group also incorporated the geochemistry components of CrunchFlow, a package Steefel developed, to create Chombo-Crunch, the code used for their modeling work. The simulations produce resolutions “very close to imaging experiments,” the NERSC report said, combining simulation and experiment to achieve a key goal of the Department of Energy’s Energy Frontier Research Center for Nanoscale Control of Geologic CO2.
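    To give a feel for the kind of update a reactive transport solver performs at every cell and time step, here is a deliberately simplified one-dimensional advection-diffusion-reaction sketch in Python. It is not Chombo-Crunch, which solves the full three-dimensional problem with real calcite geochemistry on adaptive meshes; the grid spacing, flow velocity, diffusivity and dissolution rate are arbitrary illustrative values.

    # Generic 1-D advection-diffusion-reaction step, for illustration only.
    import numpy as np

    def step(conc, velocity, diff, rate, c_eq, dx, dt):
        """Advance the solute concentration one explicit time step: upwind
        advection, central-difference diffusion, and a first-order dissolution
        term that relaxes the fluid toward an equilibrium concentration c_eq."""
        new = conc.copy()
        adv = -velocity * (conc[1:-1] - conc[:-2]) / dx
        dif = diff * (conc[2:] - 2.0 * conc[1:-1] + conc[:-2]) / dx**2
        rxn = rate * (c_eq - conc[1:-1])
        new[1:-1] = conc[1:-1] + dt * (adv + dif + rxn)
        new[-1] = new[-2]                      # simple outflow boundary
        return new

    nx, dx, dt = 200, 1.0e-6, 1.0e-4           # 1-micron cells, 0.1-ms steps
    c = np.zeros(nx)                           # solute starts at zero
    c[0] = 1.0                                 # fixed inflow concentration
    for _ in range(500):
        c = step(c, velocity=1.0e-4, diff=1.0e-9, rate=0.5, c_eq=0.2, dx=dx, dt=dt)
    print("outlet concentration after 0.05 s:", round(float(c[-1]), 4))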

    Now Trebotich’s team has three huge allocations on DOE supercomputers to make their simulations even more detailed. The Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program is providing 80 million processor hours on Mira, an IBM Blue Gene/Q at Argonne National Laboratory. Through the Advanced Scientific Computing Research Leadership Computing Challenge (ALCC), the group has another 50 million hours on NERSC computers and 50 million on Titan, a Cray XK7 at the Oak Ridge Leadership Computing Facility. The team also held an ALCC award last year for 80 million hours at Argonne and 25 million at NERSC.

    MIRA at Argonne

    TITAN at Oak Ridge

    With the computer time, the group wants to refine their image resolutions to half a micron (half of a millionth of a meter). “This is what’s known as the mesoscale: an intermediate scale that could make it possible to incorporate atomistic-scale processes involving mineral growth at precipitation sites into the pore scale flow and transport dynamics,” Trebotich says.

    Meanwhile, he thinks their micron-scale simulations already are good enough to provide “ground-truthing” in themselves for the lab experiments geoscientists do.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 1:49 pm on September 3, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory, Peptoids

    From LBL: “Peptoid Nanosheets at the Oil/Water Interface” 


    September 3, 2014
    Lynn Yarris (510) 486-5375

    From the people who brought us peptoid nanosheets that form at the interface between air and water, now come peptoid nanosheets that form at the interface between oil and water. Scientists at the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have developed peptoid nanosheets – two-dimensional biomimetic materials with customizable properties – that self-assemble at an oil-water interface. This new development opens the door to designing peptoid nanosheets of increasing structural complexity and chemical functionality for a broad range of applications, including improved chemical sensors and separators, and safer, more effective drug delivery vehicles.

    “Supramolecular assembly at an oil-water interface is an effective way to produce 2D nanomaterials from peptoids because that interface helps pre-organize the peptoid chains to facilitate their self-interaction,” says Ron Zuckermann, a senior scientist at the Molecular Foundry, a DOE nanoscience center hosted at Berkeley Lab. “This increased understanding of the peptoid assembly mechanism should enable us to scale up to produce large quantities, or scale down to screen many different nanosheets for novel functions.”

    Peptoid nanosheets are among the largest and thinnest free-floating organic crystals ever made, with an area-to-thickness equivalent of a plastic sheet covering a football field. Peptoid nanosheets can be engineered to carry out a wide variety of functions.

    Ron Zuckermann and Geraldine Richmond led the development of peptoid nanosheets that form at the interface between oil and water, opening the door to increased structural complexity and chemical functionality for a broad range of applications.

    Zuckermann, who directs the Molecular Foundry’s Biological Nanostructures Facility, and Geraldine Richmond of the University of Oregon are the corresponding authors of a paper reporting these results in the Proceedings of the National Academy of Sciences (PNAS). The paper is titled Assembly and molecular order of two-dimensional peptoid nanosheets at the oil-water interface. Co-authors are Ellen Robertson, Gloria Olivier, Menglu Qian and Caroline Proulx.

    Peptoids are synthetic versions of proteins. Like their natural counterparts, peptoids fold and twist into distinct conformations that enable them to carry out a wide variety of specific functions. In 2010, Zuckermann and his group at the Molecular Foundry discovered a technique to synthesize peptoids into sheets that were just a few nanometers thick but up to 100 micrometers in length. These were among the largest and thinnest free-floating organic crystals ever made, with an area-to-thickness equivalent of a plastic sheet covering a football field. Just as the properties of peptoids can be chemically customized through robotic synthesis, the properties of peptoid nanosheets can also be engineered for specific functions.
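    A quick back-of-the-envelope check of that football-field comparison, with the nanosheet thickness and the field length taken as assumed round numbers:

    # Rough sanity check of the "plastic sheet covering a football field" analogy.
    sheet_thickness_m = 3.0e-9       # assumed: a few nanometers thick
    sheet_length_m = 100.0e-6        # up to 100 micrometers across (from the text)
    field_length_m = 100.0           # assumed round number for a football field

    aspect_ratio = sheet_length_m / sheet_thickness_m
    equivalent_sheet_m = field_length_m / aspect_ratio
    print(f"length-to-thickness ratio: {aspect_ratio:,.0f}")
    print(f"equivalent plastic sheet: ~{equivalent_sheet_m * 1e3:.0f} mm thick over the field")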

    “Peptoid nanosheet properties can be tailored with great precision,” Zuckermann says, “and since peptoids are less vulnerable to chemical or metabolic breakdown than proteins, they are a highly promising platform for self-assembling bio-inspired nanomaterials.”

    In this latest effort, Zuckermann, Richmond and their co-authors used vibrational sum frequency spectroscopy to probe the molecular interactions between the peptoids as they assembled at the oil-water interface. These measurements revealed that peptoid polymers adsorbed to the interface are highly ordered, and that this order is greatly influenced by interactions between neighboring molecules.

    “We can literally see the polymer chains become more organized the closer they get to one another,” Zuckermann says.

    Peptoid polymers adsorbed to the oil-water interface are highly ordered thanks to interactions between neighboring molecules.

    The substitution of oil in place of air creates a raft of new opportunities for the engineering and production of peptoid nanosheets. For example, the oil phase could contain chemical reagents, serve to minimize evaporation of the aqueous phase, or enable microfluidic production.

    “The production of peptoid nanosheets in microfluidic devices means that we should soon be able to make combinatorial libraries of different functionalized nanosheets and screen them on a very small scale,” Zuckermann says. “This would be advantageous in the search for peptoid nanosheets with the molecular recognition and catalytic functions of proteins.”

    Zuckermann and his group at the Molecular Foundry are now investigating the addition of chemical reagents or cargo to the oil phase, and exploring their interactions with the peptoid monolayers that form during the nanosheet assembly process.

    “In the future we may be able to produce nanosheets with drugs, dyes, nanoparticles or other solutes trapped in the interior,” he says. “These new nanosheets could have a host of interesting biomedical, mechanical and optical properties.”

    This work was primarily funded by the DOE Office of Science and the Defense Threat Reduction Agency. Part of the research was performed at the Molecular Foundry and the Advanced Light Source, which are DOE Office of Science User Facilities.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 1:41 pm on August 29, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory

    From LBL: “Going to Extremes for Enzymes” 


    August 29, 2014
    Lynn Yarris (510) 486-5375

    In the age-old nature versus nurture debate, Douglas Clark, a faculty scientist with Berkeley Lab and the University of California (UC) Berkeley, is not taking sides. In the search for enzymes that can break lignocellulose down into biofuel sugars under the extreme conditions of a refinery, he has prospected for extremophilic microbes and engineered his own cellulases.

    Extremophiles thriving in thermal springs where the water temperature can be close to boiling can be a rich source of enzymes for the deconstruction of lignocellulose.

    Speaking at the national meeting of the American Chemical Society (ACS) in San Francisco, Clark discussed research for the Energy Biosciences Institute (EBI) in which he and his collaborators are investigating ways to release plant sugars from lignin for the production of liquid transportation fuels. Sugars can be fermented into fuels once the woody matter comprised of cellulose, hemicellulose, and lignin is broken down, but lignocellulose is naturally recalcitrant.

    “Lignocellulose is designed by nature to stand tall and resist being broken down, and lignin in particular acts like a molecular glue to help hold it together,” said Clark, who holds appointments with Berkeley Lab’s Physical Biosciences Division and UC Berkeley’s Chemical and Biomolecular Engineering Department, where he currently serves as dean of the College of Chemistry. “Consequently, lignocellulosic biomass must undergo either chemical or enzymatic deconstruction to release the sugars that can be fermented to biofuels.”

    Douglas Clark holds joint appointments with Berkeley Lab and UC Berkeley and is a principal investigator with the Energy Biosciences Institute. (Photo by Roy Kaltschmidt)

    For various chemical reasons, all of which add up to cost-competitiveness, biorefineries could benefit if the production of biofuels from lignocellulosic biomass is carried out at temperatures between 65 and 70 degrees Celsius. The search by Clark and his EBI colleagues for cellulases that can tolerate these and even harsher conditions led them to thermal springs near Gerlach, Nevada, where the water temperature can be close to boiling. There they discovered a consortium of three hyperthermophilic Archaea that could grow on crystalline cellulose at 90 degrees Celsius.

    “This consortium represents the first instance of Archaea able to deconstruct lignocellulose optimally above 90°C,” Clark said.

    Following metagenomic studies on the consortium, the most active high-temperature cellulase was identified and named EBI-244.

    “The EBI-244 cellulase is active at temperatures as high as 108 degrees Celsius, the most extremely heat-tolerant enzyme ever found in any cellulose-digesting microbe,” Clark said.

    The most recent expedition of Clark and his colleagues was to thermal hot springs in Lassen Volcanic National Park, where they found an enzyme active on cellulose up to 100°C under highly acidic conditions – pH approximately 2.2.

    “The Lassen enzyme is the most acidothermophilic cellulase yet discovered,” Clark said. “The final products that it forms are similar to those produced by EBI-244.”

    A consortium of three hyperthermophilic Archaea that could grow on crystalline cellulose at 90 degrees Celsius yielded EBI-244, the most active high-temperature cellulase ever identified.

    In addition to bioprospecting for heat tolerant enzymes, Clark and his colleagues have developed a simple and effective mutagenesis method to enhance the properties of natural enzymes. Most recently they used this technique to increase the optimal temperature and enhance the thermostability of Cel7A, a fungal cellulase that is present in high concentrations in commercial cellulase cocktails. They engineered yeast to produce this enzyme with encouraging results.

    “The yeast Saccharomyces cerevisiae has often been used both in the engineering and basic study of Cel7A; however, Cel7A enzymes recombinantly expressed in yeast are often less active and less stable than their native counterparts,” Clark said. “We discovered that an important post-translational modification that was sometimes absent in the yeast-expressed enzyme was the underlying cause of this disparity and successfully carried out the post-translational modification in vitro. After this treatment, the properties of Cel7A recombinantly expressed in yeast were improved to match those of the native enzyme.”

    Collaborators in this research include Harvey Blanch, who also holds joint appointments with Berkeley Lab and UC Berkeley, and Frank Robb from the University of Maryland.

    EBI, which provided the funding for this research, is a collaborative partnership between BP, the funding agency, UC Berkeley, Berkeley Lab and the University of Illinois at Urbana-Champaign.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 8:06 pm on August 27, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory

    From Berkeley Lab: “Encyclopedia of How Genomes Function Gets Much Bigger” 


    August 27, 2014
    Dan Krotz 510-486-4019

    A big step in understanding the mysteries of the human genome was unveiled today in the form of three analyses that provide the most detailed comparison yet of how the genomes of the fruit fly, roundworm, and human function.

    The research, appearing August 28 in the journal Nature, compares how the information encoded in the three species’ genomes is “read out,” and how their DNA and proteins are organized into chromosomes.

    The results add billions of entries to a publicly available archive of functional genomic data. Scientists can use this resource to discover common features that apply to all organisms. These fundamental principles will likely offer insights into how the information in the human genome regulates development, and how it is responsible for diseases.

    Berkeley Lab scientists contributed to an NHGRI effort that provides the most detailed comparison yet of how the genomes of the fruit fly, roundworm, and human function. (Credit: Darryl Leja, NHGRI)

    The analyses were conducted by two consortia of scientists that include researchers from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab). Both efforts were funded by the National Institutes of Health’s National Human Genome Research Institute.

    One of the consortia, the “model organism Encyclopedia of DNA Elements” (modENCODE) project, catalogued the functional genomic elements in the fruit fly and roundworm. Susan Celniker and Gary Karpen of Berkeley Lab’s Life Sciences Division led two fruit fly research groups in this consortium. Ben Brown, also with the Life Sciences Division, participated in another consortium, ENCODE, to identify the functional elements in the human genome.

    The consortia are addressing one of the big questions in biology today: now that the human genome and many other genomes have been sequenced, how does the information encoded in an organism’s genome make an organism what it is? To find out, scientists have for the past several years studied the genomes of model organisms such as the fruit fly and roundworm, which are smaller than the human genome yet share many genes and biological pathways with humans. This research has led to a better understanding of human gene function, development, and disease.

    Comparing Transcriptomes

    In all organisms, the information encoded in genomes is transcribed into RNA molecules that are either translated into proteins, or utilized to perform functions in the cell. The collection of RNA molecules expressed in a cell is known as its transcriptome, which can be thought of as the “read out” of the genome.

    In the research announced today, dozens of scientists from several institutions looked for similarities and differences in the transcriptomes of human, roundworm, and fruit fly. They used deep sequencing technology and bioinformatics to generate large amounts of matched RNA-sequencing data for the three species. This involved 575 experiments that produced more than 67 billion sequence reads.

    A team led by Celniker, with help from Brown and scientists from several other labs, conducted the fruit fly portion of this research. They mapped the organism’s transcriptome at 30 time points of its development. They also explored how environmental perturbations such as heavy metals, herbicides, caffeine, alcohol and temperature affect the fly’s transcriptome. The result is the finest time-resolution analysis of the fly genome’s “read out” to date—and a mountain of new data.

    “We went from two billion reads in research we published in 2011, to 20 billion reads today,” says Celniker. “As a result, we found that the transcriptome is much more extensive and complex than previously thought. It has more long non-coding RNAs and more promoters.”

    When the scientists compared transcriptome data from all three species, they discovered 16 gene-expression modules corresponding to processes such as transcription and cell division that are conserved in the three animals. They also found a similar pattern of gene expression at an early stage of embryonic development in all three organisms.
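    As a cartoon of how conserved co-expression modules can be pulled out of matched expression data (a simplified stand-in for the consortium’s actual pipeline), the Python sketch below standardizes per-species expression profiles of orthologous genes, concatenates them, and clusters genes whose profiles track together in all three species. The random matrices, the choice of k-means and the module count of 16 are assumptions for illustration.

    # Toy cross-species co-expression module detection; data are random placeholders.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    n_genes, n_stages = 300, 12

    # Placeholder expression matrices (genes x developmental stages); row i is
    # the same orthologous gene in fly, worm and human.
    fly = rng.normal(size=(n_genes, n_stages))
    worm = fly + rng.normal(scale=0.3, size=fly.shape)     # partly conserved
    human = fly + rng.normal(scale=0.3, size=fly.shape)

    def zscore(m):
        """Standardize each gene's profile so clustering compares shapes."""
        return (m - m.mean(axis=1, keepdims=True)) / m.std(axis=1, keepdims=True)

    # Concatenating the standardized profiles favors genes that behave
    # consistently across all three species.
    profiles = np.hstack([zscore(fly), zscore(worm), zscore(human)])
    modules = KMeans(n_clusters=16, n_init=10, random_state=0).fit_predict(profiles)
    print("genes per module:", np.bincount(modules))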

    This work is described in a Nature article entitled “Comparative analysis of the transcriptome across distant species.”

    Comparing Chromatin

    Another group, also consisting of dozens of scientists from several institutions, analyzed chromatin, which is the combination of DNA and proteins that organize an organism’s genome into chromosomes. Chromatin influences nearly every aspect of genome function.

    Karpen led the fruit fly portion of this work, with Harvard Medical School’s Peter Park contributing on the bioinformatics side, and scientists from several other labs also participating. The team mapped the distribution of chromatin proteins in the fruit fly genome. They also learned how chemical modifications to chromatin proteins impact genome functions.

    Their results were compared with results from human and roundworm chromatin research. In all, the group generated 800 new chromatin datasets from different cell lines and developmental stages of the three species, bringing the total number of datasets to more than 1400. These datasets are presented in a Nature article entitled “Comparative analysis of metazoan chromatin organization.”

    Here again, the scientists found many conserved chromatin features among the three organisms. They also found significant differences, such as in the composition and locations of repressive chromatin.

    But perhaps the biggest scientific dividend is the data itself.

    “We found many insights that need follow-up,” says Karpen. “And we’ve also greatly increased the amount of data that others can access. These datasets and analyses will provide a rich resource for comparative and species-specific investigations of how genomes, including the human genome, function.”

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 1:46 pm on August 26, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory

    From Berkeley Lab: “Competition for Graphene” 


    August 26, 2014
    Lynn Yarris (510) 486-5375

    A new argument has just been added to the growing case for graphene being bumped off its pedestal as the next big thing in the high-tech world by the two-dimensional semiconductors known as MX2 materials. An international collaboration of researchers led by a scientist with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) has reported the first experimental observation of ultrafast charge transfer in photo-excited MX2 materials. The recorded charge transfer time clocked in at under 50 femtoseconds, comparable to the fastest times recorded for organic photovoltaics.

    “We’ve demonstrated, for the first time, efficient charge transfer in MX2 heterostructures through combined photoluminescence mapping and transient absorption measurements,” says Feng Wang, a condensed matter physicist with Berkeley Lab’s Materials Sciences Division and the University of California (UC) Berkeley’s Physics Department. “Having quantitatively determined charge transfer time to be less than 50 femtoseconds, our study suggests that MX2 heterostructures, with their remarkable electrical and optical properties and the rapid development of large-area synthesis, hold great promise for future photonic and optoelectronic applications.”
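    To illustrate how a sub-50-femtosecond time constant can be extracted from pump-probe data in principle (a toy version only, not the analysis in the paper), the sketch below fits a single-exponential rise to a synthetic transient-absorption delay scan; the 45 fs “true” value, the noise level and the single-exponential model are all assumptions.

    # Toy transient-absorption fit: recover a charge-transfer time constant from
    # synthetic pump-probe data.  Values are placeholders for illustration.
    import numpy as np
    from scipy.optimize import curve_fit

    def rise(t, amplitude, tau):
        """Signal that turns on at t=0 and rises with time constant tau (fs)."""
        return amplitude * (1.0 - np.exp(-np.clip(t, 0.0, None) / tau))

    rng = np.random.default_rng(0)
    t_fs = np.linspace(-100, 500, 121)            # pump-probe delays in fs
    true_tau = 45.0                               # assumed transfer time
    signal = rise(t_fs, 1.0, true_tau) + rng.normal(scale=0.03, size=t_fs.size)

    popt, _ = curve_fit(rise, t_fs, signal, p0=[1.0, 100.0])
    print(f"fitted charge-transfer time: {popt[1]:.0f} fs")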

    Feng Wang is a condensed matter physicist with Berkeley Lab’s Materials Sciences Division and UC Berkeley’s Physics Department. (Photo by Roy Kaltschmidt)

    Wang is the corresponding author of a paper in Nature Nanotechnology describing this research. The paper is titled Ultrafast charge transfer in atomically thin MoS2/WS2 heterostructures. Co-authors are Xiaoping Hong, Jonghwan Kim, Su-Fei Shi, Yu Zhang, Chenhao Jin, Yinghui Sun, Sefaattin Tongay, Junqiao Wu and Yanfeng Zhang.

    MX2 monolayers consist of a single layer of transition metal atoms, such as molybdenum (Mo) or tungsten (W), sandwiched between two layers of chalcogen atoms, such as sulfur (S). The resulting heterostructure is bound by the relatively weak intermolecular attraction known as the van der Waals force. These 2D semiconductors feature the same hexagonal “honeycombed” structure as graphene and superfast electrical conductance, but, unlike graphene, they have natural energy band-gaps. This facilitates their application in transistors and other electronic devices because, unlike graphene, their electrical conductance can be switched off.

    “Combining different MX2 layers together allows one to control their physical properties,” says Wang, who is also an investigator with the Kavli Energy NanoSciences Institute (Kavli-ENSI). “For example, the combination of MoS2 and WS2 forms a type-II semiconductor that enables fast charge separation. The separation of photoexcited electrons and holes is essential for driving an electrical current in a photodetector or solar cell.”

    In demonstrating the ultrafast charge separation capabilities of atomically thin samples of MoS2/WS2 heterostructures, Wang and his collaborators have opened up potentially rich new avenues, not only for photonics and optoelectronics, but also for photovoltaics.

    Photoluminescence mapping of a MoS2/WS2 heterostructure with the color scale representing photoluminescence intensity shows strong quenching of the MoS2 photoluminescence. (Image courtesy of Feng Wang group)

    “MX2 semiconductors have extremely strong optical absorption properties and compared with organic photovoltaic materials, have a crystalline structure and better electrical transport properties,” Wang says. “Factor in a femtosecond charge transfer rate and MX2 semiconductors provide an ideal way to spatially separate electrons and holes for electrical collection and utilization.”

    Wang and his colleagues are studying the microscopic origins of charge transfer in MX2 heterostructures and the variation in charge transfer rates between different MX2 materials.

    “We’re also interested in controlling the charge transfer process with external electrical fields as a means of utilizing MX2 heterostructures in photovoltaic devices,” Wang says.

    This research was supported by an Early Career Research Award from the DOE Office of Science through UC Berkeley, and by funding agencies in China through the Peking University in Beijing.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 12:30 pm on August 22, 2014 Permalink | Reply
    Tags: Lawrence Berkeley National laboratory

    From Berkeley Lab: “Shaping the Future of Nanocrystals” 


    August 21, 2014
    Lynn Yarris

    The first direct observations of how facets form and develop on platinum nanocubes point the way towards more sophisticated and effective nanocrystal design and reveal that a nearly 150-year-old scientific law describing crystal growth breaks down at the nanoscale.

    Researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) used highly sophisticated transmission electron microscopes and an advanced high-resolution, fast-detection camera to capture the physical mechanisms that control the evolution of facets – flat faces – on the surfaces of platinum nanocubes formed in liquids. Understanding how facets develop on a nanocrystal is critical to controlling the crystal’s geometric shape, which in turn is critical to controlling the crystal’s chemical and electronic properties.

    “For years, predictions of the equilibrium shape of a nanocrystal have been based on the surface energy minimization proposal by Josiah Willard Gibbs in the 1870s to describe the equilibrium shape of a water droplet,” says Haimei Zheng, a staff scientist in Berkeley Lab’s Materials Sciences Division who led this study. “For nanocrystals, the idea is that during crystal growth, high-energy facets will grow at a higher rate than low-energy facets and eventually disappear, resulting in a nanocrystal whose shape is configured to minimize surface energy.”

    The research of Zheng and her collaborators showed that at the molecular level, the geometric shape of nanocrystals during synthesis in solution is actually driven by differences in the mobility of ligands across the surfaces of different facets.

    “By choosing ligands that selectively bind on the facets, we should be able to control the shape of the nanocrystal as it grows,” she says. “This would provide a new way to design nanomaterials for advanced applications, including nanostructures for bio-imaging, catalysts for solar conversion, and energy storage.”

    Haimei Zheng and Hong-Gang Liao used TEMs at the National Center for Electron Microscopy and a K2-IS camera to record the first direct observations of facet formation in platinum nanocubes. (Photo by Kelly Owen)

    Zheng is the corresponding author of a paper in Science titled Facet Development During Platinum Nanocube Growth. Hong-Gang Liao is the lead author. Co-authors are Danylo Zherebetskyy, Huolin Xin, Cory Czarnik, Peter Ercius, Hans Elmlund, Ming Pan and Lin-Wang Wang.

    The performance of nanocrystals in such surface-enhanced applications as catalysis, sensing and photo-optics is strongly influenced by shape. While significant advances have been made in the synthesis of nanocrystals featuring a variety of shapes – cube, octahedron, tetrahedron, decahedron, icosahedron, etc. – controlling these shapes is often difficult and unpredictable.

    “A major roadblock has been that the atomic pathways of facet development in nanocrystals are mostly unknown due to the lack of direct observation,” Zheng says. “It has been assumed that commonly used surfactants modify the energy of specific facets through preferential adsorption, thereby influencing the relative growth rate of different facets and the shape of the final nanocrystal. However, this assumption was based on post-reaction characterizations that did not account for how facet dynamics evolve during crystal growth.”

    As a crystal undergoes growth, its constituent atoms or molecules fan out along specific directional planes whose coordinates are denoted by a three-digit system called the Miller Index. Facets form when the surfaces along different planes grow at different rates. Three of the most critical facets for determining a crystal’s geometric shape are the so-called “low index facets,” which are designated under the Miller Index as {100}, {110} and {111}.

    Berkeley Lab researchers found that differences in ligand mobility during crystallization cause the low index facets – {100}, {110} and {111} – to stop growing at different times, resulting in the crystal’s final cubic shape. (Image courtesy of Haimei Zheng group)

    Working with platinum, one of the most effective industrial catalysts in use today, Zheng and her collaborators initiated the growth of nanocubes in a thin layer of liquid sandwiched between two silicon nitride membranes. This microfabricated liquid cell can encapsulate and maintain the liquid inside the high vacuum of a transmission electron microscope (TEM) for an extended period of time, enabling in situ observations of single nanoparticle growth trajectories.

    “With the liquid cells, we’re able to use TEMs to observe the growth of nanocrystals that remarkably resemble nanocrystals synthesized in flasks,” Zheng says. “We found that the growth rates of all low index facets are similar until the {100} facets stop growing. The {110} facets will continue to grow until they reach two neighboring {100} facets, at which point they form the edge of a cube whose corners will be filled in by the continued growth of {111} facets. The arrested growth of the {100} facets that triggers this process is determined by ligand mobility on the {100} facets, which is much lower than on the {110} and {111} facets.”
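    The growth-and-arrest sequence Zheng describes can be caricatured in a few lines of Python (a toy kinetic sketch, not the paper’s model): every facet advances at the same rate until ligand mobility on it effectively shuts growth off, and the facet family with the least mobile ligands, {100}, stops first. The rates, mobility values and arrest rule are invented for illustration.

    # Toy facet-arrest model: all facets grow at the same rate, but each stops
    # at a time proportional to its (assumed) relative ligand mobility.
    GROWTH_RATE = 1.0
    MOBILITY = {"{100}": 0.2, "{110}": 0.6, "{111}": 1.0}          # assumed
    ARREST_TIME = {f: 10.0 * m for f, m in MOBILITY.items()}       # assumed rule

    def facet_extent(facet, t):
        """Distance a facet has advanced by time t, given its arrest time."""
        return GROWTH_RATE * min(t, ARREST_TIME[facet])

    for t in (1.0, 3.0, 6.0, 12.0):
        row = "  ".join(f"{f}: {facet_extent(f, t):4.1f}" for f in MOBILITY)
        print(f"t = {t:4.1f}   {row}")
    # {100} stops first (t = 2 here), so continued growth of {110} and {111}
    # squares the particle off into a cube bounded by {100} faces.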

    For their observations, Zheng and her collaborators were able to use several of the TEMs at Berkeley Lab’s National Center for Electron Microscopy (NCEM), a DOE Office of Science user facility, including the TEAM 0.5 instrument, the world’s most powerful TEM. In addition, they were able to use a K2-IS camera from Gatan, Inc., which can capture electron images directly onto a CMOS sensor at 400 frames per second (fps) with 2K-by-2K pixel resolution.

    “The K2-IS camera can also be configured to capture images at up to 1600 fps with appropriate scaling of the field of view, which is critical for observing particles that are moving dynamically in the field of view,” says lead author Liao, a member of Zheng’s research group. “The elimination of the traditional scintillation process during image detection results in significant improvement in both sensitivity and resolution. High resolution imaging is also facilitated by the thin silicon nitride membranes of our liquid cell window, which is about 10 nanometers thick per membrane.”

    The lower ligand mobility and arrested growth of selected facets experimentally observed by Zheng and Liao, were supported by ab initio calculations carried out under the leadership of co-author Wang, a senior scientist with the Materials Sciences Division who heads the Computational Material Science and Nano Science group.

    “At first, we thought the continued growth in the {111} direction might be a result of higher surface energy on the {111} plane,” says co-author Zherebetskyy, a member of Wang’s group. “The experimental observations forced us to consider alternative mechanisms, and our calculations show that the relatively low energy barrier on the {111} plane allows the ligand molecules on that plane to be very mobile.”

    Says Wang, “Our collaboration with Haimei Zheng’s group showcases how ab initio calculations can be combined with experimental observations to shed new light on hidden molecular processes.”

    Zheng and her group are now determining whether the facet-dependent ligand mobility that shaped platinum’s cube-shaped nanocrystals also applies to ligands in other nanomaterials and to the formation of nanocrystals with other geometric shapes.

    This research was supported by the DOE Office of Science.

    See the full article here.

     
  • richardmitnick 4:09 pm on August 21, 2014 Permalink | Reply
    Tags: , Lawrence Berkeley National laboratory, ,   

    From Berkeley Lab: “Researchers Map Quantum Vortices Inside Superfluid Helium Nanodroplets” 

    Berkeley Logo

    Berkeley Lab

    August 21, 2014
    Kate Greene

    Scientists have, for the first time, characterized so-called quantum vortices that swirl within tiny droplets of liquid helium. The research, led by scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), the University of Southern California, and SLAC National Accelerator Laboratory, confirms that helium nanodroplets are in fact the smallest possible superfluidic objects and opens new avenues for studying quantum rotation.

    “The observation of quantum vortices is one of the most clear and unique demonstrations of the quantum properties of these microscopic objects,” says Oliver Gessner, senior scientist in the Chemical Sciences Division at Berkeley Lab. Gessner and colleagues, Andrey Vilesov of the University of Southern California and Christoph Bostedt of SLAC National Accelerator Laboratory at Stanford, led the multi-facility and multi-university team that published the work this week in Science.

    Droplet_art_fin3drop
    Illustration of the analysis of superfluid helium nanodroplets. Droplets are emitted via a cooled nozzle (upper right) and probed with x-rays from the free-electron laser. The multicolored pattern (upper left) represents a diffraction pattern that reveals the shape of a droplet and the presence of quantum vortices such as those represented in the turquoise circle with swirls (bottom center). Credit: Felix P. Sturm and Daniel S. Slaughter, Berkeley Lab.

    The finding could have implications for other liquid or gas systems that contain vortices, says USC’s Vilesov. “The quest for quantum vortices in superfluid droplets has stretched for decades,” he says. “But this is the first time they have been seen in superfluid droplets.”

    Superfluid helium has captured scientists’ imaginations since its discovery in the 1930s. Unlike normal fluids, superfluids have no viscosity, a feature that leads to strange and sometimes unexpected properties, such as crawling up the walls of containers or dripping through barriers that contained the liquid before it became a superfluid.

    Helium superfluidity can be achieved when helium is cooled to near absolute zero (zero kelvin or about -460 degrees F). At this temperature, the atoms within the liquid no longer vibrate with heat energy and instead settle into a calm state in which all atoms act together in unison, as if they were a single particle.

    For decades, researchers have known that when superfluid helium is rotated–in a little spinning bucket, say–the rotation produces quantum vortices, swirls that are regularly spaced throughout the liquid. But the question remained whether anyone could see this behavior in an isolated, nanoscale droplet. If the swirls were there, it would confirm that helium nanodroplets, which can range in size from tens of nanometers to microns, are indeed superfluid throughout and that the motion of the entire liquid drop is that of a single quantum object rather than a mixture of independent particles.
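
    For scale, the textbook Feynman relation for a uniformly rotating superfluid predicts an areal vortex density of n_v = 2Ω/κ, where κ = h/m(He-4) ≈ 1.0 x 10^-7 m²/s is the quantum of circulation. The short sketch below (my own illustration, with an assumed rotation rate and droplet radius rather than values from the paper) shows the kind of numbers involved.

    # Rough illustration of the Feynman relation; the rotation rate and droplet
    # radius are assumed example values, not results from this study.
    import math

    h = 6.626e-34            # Planck constant, J*s
    m_he4 = 6.646e-27        # mass of a helium-4 atom, kg
    kappa = h / m_he4        # quantum of circulation, ~1.0e-7 m^2/s

    omega = 1.0e7            # assumed rotation rate, rad/s
    radius = 1.0e-6          # assumed droplet radius, 1 micron

    n_v = 2 * omega / kappa                  # vortices per square metre
    n_droplet = n_v * math.pi * radius ** 2  # vortices threading the cross-section

    print(f"kappa = {kappa:.3e} m^2/s")
    print(f"vortex density = {n_v:.3e} / m^2 -> ~{n_droplet:.0f} vortices per droplet cross-section")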

    But measuring liquid flow in helium nanodroplets has proven to be a serious challenge. “The way these droplets are made is by passing helium through a tiny nozzle that is cryogenically cooled down to below 10 Kelvin,” says Gessner. “Then, the nanoscale droplets shoot through a vacuum chamber at almost 200 meters per second. They live for only a few milliseconds while traversing the experimental chamber, and then they’re gone. How do you show that these objects, which are all different from one another, have quantum vortices inside?”

    og
    Oliver Gessner, Chemical Sciences Division, Berkeley Lab. Credit: Roy Kaltschmidt

    The researchers turned to a facility at SLAC called the Linac Coherent Light Source (LCLS), a DOE Office of Science user facility that is the world’s first x-ray free-electron laser. This laser produces very short light pulses, lasting just a ten-trillionth of a second, which contain a huge number of high-energy photons. These intense x-ray pulses can effectively take snapshots of single, ultra-fast, ultra-small objects and phenomena.

    slac
    Inside the SLAC LCLS

    “With the new x-ray free-electron laser, we can now image phenomena and look at processes far beyond what we could imagine just a decade ago,” says Bostedt of SLAC. “Looking at the droplets gave us a beautiful glimpse into the quantum world. It really opens the door to fascinating science.”

    In the experiment, the researchers blasted a stream of helium nanodroplets across the x-ray laser beam inside a vacuum chamber; a detector caught the pattern that formed when the x-ray light diffracted off the drops.

    The diffraction patterns immediately revealed that many of the droplets were not spherical, as had previously been assumed. Instead, they were oblate. Just as the Earth’s rotation causes it to bulge at the equator, so too do rotating nanodroplets expand around the middle and flatten at the top and bottom.

    But the vortices themselves are invisible to x-ray diffraction, so the researchers used a trick of adding xenon atoms to the droplets. The xenon atoms get pulled into the vortices and cluster together.

    “It’s similar to pulling the plug in a bathtub and watching the kids’ toys gather in the vortex,” says Gessner. The xenon atoms diffract x-ray light much more strongly than the surrounding helium does, making the regular arrays of vortices inside the droplet visible. In this way, the researchers confirmed that vortices in nanodroplets behave like those found in larger amounts of rotating superfluid helium.

    Armed with this new information, the researchers were able to determine the rotational speed of the nanodroplets. They were surprised to find that the nanodroplets spin up to 100,000 times faster than any other superfluid helium sample ever studied in a laboratory.

    Moreover, while normal liquid drops will change shape as they spin faster and faster–to resemble a peanut or multi-lobed globule, for instance–the researchers saw no evidence of such shapeshifting in the helium nanodroplets. “Essentially, we’re exploring a new regime of quantum rotation with this matter,” Gessner says.

    “It’s a new kind of matter in a sense because it is a self-contained isolated superfluid,” he adds. “It’s just all by itself, held together by its own surface tension. It’s pretty perfect to study these systems if one wants to understand superfluidity and isolate it as much as possible.”

    This research was supported by the DOE Office of Science, Office of Basic Energy Sciences, Chemical Sciences, Geosciences and Biosciences Division as well as the National Science Foundation.

    See the full article here.

     