Tagged: LBNL

  • richardmitnick 10:59 am on June 19, 2018 Permalink | Reply
    Tags: LBNL, Searching Science Data

    From Lawrence Berkeley National Lab: “Berkeley Lab Researchers Use Machine Learning to Search Science Data” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    A screenshot of image-based results in the Science Search interface. In this case, the user performed an image search for nanoparticles. (Credit: Gonzalo Rodrigo/Berkeley Lab)

    As scientific datasets increase in both size and complexity, the ability to label, filter and search this deluge of information has become a laborious, time-consuming and sometimes impossible task without the help of automated tools.

    With this in mind, a team of researchers from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley is developing innovative machine-learning tools to pull contextual information from scientific datasets and automatically generate metadata tags for each file. Scientists can then search these files via Science Search, a web-based search engine for scientific data that the Berkeley team is building.

    As a proof of concept, the team is working with staff at Berkeley Lab’s Molecular Foundry to demonstrate the concepts of Science Search on the images captured by the facility’s instruments. A beta version of the platform has been made available to Foundry researchers.

    LBNL Molecular Foundry – No image credits found

    “A tool like Science Search has the potential to revolutionize our research,” says Colin Ophus, a Molecular Foundry research scientist within the National Center for Electron Microscopy (NCEM) and a Science Search collaborator. “We are a taxpayer-funded National User Facility, and we would like to make all of the data widely available, rather than the small number of images chosen for publication. However, today, most of the data that is collected here only really gets looked at by a handful of people—the data producers, including the PI (principal investigator), their postdocs or graduate students—because there is currently no easy way to sift through and share the data. By making this raw data easily searchable and shareable via the Internet, Science Search could open this reservoir of ‘dark data’ to all scientists and maximize our facility’s scientific impact.”

    The Challenges of Searching Science Data

    This screen capture of the Science Search interface shows how users can easily validate metadata tags that have been generated via machine learning, or add information that hasn’t already been captured. (Credit: Gonzalo Rodrigo/Berkeley Lab)

    Today, search engines are ubiquitously used to find information on the Internet, but searching science data presents a different set of challenges. For example, Google’s algorithm relies on more than 200 clues to achieve an effective search. These clues can come in the form of keywords on a webpage, metadata in images or audience feedback from billions of people when they click on the information they are looking for. In contrast, scientific data comes in many forms that are radically different from an average web page, requires context that is specific to the science, and often lacks the metadata needed for effective searches.

    At National User Facilities like the Molecular Foundry, researchers from all over the world apply for time and then travel to Berkeley to use extremely specialized instruments free of charge. Ophus notes that the current cameras on microscopes at the Foundry can collect up to a terabyte of data in under 10 minutes. Users then need to manually sift through this data to find quality images with “good resolution” and save that information on a secure shared file system, like Dropbox, or on an external hard drive that they eventually take home with them to analyze.

    Oftentimes, the researchers who come to the Molecular Foundry only have a couple of days to collect their data. Because it is very tedious and time-consuming to manually add notes to terabytes of scientific data, and there is no standard for doing it, most researchers just type shorthand descriptions in the filename. This might make sense to the person saving the file, but often doesn’t make much sense to anyone else.

    “The lack of real metadata labels eventually causes problems when the scientist tries to find the data later or attempts to share it with others,” says Lavanya Ramakrishnan, a staff scientist in Berkeley Lab’s Computational Research Division (CRD) and co-principal investigator of the Science Search project. “But with machine-learning techniques, we can have computers help with what is laborious for the users, including adding tags to the data. Then we can use those tags to effectively search the data.”

    In addition to images, Science Search can also be used to look for proposals and papers. This is a screenshot of the paper search results. (Credit: Gonzalo Rodrigo/Berkeley Lab)

    To address the metadata issue, the Berkeley Lab team uses machine-learning techniques to mine the “science ecosystem”—including instrument timestamps, facility user logs, scientific proposals, publications and file system structures—for contextual information. The collective information from these sources, including the timestamp of the experiment, notes about the resolution and filter used, and the user’s request for time, provides critical context. The Berkeley Lab team has put together an innovative software stack that uses machine-learning techniques, including natural language processing, to pull contextual keywords about the scientific experiment and automatically create metadata tags for the data.
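
    The article does not spell out the team’s software stack in any detail, so the following is only a rough sketch of the general idea: rank candidate keywords from proposal text, user logs, and instrument notes with TF-IDF, and treat the top-ranked terms as candidate metadata tags. The example documents are hypothetical stand-ins, not Science Search code, and scikit-learn is assumed here simply as a convenient library.

        # Minimal keyword-extraction sketch; illustrative only, not the Science Search stack.
        # The "documents" are hypothetical stand-ins for proposals, user logs and notes.
        from sklearn.feature_extraction.text import TfidfVectorizer

        documents = [
            "Proposal: high-resolution TEM imaging of gold nanoparticles on graphene",
            "User log: TEAM I microscope, STEM mode, 300 kV, drift correction enabled",
            "Publication abstract: atomic-scale strain mapping in core-shell nanoparticles",
        ]

        vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
        tfidf = vectorizer.fit_transform(documents)       # rows = documents, columns = terms
        terms = vectorizer.get_feature_names_out()

        def candidate_tags(doc_index, k=5):
            """Return the k highest-weighted terms of one document as candidate metadata tags."""
            weights = tfidf[doc_index].toarray().ravel()
            top = weights.argsort()[::-1][:k]
            return [terms[i] for i in top if weights[i] > 0]

        for i, doc in enumerate(documents):
            print(doc[:40], "->", candidate_tags(i))

    In a real pipeline the ranked keywords would then be reviewed by domain experts, much as the tagging interface described later in this article allows.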

    For the proof of concept, Ophus shared data from the Molecular Foundry’s TEAM I electron microscope at NCEM, recently collected by the facility staff, with the Science Search team.

    LBNL National Center for Electron Microscopy (NCEM)

    He also volunteered to label a few thousand images to give the machine-learning tools some labels from which to start learning. While this is a good start, Science Search co-principal investigator Gunther Weber notes that most successful machine-learning applications typically require significantly more data and feedback to deliver better results. In the case of search engines like Google, for example, training datasets are created and machine-learning techniques are validated when billions of people around the world confirm that they are not robots by clicking on all the images with street signs or storefronts after typing in their passwords, or when they tag their friends in photos on Facebook.

    “In the case of science data only a handful of domain experts can create training sets and validate machine-learning techniques, so one of the big ongoing problems we face is an extremely small number of training sets,” says Weber, who is also a staff scientist in Berkeley Lab’s CRD.

    To overcome this challenge, the Berkeley Lab researchers used transfer learning to limit the degrees of freedom, or parameter counts, of their convolutional neural networks (CNNs). Transfer learning is a machine-learning method in which a model developed for one task is reused as the starting point for a model on a second task, which allows the user to get more accurate results from a smaller training set. In the case of the TEAM I microscope, the data produced contains information about which operation mode the instrument was in at the time of collection. With that information, Weber was able to train the neural network on that classification so it could generate the operation-mode label automatically. He then froze that convolutional layer of the network, which meant he’d only have to retrain the densely connected layers. This approach effectively reduces the number of trainable parameters in the CNN, allowing the team to get some meaningful results from their limited training data.
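
    The article does not name the network architecture or the software framework the team used, so the sketch below only illustrates the generic transfer-learning recipe described above: start from a pretrained CNN, freeze its convolutional layers, and retrain a small densely connected head on a modest set of labeled images. The framework (PyTorch), the backbone (ResNet-18), and the number of operation-mode classes are assumptions made purely for illustration.

        # Generic transfer-learning sketch in PyTorch; not the actual Science Search model.
        # The pretrained convolutional backbone is frozen, and only a new densely connected
        # head is trained, e.g. to predict an instrument's operation mode from labeled images.
        import torch
        import torch.nn as nn
        from torchvision import models

        NUM_MODES = 4  # hypothetical number of operation-mode labels

        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

        for param in model.parameters():      # freeze the convolutional feature extractor
            param.requires_grad = False

        model.fc = nn.Linear(model.fc.in_features, NUM_MODES)  # new trainable head

        optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
        criterion = nn.CrossEntropyLoss()

        def train_step(images, labels):
            """One optimization step over a batch of labeled images; only the head updates."""
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
            return loss.item()

    Freezing the backbone cuts the trainable parameter count from millions down to a few thousand, which is what makes a training set of only a few thousand labeled images workable.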

    Machine Learning to Mine the Scientific Ecosystem

    In addition to generating metadata tags through training datasets, the Berkeley Lab team also developed tools that use machine-learning techniques for mining the science ecosystem for data context. For example, the data ingest module can look at a multitude of information sources from the scientific ecosystem—including instrument timestamps, user logs, proposals and publications—and identify commonalities. Tools developed at Berkeley Lab that use natural language-processing methods can then identify and rank words that give context to the data and facilitate meaningful results for users later on. The user will see something similar to the results page of an Internet search, where content with the most text matching the user’s search words will appear higher on the page. The system also learns from user queries and the search results they click on.
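
    The ranking behavior is described only at a high level, so the toy example below is just one way to picture the loop: items whose tags match more of the query rank higher, and user clicks feed back into future rankings. The item names, tags, and click counts are invented for illustration.

        # Toy ranking sketch: score items by query/tag overlap, lightly boosted by past clicks.
        # Purely illustrative; not the Science Search ranking code.
        items = {
            "img_001": {"tags": {"nanoparticle", "tem", "gold"}, "clicks": 12},
            "img_002": {"tags": {"nanoparticle", "stem", "graphene"}, "clicks": 3},
            "doc_417": {"tags": {"proposal", "battery", "cathode"}, "clicks": 0},
        }

        def search(query, items, click_weight=0.1):
            """Rank items by how many query words match their tags, plus a small click boost."""
            words = set(query.lower().split())
            scored = []
            for name, info in items.items():
                overlap = len(words & info["tags"])
                if overlap:
                    scored.append((overlap + click_weight * info["clicks"], name))
            return [name for _, name in sorted(scored, reverse=True)]

        def record_click(items, name):
            """Feed a user's click back into future rankings."""
            items[name]["clicks"] += 1

        print(search("gold nanoparticle images", items))   # -> ['img_001', 'img_002']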

    Because scientific instruments are generating an ever-growing body of data, all aspects of the Berkeley team’s science search engine needed to be scalable to keep pace with the rate and scale of the data volumes being produced. The team achieved this by setting up their system in a Spin instance on the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC).

    NERSC

    NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    LBL NERSC Cray XC30 Edison supercomputer


    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Spin is a Docker-based edge-services technology developed at NERSC that can access the facility’s high performance computing systems and storage on the back end.

    “One of the reasons it is possible for us to build a tool like Science Search is our access to resources at NERSC,” says Gonzalo Rodrigo, a Berkeley Lab postdoctoral researcher who is working on the natural language processing and infrastructure challenges in Science Search. “We have to store, analyze and retrieve really large datasets, and it is useful to have access to a supercomputing facility to do the heavy lifting for these tasks. NERSC’s Spin is a great platform to run our search engine that is a user-facing application that requires access to large datasets and analytical data that can only be stored on large supercomputing storage systems.”

    An Interface for Validating and Searching Data

    When the Berkeley Lab team developed the interface for users to interact with their system, they knew that it would have to accomplish a couple of objectives, including enabling effective search and allowing human input to the machine-learning models. Because the system relies on domain experts to help generate the training data and validate the machine-learning model output, the interface needed to facilitate that.

    “The tagging interface that we developed displays the original data and metadata available, as well as any machine-generated tags we have so far. Expert users then can browse the data and create new tags and review any machine-generated tags for accuracy,” says Matt Henderson, who is a Computer Systems Engineer in CRD and leads the user interface development effort.

    To facilitate an effective search for users based on available information, the team’s search interface provides a query mechanism for available files, proposals and papers that the Berkeley-developed machine-learning tools have parsed and extracted tags from. Each listed search result item represents a summary of that data, with a more detailed secondary view available, including information on tags that matched this item. The team is currently exploring how to best incorporate user feedback to improve the models and tags.

    “Having the ability to explore datasets is important for scientific breakthroughs, and this is the first time that anything like Science Search has been attempted,” says Ramakrishnan. “Our ultimate vision is to build the foundation that will eventually support a ‘Google’ for scientific data, where researchers can even search distributed datasets. Our current work provides the foundation needed to get to that ambitious vision.”

    “Berkeley Lab is really an ideal place to build a tool like Science Search because we have a number of user facilities, like the Molecular Foundry, that have decades worth of data that would provide even more value to the scientific community if the data could be searched and shared,” adds Katie Antypas, who is the principal investigator of Science Search and head of NERSC’s Data Department. “Plus we have great access to machine-learning expertise in the Berkeley Lab Computing Sciences Area as well as HPC resources at NERSC in order to build these capabilities.”

    In addition to Antypas, Ramakrishnan and Weber, UC Berkeley Computer Science Professor Joseph Hellerstein is also a principal investigator.

    This work was supported by the DOE Office of Advanced Scientific Computing Research (ASCR). Both the Molecular Foundry and NERSC are DOE Office of Science User Facilities located at Berkeley Lab.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 1:40 pm on June 18, 2018 Permalink | Reply
    Tags: Convert nanoparticle-coated microscopic beads into lasers smaller than red blood cells, LBNL

    From Lawrence Berkeley National Lab: “Scientists Create Continuously Emitting Microlasers With Nanoparticle-Coated Beads” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    At left, a tiny bead struck by a laser (at the yellowish spot shown at the top of the image) produces optical modes that circulate around the interior of the bead (pinkish ring). At right, a simulation of how the optical field inside a 5-micron (5 millionths of a meter) bead is distributed. (Credit: Angel Fernandez-Bravo/Berkeley Lab, Kaiyuan Yao)

    Researchers have found a way to convert nanoparticle-coated microscopic beads into lasers smaller than red blood cells.

    These microlasers, which convert infrared light into light at higher frequencies, are among the smallest continuously emitting lasers of their kind ever reported and can constantly and stably emit light for hours at a time, even when submerged in biological fluids such as blood serum.

    The innovation, discovered by an international team of scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), opens up the possibility for imaging or controlling biological activity with infrared light, and for the fabrication of light-based computer chips. Their findings are detailed in a report published online June 18 in Nature Nanotechnology.

    The unique properties of these lasers, which measure 5 microns (millionths of a meter) across, were discovered by accident as researchers were studying the potential for the polymer (plastic) beads, composed of a translucent substance known as a colloid, to be used in brain imaging.

    A scanning electron micrograph image (left) of a 5-micron-diameter polystyrene bead that is coated with nanoparticles, and a transmission electron micrograph image (right) that shows a cross-section of a bead, with nanoparticles along its outer surface. The scale bar at left is 1 micron, and the scale bar at right is 20 nanometers. (Credit: Angel Fernandez-Bravo, Shaul Aloni/Berkeley Lab)

    Angel Fernandez-Bravo, a postdoctoral researcher at Berkeley Lab’s Molecular Foundry who was the lead author of the study, mixed the beads with sodium yttrium fluoride nanoparticles “doped,” or embedded, with thulium, an element belonging to a group of metals known as lanthanides. The Molecular Foundry is a nanoscience research center open to researchers from around the world.

    LBNL Molecular Foundry – No image credits found

    Emory Chan, a Staff Scientist at the Molecular Foundry, had in 2016 used computational models to predict that thulium-doped nanoparticles exposed to infrared laser light at a specific frequency could emit light at a higher frequency than this infrared light in a counterintuitive process known as “upconversion.”

    Also at that time, Elizabeth Levy, then a participant in the Lab’s Summer Undergraduate Laboratory Internship (SULI) program, noticed that beads coated with these “upconverting nanoparticles” emitted unexpectedly bright light at very specific wavelengths, or colors.

    “These spikes were clearly periodic and clearly reproducible,” said Emory Chan, who co-led the study along with Foundry Staff Scientists Jim Schuck (now at Columbia University) and Bruce Cohen.

    The periodic spikes that Chan and Levy had observed are a light-based analog to so-called “whispering gallery” acoustics that can cause sound waves to bounce along the walls of a circular room so that even a whisper can be heard on the opposite side of the room. This whispering-gallery effect was observed in the dome of St. Paul’s Cathedral in London in the late 1800s, for example.

    In the latest study, Fernandez-Bravo and Schuck found that when an infrared laser excites the thulium-doped nanoparticles along the outer surface of the beads, the light emitted by the nanoparticles can bounce around the inner surface of the bead just like whispers bouncing along the walls of the cathedral.

    A wide-field image showing the light emitted by microlasers in a self-assembled 2D array. (Credit: Angel Fernandez-Bravo)

    Light can make thousands of trips around the circumference of the microsphere in a fraction of a second, causing some frequencies of light to interact (or “interfere”) with themselves to produce brighter light while other frequencies cancel themselves out. This process explains the unusual spikes that Chan and Levy observed.
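
    In the simplest textbook picture of such a whispering-gallery resonator (an approximation, not a formula quoted from the study), constructive interference occurs when a whole number of wavelengths fits around the bead’s rim, so for a bead of diameter D and effective refractive index n_eff the resonant wavelengths roughly satisfy

        \[ m\,\lambda_m \approx n_{\mathrm{eff}}\,\pi D, \qquad m = 1, 2, 3, \dots \]

    which is why only a discrete, periodic comb of wavelengths builds up inside a 5-micron sphere.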

    When the intensity of light traveling around these beads reaches a certain threshold, the light can stimulate the emission of more light with the exact same color, and that light, in turn, can stimulate even more light. This amplification of light, the basis for all lasers, produces intense light at a very narrow range of wavelengths in the beads.

    Schuck had considered lanthanide-doped nanoparticles as potential candidates for microlasers, and he became convinced of this when Chan shared with him the periodic whispering-gallery data.

    Fernandez-Bravo found that when he exposed the beads to an infrared laser with enough power, the beads turned into upconverting lasers, emitting light at higher frequencies than the original laser.

    He also found that the beads could produce laser light at the lowest powers ever recorded for upconverting nanoparticle-based lasers.

    “The low thresholds allow these lasers to operate continuously for hours at much lower powers than previous lasers,” said Fernandez-Bravo.

    Other upconverting nanoparticle lasers operate only intermittently; they are only exposed to short, powerful pulses of light because longer exposure would damage them.

    “Most nanoparticle-based lasers heat up very quickly and die within minutes,” Schuck said. “Our lasers are always on, which allows us to adjust their signals for different applications.”

    In this case, researchers found that their microlasers performed stably after five hours of continuous use. “We can take the beads off the shelf months or years later, and they still lase,” Fernandez-Bravo said.

    Researchers are also exploring how to carefully tune the output light from the continuously emitting microlasers by simply changing the size and composition of the beads. And they have used a robotic system at the Molecular Foundry known as WANDA (Workstation for Automated Nanomaterial Discovery and Analysis) to combine different dopant elements and tune the nanoparticles’ performance.

    The researchers also noted that there are many potential applications for the microlasers, such as in controlling the activity of neurons or optical microchips, sensing chemicals, and detecting environmental and temperature changes.

    “At first these microlasers only worked in air, which was frustrating because we wanted to introduce them into living systems,” Cohen said. “But we found a simple trick of dipping them in blood serum, which coats the beads with proteins that allow them to lase in water. We’ve now seen that these beads can be trapped along with cells in laser beams and steered with the same lasers we use to excite them.”

    The latest study, and the new paths of study it has opened up, shows how fortuitous an unexpected result can be, he said. “We just happened to have the right nanoparticles and coating process to produce these lasers,” Schuck said.

    Researchers from UC Berkeley, the National Laboratory of Astana in Kazakhstan, the Polytechnic University of Milan, and Columbia University in New York also participated in this study. This work was supported by the DOE Office of Science, and by the Ministry of Education and Science of the Republic of Kazakhstan.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 1:13 pm on June 18, 2018 Permalink | Reply
    Tags: LBNL

    From Lawrence Berkeley National Lab: “Faster, Cheaper, Better: A New Way to Synthesize DNA” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    June 18, 2018
    Julie Chao
    JHChao@lbl.gov
    (510) 486-6491

    Sebastian Palluk (left) and Daniel Arlow of the Joint BioEnergy Institute (JBEI) have pioneered a new way to synthesize DNA sequences. (Credit: Marilyn Chung/Berkeley Lab)

    In the rapidly growing field of synthetic biology, in which organisms can be engineered to do things like decompose plastic and manufacture biofuels and medicines, production of custom DNA sequences is a fundamental tool for scientific discovery. Yet the process of DNA synthesis, which has remained virtually unchanged for more than 40 years, can be slow and unreliable.

    Now in what could address a critical bottleneck in biology research, researchers at the Department of Energy’s Joint BioEnergy Institute (JBEI), based at Lawrence Berkeley National Laboratory (Berkeley Lab), announced they have pioneered a new way to synthesize DNA sequences through a creative use of enzymes that promises to be faster, cheaper, and more accurate. The discovery, led by JBEI graduate students Sebastian Palluk and Daniel Arlow, was published in Nature Biotechnology in a paper titled De novo DNA Synthesis Using Polymerase-Nucleotide Conjugates.

    “DNA synthesis is at the core of everything we try to do when we build biology,” said JBEI CEO Jay Keasling, the corresponding author on the paper and also a Berkeley Lab senior faculty scientist. “Sebastian and Dan have created what I think will be the best way to synthesize DNA since [Marvin] Caruthers invented solid-phase DNA synthesis almost 40 years ago. What this means for science is that we can engineer biology much less expensively – and in new ways – than we would have been able to do in the past.”

    The Caruthers process uses the tools of organic chemistry to attach DNA building blocks one at a time and has become the standard method used by DNA synthesis companies and labs around the world. However, it has drawbacks, the main ones being that it reaches its limit at about 200 bases, partly due to side reactions that can occur during the synthesis procedure, and that it produces hazardous waste. For researchers, even 1,000 bases is considered a small gene, so to make longer sequences, the shorter ones are stitched together using a process that is failure-prone and can’t make certain sequences.

    Buying your genes online

    A DNA sequence is made up of a combination of four chemical bases, represented by the letters A, C, T, and G. Researchers regularly work with genes of several thousand bases in length. To obtain them, they either need to isolate the genes from an existing organism, or they can order the genes from a company.

    “You literally paste the sequence into a website, then wait two weeks,” Arlow said. “Let’s say you buy 10 genes. Maybe nine of them will be delivered to you on time. In addition, if you want to test a thousand genes, at $300 per gene, the costs add up very quickly.”

    Palluk and Arlow were motivated to work on this problem because, as students, they were spending many long, tedious hours making DNA sequences for their experiments when they would much rather have been doing the actual experiment.

    “DNA is a huge biomolecule,” Palluk said. “Nature makes biomolecules using enzymes, and those enzymes are amazingly good at handling DNA and copying DNA. Typically our organic chemistry processes are not anywhere close to the precision that natural enzymes offer.”


    Faster, Cheaper, Better Way to Make DNA

    Thinking outside the box

    The idea of using an enzyme to make DNA is not new – scientists have been trying for decades to find a way to do it, without success. The enzyme of choice is called TdT (terminal deoxynucleotidyl transferase), which is found in the immune system of vertebrates and is one of the few enzymes in nature that writes new DNA from scratch rather than copying DNA. What’s more, it’s fast, able to add 200 bases per minute.

    In order to harness TdT to synthesize a desired sequence, the key requirement is to make it add just one nucleotide, or DNA building block, and then stop before it keeps adding the same nucleotide repeatedly. All of the previous proposals envisioned using nucleotides modified with special blocking groups to prevent multiple additions. However, the problem is that the catalytic site of the enzyme is not large enough to accept the nucleotide with a blocking group attached. “People have basically tried to ‘dig a hole’ in the enzyme by mutating it to make room for this blocking group,” Arlow said. “It’s tricky because you need to make space for it but also not screw up the activity of the enzyme.”

    Palluk and Arlow came up with a different approach. “Instead of trying to dig a hole in the enzyme, what we do is tether one nucleotide to each TdT enzyme via a cleavable linker,” Arlow said. “That way, after extending a DNA molecule using its tethered nucleotide, the enzyme has no other nucleotides available to add, so it stops. A key advantage of this approach is that the backbone of the DNA – the part that actually does the chemical reaction – is just like natural DNA, so we can try to get the full speed out of the enzyme.”

    Once the nucleotide is added to the DNA molecule, the enzyme is cleaved off. Then the cycle can begin again with the next nucleotide tethered to another TdT enzyme.

    Keasling finds the approach clever and counterintuitive. “Rather than reusing an enzyme as a catalyst, they said, ‘Hey, we can make enzymes really inexpensively. Let’s just throw it away.’ So the enzyme becomes a reagent rather than a catalyst,” he said. “That kind of thinking then allowed them to do something very different from what’s been proposed in the literature and – I think – accomplish something really important.”

    They demonstrated their method by manually making a DNA sequence of 10 bases. Not surprisingly, the two students were initially met with skepticism. “Even when we had first results, people would say, ‘It doesn’t make sense; it doesn’t seem right. That’s not how you use an enzyme,’” Palluk recalled.

    The two still have much work to do to optimize their method, but they are reasonably confident that they will be able to eventually make a gene with 1,000 bases in one go at many times the speed of the chemical method.

    Berkeley Lab has world-renowned capabilities in synthetic biology, technology development for biology, and engineering for biological process development. A number of technologies developed at JBEI and by the Lab’s Biosciences Area researchers have been spun into startups, including Lygos, Afingen, TeselaGen, and CinderBio.

    “After decades of optimization and fine-tuning, the conventional method now typically achieves a yield of about 99.5 percent per step. Our proof-of-concept synthesis had a yield of 98 percent per step, so it’s not quite on par yet, but it’s a promising starting point,” Palluk said. “We think that we’ll catch up soon and believe that we can push the system far beyond the current limitations of chemical synthesis.”
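
    To see why those per-step yields matter so much, note that the fraction of full-length strands falls off geometrically with the number of coupling steps. As a rough back-of-the-envelope illustration (these numbers are not taken from the paper):

        \[ 0.995^{200} \approx 0.37, \qquad 0.98^{200} \approx 0.018, \qquad 0.995^{1000} \approx 0.007 \]

    So even at 99.5 percent per step, only about a third of the strands in a 200-base synthesis come out full length, and a 1,000-base gene made in a single pass would demand per-step yields well beyond what either method currently delivers.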

    “Our dream is to make a gene overnight,” Arlow said. “For companies trying to sustainably biomanufacture useful products, new pharmaceuticals, or tools for more environmentally friendly agriculture, and for JBEI and DOE, where we’re trying to produce fuels and chemicals from biomass, DNA synthesis is a key step. If you speed that up, it could drastically accelerate the whole process of discovery.”

    JBEI is a DOE Bioenergy Research Center funded by DOE’s Office of Science, and is dedicated to developing advanced biofuels. Other co-authors on the paper are: Tristan de Rond, Sebastian Barthel, Justine Kang, Rathin Bector, Hratch Baghdassarian, Alisa Truong, Peter Kim, Anup Singh, and Nathan Hillson.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 3:56 pm on June 11, 2018 Permalink | Reply
    Tags: Experiments at Berkeley Lab Help Trace Interstellar Dust Back to Solar System’s Formation, LBNL

    From Lawrence Berkeley National Lab: “Experiments at Berkeley Lab Help Trace Interstellar Dust Back to Solar System’s Formation” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    June 11, 2018
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Chemical studies show that dust particles originated in a low-temperature environment.

    This energy dispersive X-ray spectrometry (EDS) map of tiny glassy grains (blue with green specks) inside a cometary-type interplanetary dust particle was produced using the FEI TitanX microscope at Berkeley Lab’s Molecular Foundry.

    LBNL FEI TitanX microscope


    Carbonaceous material (red) holds these objects together. (Credit: Hope Ishii/University of Hawaii; Berkeley Lab; reproduced with permission from PNAS)

    Experiments conducted at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) helped to confirm that samples of interplanetary particles – collected from Earth’s upper atmosphere and believed to originate from comets – contain dust leftover from the initial formation of the solar system.

    An international team, led by Hope Ishii, a researcher at the University of Hawaii at Manoa (UH Manoa), studied the particles’ chemical composition using infrared light at Berkeley Lab’s Advanced Light Source (ALS).

    LBNL/ALS

    Scientists also explored their nanoscale chemical makeup using electron microscopes at the Lab’s Molecular Foundry, which specializes in nanoscale R&D, and at the University of Hawaii’s Advanced Electron Microscopy Center.

    LBNL Molecular Foundry – No image credits found

    University of Hawaii’s Advanced Electron Microscopy Center

    The study was published online June 11 in the journal Proceedings of the National Academy of Sciences.

    The initial solids from which the solar system formed consisted almost entirely of carbon, ices, and disordered (amorphous) silicate, the team concluded. This dust was mostly destroyed and reworked by processes that led to the formation of planets. Surviving samples of pre-solar dust are most likely to be preserved in comets – small, cold bodies that formed in the outer solar nebula.

    In a relatively obscure class of these interplanetary dust particles believed to originate from comets, there are tiny glassy grains called GEMS (glass embedded with metal and sulfides) that are typically only tens to hundreds of nanometers in diameter, or less than a hundredth of the thickness of a human hair. Researchers embedded the sample grains in an epoxy that was cut into thin slices for the various experiments.

    Using transmission electron microscopy at the Molecular Foundry, the research team made maps of the element distributions and discovered that these glassy grains are made up of subgrains that aggregated together in a different environment prior to the formation of the comet.

    The nanoscale GEMS subgrains are bound together by dense organic carbon in clusters comprising the GEMS grains. These GEMS grains were later glued together with other components of the cometary dust by a distinct, lower-density organic carbon matrix.

    The types of carbon that rim the subgrains and that form the matrix in these particles decompose with even weak heating, suggesting that the GEMS could not have formed in the hot inner solar nebula, and instead formed in a cold, radiation-rich environment, such as the outer solar nebula or pre-solar molecular cloud.

    Jim Ciston, a staff scientist at the Molecular Foundry, said the particle-mapping process of the microscopy techniques provided key clues to their origins. “The presence of specific types of organic carbon in both the inner and outer regions of the particles suggests the formation process occurred entirely at low temperatures,” he said.

    This cometary-type interplanetary dust particle was collected by a NASA stratospheric aircraft. Its porous aggregate structure is evident in this scanning electron microscope image. (Credit: Hope Ishii/University of Hawaii)

    “Therefore, these interplanetary dust particles survived from the time before formation of the planetary bodies in the solar system, and provide insight into the chemistry of those ancient building blocks.”

    He also noted that the “sticky” organics that covered the particles may be a clue to how these nanoscale particles could gather into larger bodies without the need for extreme heat and melting.

    Ishii, who is based at the UH Manoa’s Hawaii Institute of Geophysics and Planetology, said, “Our observations suggest that these exotic grains represent surviving pre-solar interstellar dust that formed the very building blocks of planets and stars. If we have at our fingertips the starting materials of planet formation from 4.6 billion years ago, that is thrilling and makes possible a deeper understanding of the processes that formed and have since altered them.”

    Hans Bechtel, a research scientist in the Scientific Support Group at Berkeley Lab’s ALS, said that the research team also employed infrared spectroscopy at the ALS to confirm the presence of organic carbon and identify the coupling of carbon with nitrogen and oxygen, which corroborated the electron microscopy measurements.

    The ALS measurements provided micron-scale (millionths of a meter) resolution that gave an average of measurements for entire samples, while the Molecular Foundry’s measurements provided nanometer-scale (billionths of a meter) resolution that allowed scientists to explore tiny portions of individual grains.

    In the future, the team plans to search the interiors of additional comet dust particles, especially those that were well-protected during their passage through the Earth’s atmosphere, to increase understanding of the distribution of carbon within GEMS and the size distributions of GEMS subgrains.

    Berkeley Lab’s ALS and Molecular Foundry are DOE Office of Science User Facilities.

    The research team included scientists from the University of Washington, NASA Ames Research Center, and the Laboratory for Space Sciences. The work was supported by NASA’s Cosmochemistry, Emerging Worlds, and Laboratory Analysis of Returned Samples programs; the ALS and Molecular Foundry are supported by the DOE Office of Basic Energy Sciences.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 3:23 pm on June 11, 2018 Permalink | Reply
    Tags: From Moon Rocks to Space Dust: Berkeley Lab’s Extraterrestrial Research, LBNL

    From Lawrence Berkeley National Lab: “From Moon Rocks to Space Dust: Berkeley Lab’s Extraterrestrial Research” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Specialized equipment, techniques, and expertise attract samples from far, far away.

    The Barringer Crater shown in the adjacent image (Ref) is only the best-preserved of large meteor impacts. There is evidence for many more. http://www.pas.rochester.edu/~blackman/ast104/impacts.html

    Libyan Desert Glass: An extraordinary highly translucent 239.1-gram Libyan Desert Glass individual covered in pseudo regmaglypts, which are strikingly similar in appearance to the thumbprints found on certain meteorites. Some impact specialists have theorized that at the time of impact, molten jelly-like blobs of desert glass were thrown far up into the air, and then fell back to earth acquiring regmaglypts in the process. A more widely accepted view is that pseudo regmaglypts are the result of long term desert erosion by wind and sand. However they are formed, their resemblance to meteoritic regmaglypts is remarkable. Photograph by Leigh Anne DelRay, copyright Aerolite Meteorites.

    From moon rocks to meteorites, and from space dust to a dinosaur-destroying impact, the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has a well-storied expertise in exploring samples of extraterrestrial origin.

    This research – which has helped us to understand the makeup and origins of objects within and beyond our solar system – stems from the Lab’s long-standing core capabilities and credentials in structural and chemical analyses and measurement at the microscale and nanoscale.

    Berkeley Lab’s participation in a new study, detailed June 11 in the journal Proceedings of the National Academy of Sciences (see related news release), focused on the chemical composition of tiny glassy grains of interplanetary particles – likely deposited in Earth’s upper atmosphere by comets – that contain dust leftover from the formative period of our solar system.

    Petrographic relationship between organic carbon and amorphous silicates in cometary IDPs. (A) High-angle annular darkfield (HAADF) image of a section through the middle of a single GEMS grain in U217B19 and (B) corresponding carbon element map showing organic rims on subgrains within the GEMS grain. (C) HAADF image of a section through the middle of a GEMS grain in LT39 and (D) corresponding carbon element map showing a higher brightness organic carbon rim mantling the GEMS exterior surface. The higher brightness rim corresponds to higher-density organic carbon with higher C/O ratio (SI Appendix). (E) HAADF image of PAH-rich nanoglobules (ng) comprised of higher-density organic carbon and (F) element map. Red, C; blue, Mg; green, Fe; and yellow, S. One nanoglobule has a partial GEMS mantle shown in Inset. (G) HAADF image of a nanoglobule heavily decorated with GEMS. (H) Brightfield image of two carbon-rich GEMS, with one on right a torus with an organic carbon interior and inorganic exterior. [From above cited science paper.]

    That study involved experiments at the Lab’s Molecular Foundry, a nanoscale research facility, and the Advanced Light Source (ALS), which supplies different types of light, from infrared light to X-rays, for dozens of simultaneous experiments.

    More than a decade ago, NASA’s Stardust spacecraft mission, which had a rendezvous with comet 81P/Wild 2, returned samples of cometary and interstellar dust to Earth. Ever since, researchers have been working to study this material in detail.

    In one study, published in 2014 [Science], scientists used X-rays and infrared light to study particles from this mission. In another study [Wiley], published in 2015, researchers studied two comet particles using several high-resolution electron microscopes and a focused ion beam at Berkeley Lab’s National Center for Electron Microscopy (NCEM), which is now part of the Molecular Foundry.

    LBNL National Center for Electron Microscopy (NCEM)

    LBNL Molecular Foundry – No image credits found

    They found that the microscopic rocks, named Iris and Callie, had formed from molten droplets that crystallized rapidly in outer space.

    Interplanetary dust particles were also the focus of a 2014 study that involved NCEM and the ALS. That study [PNAS] explored pockets of water that were directly formed on the dust particles via irradiation by the solar wind, and their findings suggest that this mechanism could be responsible for transporting water throughout the solar system.

    In other studies, the ALS has been used to reveal liquid water and complex organic compounds like hydrocarbons and amino acids in meteorites – one of which may have traveled here from a dwarf planet – and ALS scientists have been working with NASA to study the microscopic makeup of asteroids to better understand how meteoroids break apart in Earth’s atmosphere.

    The Lab also had a role in analyzing dust from moon rocks collected in the Apollo 11 and Apollo 12 moon missions – the late Melvin Calvin, who was a former associate director at the Lab, participated in a study of carbon compounds in lunar samples that was published [PSLSC] in 1971.

    And in the 1970s, Berkeley Lab Nobel laureate Luis Alvarez teamed with his son, Walter Alvarez, then an associate professor of geology at UC Berkeley, to unravel the mystery of the dinosaur die-off some 65 million years ago. The Alvarezes, working with Lab nuclear chemists Frank Asaro and Helen Michel, used a technique known as neutron activation analysis to precisely measure an unusual abundance of the element iridium in sedimentary deposits that dated back to the time of the dinosaurs’ disappearance and the mass extinction of many other species. [LBL Science Beat]

    Iridium, which is rare on Earth, was known to be associated with extraterrestrial objects such as asteroids, and later studies [Science] would confirm that a massive meteorite impact is the most likely cause of that ancient extinction event.

    Besides studying materials of extraterrestrial origin, Berkeley Lab researchers have also worked to synthesize and simulate the chemistry, materials, conditions, and effects found outside of Earth – from lab-treated materials that are analogous to exotic minerals that formed in space from the presence of corrosive gases in the early solar system to simulated mergers of neutron stars and black holes, and the creation of simulated Martian meteorites.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 12:29 pm on June 8, 2018 Permalink | Reply
    Tags: High-stability high-resolution Thermo Fisher “ThemIS” transmission electron microscope, LBNL, LBNL Molecular Foundry

    From Lawrence Berkeley National Lab: “There’s a New Microscope in Town: ThemIS, anyone?” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    June 7, 2018
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    The high-stability, high-resolution Thermo Fisher “ThemIS” transmission electron microscope. (Credit: Marilyn Chung/Berkeley Lab)

    New Tool at Berkeley Lab’s Molecular Foundry offers atomic-scale imaging in real time.

    LBNL Molecular Foundry – No image credits found

    Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) now have access to a unique new microscope that combines atomic-scale imaging capabilities with the ability to observe real-world sample properties and behavior in real time.

    Housed at Berkeley Lab’s Molecular Foundry in partnership with the Materials Sciences Division, the new instrument is a high-stability, high-resolution Thermo Fisher “ThemIS” transmission electron microscope (TEM). The “IS” in its name emphasizes that it has been customized for in situ experiments, enabling researchers to study materials and reactions under natural conditions.

    The ThemIS microscope will provide unprecedented insight into fundamental atomic-scale materials transformations that occur at solid-liquid interfaces, which is essential for making advances in battery and desalination technologies, for example.

    Haimei Zheng, a staff scientist in Berkeley Lab’s Materials Sciences Division whose research group focuses on understanding, engineering, and controlling materials at solid-liquid interfaces – with a particular emphasis on sustainable energy, clean water, and the environment – was instrumental in acquiring the new microscope.

    “In situ transmission electron microscopy enables direct observation of reactions and atomic pathways, provides key information on the structural dynamics of a material during transformations, and has the ability to correlate a material’s structure and properties,” she said, “but it has ultimately been limited by the speed and resolution of the microscope.”

    She added, “With this advanced TEM and newly developed technologies, we are now able to image chemical reactions and study materials dynamics in liquids with a resolution that was previously impossible.”

    A silicon sample is seen at nanoscale resolution in this first image produced by the ThemIS transmission electron microscope (scale bar is 1 nanometer). (Credit: Molecular Foundry/Berkeley Lab)

    Electron microscopes have expanded scientists’ ability to understand the world by making visible what was once invisible. A TEM uses electromagnetic lenses to focus electrons. Its focused beam of electrons passes through a sample and is scattered into either an image or a diffraction pattern. Since the necessarily thin samples (measuring from 10 to hundreds of nanometers, or billionths of a meter) are subjected to the high-vacuum environment inside the TEM, observations of materials in their relevant environment are challenging.

    In order to overcome this challenge, in situ TEMs utilize special sample holders that allow a researcher to observe the physical behavior of materials in response to external stimuli such as temperature, environment, stress, and applied fields.

    By studying samples in liquids or gases using these special holders, researchers can observe the atomic-scale details of nanoparticles and how they undergo changes in their reactive environments. This capability not only provides for a deeper understanding of chemical reactions, but it also allows for the study of a wider variety of nanoparticle systems where reaction pathways are still unknown.

    “The microscope will provide researchers with new tools that expand the existing imaging and chemical analysis capabilities of our TitanX microscope, which has long been in high demand,” said Andy Minor, the facility director of the Foundry’s National Center for Electron Microscopy (NCEM). The TitanX is a predecessor high-resolution TEM.

    LBNL National Center for Electron Microscopy (NCEM)

    The ThemIS microscope is the product of a joint effort between Berkeley Lab’s Materials Sciences Division and the Molecular Foundry, and is supported by the Department of Energy’s Office of Basic Energy Sciences.

    The microscope’s customization includes the following features that make it optimal for in situ experiments:

    An image corrector for high-resolution TEM imaging.
    A “Ceta” camera for imaging a wide field of view at high resolution and relatively high speed (4,096-by-4,096-pixel resolution at 40 frames per second); a rough estimate of the resulting data rate follows this list.
    A specialized “fast gun valve” that protects the microscope from gases that may be released during environmental in situ experiments.
    A “super-X quad-EDS detector” for elemental analysis, expanding NCEM’s high-resolution analytical capabilities.
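
    As a rough sense of scale, and assuming 16-bit pixels (the article does not state the camera’s bit depth), the Ceta specifications above imply a raw data rate of roughly

        \[ 4096 \times 4096 \ \mathrm{pixels} \times 2\ \mathrm{bytes/pixel} \times 40\ \mathrm{fps} \approx 1.3\ \mathrm{GB/s}, \]

    or on the order of 0.8 terabytes per 10 minutes of continuous acquisition, which is in line with the data volumes quoted for the Foundry’s microscopes earlier in this collection.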

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 11:14 am on June 8, 2018 Permalink | Reply
    Tags: LBNL, Lorentz microscopy, Non-Crystal Clarity: Scientists Find Ordered Magnetic Patterns in Disordered Magnetic Material

    From Lawrence Berkeley National Lab: “Non-Crystal Clarity: Scientists Find Ordered Magnetic Patterns in Disordered Magnetic Material” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    June 8, 2018
    Glenn Roberts Jr.
    GERoberts@lbl.gov
    (510) 486-5582

    The top row shows electron phase, the second row shows magnetic induction, and the bottom row shows schematics for the simulated phase of different magnetic domain features in multilayer material samples. The first column is for a symmetric thin-film material and the second column is for an asymmetric thin film containing gadolinium and cobalt. The scale bars are 200 nanometers (billionths of a meter). The dashed lines indicate domain walls and the arrows indicate the chirality or “handedness.” The underlying images in the top two rows were produced using a technique at Berkeley Lab’s Molecular Foundry known as Lorentz microscopy. (Credit: Berkeley Lab)

    Lorentz Microscope Hitachi HF-3000 – 300keV from National Institute for Materials Science Sengen, Namiki, Sakura, Meguro Japan

    A team of scientists working at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has confirmed a special property known as “chirality” – which potentially could be exploited to transmit and store data in a new way – in nanometers-thick samples of multilayer materials that have a disordered structure.

    While most electronic devices rely on the flow of electrons’ charge, the scientific community is feverishly searching for new ways to revolutionize electronics by designing materials and methods to control other inherent electron traits, such as their orbits around atoms and their spin, which can be thought of as a compass needle tuned to face in different directions.

    These properties, scientists hope, can enable faster, smaller, and more reliable data storage by facilitating spintronics – one facet of which is the use of spin current to manipulate domains and domain walls. Spintronics-driven devices could generate less heat and require less power than conventional devices.

    In the latest study, detailed in the May 23 online edition of the journal Advanced Materials, scientists working at Berkeley Lab’s Molecular Foundry and Advanced Light Source (ALS) confirmed a chirality, or handedness, in the transition regions – called domain walls – between neighboring magnetic domains that have opposite spins.

    Scientists hope to control chirality – analogous to right-handedness or left-handedness – to control magnetic domains and convey zeros and ones as in conventional computer memory.

    The samples were composed of an amorphous alloy of gadolinium and cobalt, sandwiched between ultrathin layers of platinum and iridium, which are known to strongly impact neighboring spins.

    Modern computer circuits commonly use silicon wafers based on a crystalline form of silicon, which has a regularly ordered structure. In this latest study, the material samples used in experiments were amorphous, or noncrystalline, which means their atomic structure was disordered.

    Experiments revealed a dominant chirality in the magnetic properties of these domain walls that could possibly be flipped to its opposite. Such a flipping mechanism is a critical enabling technology for spintronics and variant fields of research that are based on the electron’s spin property.

    The science team worked to identify the right thickness, concentration, and layering of elements, and other factors to optimize this chiral effect.

    “Now we have proof that we can have chiral magnetism in amorphous thin films, which no one had shown before,” said Robert Streubel, the study’s lead author and a postdoctoral researcher in Berkeley Lab’s Materials Sciences Division. The success of the experiments, he said, opens the possibility of controlling some properties of domain walls, such as chirality, with temperature, and of switching a material’s chiral properties with light.

    Amorphous materials, despite their disordered structure, could also be manufactured to overcome some of the limitations of crystalline materials for spintronics applications, Streubel noted. “We wanted to investigate these more complex materials that are easier to make, especially for industrial applications.”

    The research team enlisted a unique, high-resolution electron microscopy technique at Berkeley Lab’s Molecular Foundry, and conducted the experiments in a so-called Lorentz observation mode to image the magnetic properties of the material samples. They combined these results with those of an X-ray technique at the ALS known as magnetic circular dichroism spectroscopy to confirm the nanoscale magnetic chirality in the samples.

    LBNL/ALS

    The Lorentz microscopy technique employed at the Molecular Foundry’s National Center for Electron Microscopy provided the tens-of-nanometers resolution required to resolve the magnetic domain properties known as spin textures.

    LBNL National Center for Electron Microscopy (NCEM)

    “This high spatial resolution at this instrument allowed us to see the chirality in the domain walls – and we looked through the whole stack of materials,” said Peter Fischer, a co-leader of the study and a senior staff scientist in the Lab’s Materials Sciences Division.

    In these rows of sequenced images, produced using X-ray-based techniques, the first column shows the demagnetized state of a multilayer material containing gadolinium and cobalt; the second column shows the residual magnetism in the same samples after an external, positive magnetic field was applied and then removed; and the last column shows the samples when a negative magnetic field is applied. The white arrows in the third row of images indicate gadolinium-rich regions in the material. (Credit: Berkeley Lab)

    Fischer noted that the increasingly precise, high-resolution experimental techniques – which use electron beams and X-rays, for example – now allow scientists to explore complex materials that lack a well-defined structure.

    “We are now looking with new kinds of probes,” he said, that are drilling down to ever-smaller scales. “Novel properties and discoveries can quite often occur at materials’ interfaces, which is why we ask: What happens when you put one layer next to another? And how does that impact the spin textures, which are a material’s magnetic landscapes of spin orientations?”

    The ultimate research tool, Fischer said, which is on the horizon with the next-generation of electron and X-ray probes, would provide scientists the capability to see directly, at atomic resolution, the magnetic switching occurring in a material’s interfaces at femtosecond (quadrillionths of a second) timescales.

    “Our next step is therefore to go into the dynamics of the chirality of these domain walls in an amorphous system: to image these domain walls while they’re moving, and to see how atoms are assembled together,” he said.

    Streubel added, “It was really a profound study in almost every aspect that was needed. Every piece by itself posed challenges.” The Lorentz microscopy results were fed into a mathematical algorithm, customized by Streubel, to identify domain wall types and chirality. Another challenge was in optimizing the sample growth to achieve the chiral effects using a conventional technique known as sputtering.

    The algorithm, and the experimental techniques, can now be applied to a whole set of sample materials in future studies, and “should be generalizable to different materials for different purposes,” he said.
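
    Streubel’s algorithm itself is not spelled out in the article, but the core idea of extracting a wall’s chirality from vector-magnetization data can be illustrated with a short sketch. The following is a minimal, hypothetical Python example that estimates the rotation sense of in-plane magnetization sampled along a line crossing a domain wall; the function name and the synthetic profile are illustrative assumptions, not the team’s code.

```python
import numpy as np

def wall_chirality(mx, my):
    """Estimate the rotation sense (chirality) of in-plane magnetization
    sampled along a line that crosses a domain wall.
    Returns +1 (counterclockwise), -1 (clockwise) or 0 (ambiguous)."""
    m = np.stack([mx, my], axis=1).astype(float)
    m /= np.linalg.norm(m, axis=1, keepdims=True)          # unit vectors
    # The z-component of the cross product between neighboring spins gives
    # the local rotation sense; its sum gives the net sense across the wall.
    cross_z = m[:-1, 0] * m[1:, 1] - m[:-1, 1] * m[1:, 0]
    return int(np.sign(cross_z.sum()))

# Toy profile: spins rotating counterclockwise from +x through +y to -x.
theta = np.linspace(0, np.pi, 50)
print(wall_chirality(np.cos(theta), np.sin(theta)))        # prints 1
```

    Applied pixel row by pixel row across a Lorentz-microscopy vector map, a rule of this kind would label each wall segment as left- or right-handed, which is the kind of classification the study needed at scale.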

    The research team also hopes that their work may help drive R&D related to spin orbitronics, where “topologically protected” (stable and resilient) spin textures called skyrmions could potentially replace the propagation of tiny domain walls in a material and lead to smaller and faster computing devices with lower power consumption than conventional devices.

    The Molecular Foundry and the ALS are DOE Office of Science User Facilities. This work was supported by the U.S. Department of Energy’s Office of Science.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 1:48 pm on May 30, 2018 Permalink | Reply
    Tags: , LBNL, , OLCF Titan supercomputer, , Supercomputers Provide New Window Into the Life and Death of a Neutron,   

    From Lawrence Berkeley National Lab: “Supercomputers Provide New Window Into the Life and Death of a Neutron” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    May 30, 2018
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Berkeley Lab-led research team simulates sliver of the universe to tackle subatomic-scale physics problem.

    1
    In this illustration, the grid in the background represents the computational lattice that theoretical physicists used to calculate a particle property known as nucleon axial coupling. This property determines how a W boson (white wavy line) interacts with one of the quarks in a neutron (large transparent sphere in foreground), emitting an electron (large arrow) and antineutrino (dotted arrow) in a process called beta decay. This process transforms the neutron into a proton (distant transparent sphere). (Credit: Evan Berkowitz/Jülich Research Center, Lawrence Livermore National Laboratory)

    Experiments that measure the lifetime of neutrons reveal a perplexing and unresolved discrepancy. While this lifetime has been measured to a precision within 1 percent using different techniques, apparent conflicts in the measurements offer the exciting possibility of learning about as-yet undiscovered physics.

    Now, a team led by scientists in the Nuclear Science Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has enlisted powerful supercomputers to calculate a quantity known as the “nucleon axial coupling,” or gA – which is central to our understanding of a neutron’s lifetime – with unprecedented precision. Their method offers a clear path to further improvements that may help to resolve the experimental discrepancy.

    To achieve their results, the researchers created a microscopic slice of a simulated universe to provide a window into the subatomic world. Their study was published online May 30 in the journal Nature.

    The nucleon axial coupling is more exactly defined as the strength at which one component (known as the axial component) of the “weak current” of the Standard Model of particle physics couples to the neutron. The weak current arises from the weak interaction, one of the four known fundamental forces of the universe, and is responsible for radioactive beta decay – the process by which a neutron decays to a proton, an electron, and a neutrino.
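
    To see why gA is so central to the neutron lifetime, it helps to recall the standard leading-order relation from beta-decay theory (radiative and recoil corrections omitted for clarity); this is textbook material rather than a result of the new study:

```latex
\frac{1}{\tau_n} \;\propto\; G_F^{2}\,|V_{ud}|^{2}\,\bigl(1 + 3\,g_A^{2}\bigr)\, f
```

    Here G_F is the Fermi constant, V_ud is a quark-mixing (CKM) matrix element, and f is a phase-space factor. Because gA enters squared and multiplied by three, even a modest improvement in its theoretical precision tightens the predicted lifetime considerably.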

    In addition to measurements of the neutron lifetime, precise measurements of neutron beta decay are also used to probe new physics beyond the Standard Model. Nuclear physicists seek to resolve the lifetime discrepancy and to complement experimental results by determining gA more precisely.

    The researchers turned to quantum chromodynamics (QCD), a cornerstone of the Standard Model that describes how quarks and gluons interact with each other. Quarks and gluons are the fundamental building blocks for larger particles, such as neutrons and protons. The dynamics of these interactions determine the mass of the neutron and proton, and also the value of gA.

    But sorting through QCD’s inherent complexity to produce these quantities requires the aid of massive supercomputers. In the latest study, researchers applied a numeric simulation known as lattice QCD, which represents QCD on a finite grid.

    A type of mirror-flip symmetry in particle interactions called parity (like swapping your right and left hands) is respected by the interactions of QCD, but the axial component of the weak current flips parity – and parity is not respected by nature as a whole (analogously, most of us are right-handed). Because nature breaks this symmetry, the value of gA can only be determined through experimental measurements or through theoretical predictions with lattice QCD.

    The team’s new theoretical determination of gA is based on a simulation of a tiny piece of the universe – the size of a few neutrons in each direction. They simulated a neutron transitioning to a proton inside this tiny section of the universe, in order to predict what happens in nature.

    The model universe contains one neutron amid a sea of quark-antiquark pairs that are bustling under the surface of the apparent emptiness of free space.

    2
    André Walker-Loud, a staff scientist at Berkeley Lab, led the study that calculated a property central to understanding the lifetime of neutrons. (Credit: Marilyn Chung/Berkeley Lab)

    “Calculating gA was supposed to be one of the simple benchmark calculations that could be used to demonstrate that lattice QCD can be utilized for basic nuclear physics research, and for precision tests that look for new physics in nuclear physics backgrounds,” said André Walker-Loud, a staff scientist in Berkeley Lab’s Nuclear Science Division who led the new study. “It turned out to be an exceptionally difficult quantity to determine.”

    This is because lattice QCD calculations are complicated by exceptionally noisy statistical results that had thwarted major progress in reducing uncertainties in previous gA calculations. Some researchers had previously estimated that it would require the next generation of the nation’s most advanced supercomputers to achieve a 2 percent precision for gA by around 2020.

    The team participating in the latest study developed a way to improve their calculations of gA using an unconventional approach and supercomputers at Oak Ridge National Laboratory (Oak Ridge Lab) and Lawrence Livermore National Laboratory (Livermore Lab), including Livermore Lab’s Vulcan IBM Blue Gene/Q system.

    LLNL Vulcan IBM Blue Gene/Q system supercomputer

    The study involved scientists from more than a dozen institutions, including researchers from UC Berkeley and several other Department of Energy national labs.

    Chia Cheng “Jason” Chang, the lead author of the publication and a postdoctoral researcher in Berkeley Lab’s Nuclear Science Division for the duration of this work, said, “Past calculations were all performed amidst this more noisy environment,” which clouded the results they were seeking. Chang has also joined the Interdisciplinary Theoretical and Mathematical Sciences Program at RIKEN in Japan as a research scientist.

    Walker-Loud added, “We found a way to extract gA earlier in time, before the noise ‘explodes’ in your face.”
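
    The noise problem can be illustrated with a toy model. The snippet below – a simplified sketch, not the collaboration’s analysis code – builds a synthetic two-point correlator whose relative noise grows exponentially in Euclidean time, the behavior that plagues nucleon calculations, and shows why an early-time extraction is far more stable than a late-time one:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1, 25)               # Euclidean time slices
A, E = 1.0, 0.5                    # toy amplitude and energy
n_samples = 500                    # stand-in for gauge-field configurations

signal = A * np.exp(-E * t)
# Nucleon correlators lose signal-to-noise roughly exponentially in time;
# model that by letting the relative noise grow with t.
rel_noise = 0.02 * np.exp(0.15 * t)
samples = signal * (1.0 + rel_noise * rng.standard_normal((n_samples, t.size)))
corr = samples.mean(axis=0)

# Effective energy from the log-ratio of neighboring time slices.
m_eff = np.log(corr[:-1] / corr[1:])
print("early-time m_eff:", np.round(m_eff[1:5], 3))   # close to the true E = 0.5
print("late-time  m_eff:", np.round(m_eff[-5:], 3))   # visibly noisier
```

    In the real calculation the samples are measurements on gauge-field configurations and the fitting strategy is far more sophisticated, but the qualitative lesson is the same: the usable signal lives at early times.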

    Chang said, “We now have a purely theoretical prediction of the lifetime of the neutron, and it is the first time we can predict the lifetime of the neutron to be consistent with experiments.”

    “This was an intense 2 1/2-year project that only came together because of the great team of people working on it,” Walker-Loud said.

    This latest calculation also places tighter constraints on a branch of physics theories that stretch beyond the Standard Model – constraints that exceed those set by powerful particle collider experiments at CERN’s Large Hadron Collider. But the calculations aren’t yet precise enough to determine whether new physics has been hiding in the gA and neutron lifetime measurements.

    Chang and Walker-Loud noted that the main limitation to improving the precision of their calculations is the availability of more computing power.

    “We don’t have to change the technique we’re using to get the precision necessary,” Walker-Loud said.

    The latest work builds upon decades of research and computational resources by the lattice QCD community. In particular, the research team relied upon QCD data generated by the MILC Collaboration; an open source software library for lattice QCD called Chroma, developed by the USQCD collaboration; and QUDA, a highly optimized open source software library for lattice QCD calculations.

    ORNL Cray Titan XK7 Supercomputer

    The team drew heavily upon the power of Titan, a supercomputer at Oak Ridge Lab equipped with graphics processing units, or GPUs, in addition to more conventional central processing units, or CPUs. GPUs have evolved from their early use in accelerating video game graphics to current applications in evaluating large arrays for tackling complicated algorithms pertinent to many fields of science.

    The axial coupling calculations used about 184 million “Titan hours” of computing power – it would take a single laptop computer with a large memory about 600,000 years to complete the same calculations.
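
    The laptop comparison can be sanity-checked with simple arithmetic. The speedup factor below is an assumed, illustrative number (the article does not state one), so this is only an order-of-magnitude check:

```python
# Rough check of the "about 600,000 laptop-years" figure.
titan_node_hours = 184e6          # computing time quoted in the article
node_vs_laptop_speedup = 30       # assumption: one GPU-accelerated Titan node ~ 30 laptops
hours_per_year = 24 * 365

laptop_years = titan_node_hours * node_vs_laptop_speedup / hours_per_year
print(f"{laptop_years:,.0f} laptop-years")   # ~630,000 - the same order of magnitude
```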

    As the researchers worked through their analysis of this massive set of numerical data, they realized that more refinements were needed to reduce the uncertainty in their calculations.

    Staff at the Oak Ridge Leadership Computing Facility helped the team use its 64 million Titan-hour allocation efficiently, and the researchers also turned to the Multiprogrammatic and Institutional Computing program at Livermore Lab, which gave them more computing time to resolve their calculations and reduce their uncertainty margin to just under 1 percent.

    “Establishing a new way to calculate gA has been a huge rollercoaster,” Walker-Loud said.

    With more statistics from more powerful supercomputers, the research team hopes to drive the uncertainty margin down to about 0.3 percent. “That’s where we can actually begin to discriminate between the results from the two different experimental methods of measuring the neutron lifetime,” Chang said. “That’s always the most exciting part: When the theory has something to say about the experiment.”

    He added, “With improvements, we hope that we can calculate things that are difficult or even impossible to measure in experiments.”

    Already, the team has applied for time on a next-generation supercomputer at Oak Ridge Lab called Summit, which would greatly speed up the calculations.

    ORNL IBM Summit supercomputer depiction

    In addition to researchers at Berkeley Lab and UC Berkeley, the science team also included researchers from University of North Carolina, RIKEN BNL Research Center at Brookhaven National Laboratory, Lawrence Livermore National Laboratory, the Jülich Research Center in Germany, the University of Liverpool in the U.K., the College of William & Mary, Rutgers University, the University of Washington, the University of Glasgow in the U.K., NVIDIA Corp., and Thomas Jefferson National Accelerator Facility.

    One of the study participants is a scientist at the National Energy Research Scientific Computing Center (NERSC).

    NERSC

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer



    The Titan supercomputer is a part of the Oak Ridge Leadership Computing Facility (OLCF). NERSC and OLCF are DOE Office of Science User Facilities.

    The work was supported by Laboratory Directed Research and Development programs at Berkeley Lab, the U.S. Department of Energy’s Office of Science, the Nuclear Physics Double Beta Decay Topical Collaboration, the DOE Early Career Award Program, the NVIDIA Corporation, the Joint Sino-German Research Projects of the German Research Foundation and National Natural Science Foundation of China, RIKEN in Japan, the Leverhulme Trust, the National Science Foundation’s Kavli Institute for Theoretical Physics, DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, and the Lawrence Livermore National Laboratory Multiprogrammatic and Institutional Computing program through a Tier 1 Grand Challenge award.

    See the full article here .



    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 9:26 pm on May 29, 2018 Permalink | Reply
    Tags: , LBNL,   

    From Lawrence Berkeley National Lab: “New Machine Learning Approach Could Accelerate Bioengineering” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    May 29, 2018
    Dan Krotz
    dakrotz@lbl.gov
    510-486-4019

    1
    A new approach developed by Zak Costello (left) and Hector Garcia Martin brings the speed and analytic power of machine learning to bioengineering. (Credit: Marilyn Chung, Berkeley Lab)

    Scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a way to use machine learning to dramatically accelerate the design of microbes that produce biofuel.

    Their computer algorithm starts with abundant data about the proteins and metabolites in a biofuel-producing microbial pathway, but no information about how the pathway actually works. It then uses data from previous experiments to learn how the pathway will behave. The scientists used the technique to automatically predict the amount of biofuel produced by pathways that have been added to E. coli bacterial cells.

    The new approach is much faster than the current way to predict the behavior of pathways, and promises to speed up the development of biomolecules for many applications in addition to commercially viable biofuels, such as drugs that fight antibiotic-resistant infections and crops that withstand drought.

    The research was published May 29 in the journal npj Systems Biology and Applications.

    In biology, a pathway is a series of chemical reactions in a cell that produce a specific compound. Researchers are exploring ways to re-engineer pathways, and import them from one microbe to another, to harness nature’s toolkit to improve medicine, energy, manufacturing, and agriculture. And thanks to new synthetic biology capabilities, such as the gene-editing tool CRISPR-Cas9, scientists can conduct this research at a precision like never before.

    “But there’s a significant bottleneck in the development process,” said Hector Garcia Martin, group lead at the DOE Agile BioFoundry and director of Quantitative Metabolic Modeling at the Joint BioEnergy Institute (JBEI), a DOE Bioenergy Research Center funded by DOE’s Office of Science and led by Berkeley Lab. The research was performed by Zak Costello (also with the Agile BioFoundry and JBEI) under the direction of Garcia Martin. Both researchers are also in Berkeley Lab’s Biological Systems and Engineering Division.

    “It’s very difficult to predict how a pathway will behave when it’s re-engineered. Troubleshooting takes up 99 percent of our time. Our approach could significantly shorten this step and become a new way to guide bioengineering efforts,” Garcia Martin added.

    The current way to predict a pathway’s dynamics requires a maze of differential equations that describe how the components in the system change over time. Subject-area experts develop these “kinetic models” over several months, and the resulting predictions don’t always match experimental results.
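
    As a concrete, if highly simplified, illustration of what such a kinetic model looks like, the sketch below encodes a hypothetical two-step pathway (substrate to intermediate to product) as differential equations with Michaelis-Menten rate laws and integrates them with SciPy. Real JBEI models involve many more species and laboriously tuned parameters; everything here, including the parameter values, is invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def pathway(t, y, vmax1, km1, vmax2, km2):
    """Toy two-enzyme pathway: substrate S -> intermediate I -> product P."""
    S, I, P = y
    r1 = vmax1 * S / (km1 + S)   # enzyme 1 converts S to I (Michaelis-Menten)
    r2 = vmax2 * I / (km2 + I)   # enzyme 2 converts I to P
    return [-r1, r1 - r2, r2]

sol = solve_ivp(pathway, t_span=(0, 50), y0=[10.0, 0.0, 0.0],
                args=(1.0, 2.0, 0.6, 1.0), t_eval=np.linspace(0, 50, 100))
print("final product concentration:", round(sol.y[2, -1], 2))
```

    Building and hand-tuning dozens of such coupled equations for a real pathway is what takes subject-area experts months, which is the bottleneck the machine learning approach aims to bypass.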

    Machine learning, however, uses data to train a computer algorithm to make predictions. The algorithm learns a system’s behavior by analyzing data from related systems. This allows scientists to quickly predict the function of a pathway even if its mechanisms are poorly understood — as long as there are enough data to work with.


    Machine learning approaches, such as the technique recently developed by Berkeley Lab scientists, are hamstrung by a lack of large quantities of quality data. New automation capabilities at JBEI and the Agile BioFoundry will be able to produce these data in a systematic fashion. This video shows a liquid handler coupled with an automated fermentation platform at JBEI, which takes samples automatically to produce data for the machine learning algorithms.

    The scientists tested their technique on pathways added to E. coli cells. One pathway is designed to produce a bio-based jet fuel called limonene; the other produces a gasoline replacement called isopentenol. Previous experiments at JBEI yielded a trove of data related to how different versions of the pathways function in various E. coli strains. Some of the strains have a pathway that produces small amounts of either limonene or isopentenol, while other strains have a version that produces large amounts of the biofuels.

    The researchers fed this data into their algorithm. Then machine learning took over: The algorithm taught itself how the concentrations of metabolites in these pathways change over time, and how much biofuel the pathways produce. It learned these dynamics by analyzing data from the two experimentally known pathways that produce small and large amounts of biofuels.

    The algorithm used this knowledge to predict the behavior of a third set of “mystery” pathways the algorithm had never seen before. It accurately predicted the biofuel-production profiles for the mystery pathways, including that the pathways produce a medium amount of fuel. In addition, the machine learning-derived prediction outperformed kinetic models.
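
    The team’s own algorithm is described in the paper; purely as an illustration of the train-on-extremes, predict-the-middle idea, here is a minimal sketch using scikit-learn on synthetic stand-in data. The feature construction, the model choice, and all numbers are assumptions made for the example, not the study’s method.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

def simulate_strain(level, n=20):
    """Synthetic stand-in for a strain's metabolite/protein time series;
    'level' plays the role of the strain's final biofuel production."""
    t = np.linspace(0, 1, n)
    metabolite = level * (1 - np.exp(-3 * t)) + 0.05 * rng.standard_normal(n)
    protein = level * t + 0.05 * rng.standard_normal(n)
    return np.concatenate([metabolite, protein])

# Training data: strains known to make small or large amounts of fuel.
levels = [0.2, 0.3, 1.7, 1.9] * 25
X = np.array([simulate_strain(lv) for lv in levels])
model = LinearRegression().fit(X, np.array(levels))

# A "mystery" strain with intermediate behavior, never seen in training.
x_new = simulate_strain(1.0)
print("predicted production:", round(model.predict(x_new.reshape(1, -1))[0], 2))
# typically close to 1.0 for this synthetic data
```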

    “And the more data we added, the more accurate the predictions became,” said Garcia Martin. “This approach could expedite the time it takes to design new biomolecules. A project that today takes ten years and a team of experts could someday be handled by a summer student.”

    The work was part of the DOE Agile BioFoundry, supported by DOE’s Office of Energy Efficiency and Renewable Energy, and the Joint BioEnergy Institute, supported by DOE’s Office of Science.

    See the full article here .



    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 12:35 pm on May 28, 2018 Permalink | Reply
    Tags: , , Graphene Layered with Magnetic Materials Could Drive Ultrathin Spintronics, , LBNL, ,   

    From Lawrence Berkeley National Lab: “Graphene Layered with Magnetic Materials Could Drive Ultrathin Spintronics” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    May 28, 2018
    Glenn Roberts Jr.
    GERoberts@lbl.gov
    (510) 486-5582

    Measurements at Berkeley Lab’s Molecular Foundry reveal exotic spin properties that could lead to new form of data storage.

    1
    Andreas Schmid, left, and Gong Chen are pictured here with the spin-polarized low-energy electron microscopy (SPLEEM) instrument at Berkeley Lab. The instrument was integral to measurements of ultrathin samples that included graphene and magnetic materials. (Credit: Roy Kaltschmidt/Berkeley Lab)

    Researchers working at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) coupled graphene, a monolayer form of carbon, with thin layers of magnetic materials like cobalt and nickel to produce exotic behavior in electrons that could be useful for next-generation computing applications.

    The work was performed in collaboration with French scientists including Nobel Laureate Albert Fert, an emeritus professor at Paris-Sud University and scientific director for a research laboratory in France. The team performed key measurements at Berkeley Lab’s Molecular Foundry, a DOE Office of Science User Facility focused on nanoscience research.

    LBNL Molecular Foundry – No image credits found

    Fert shared the Nobel Prize in Physics in 2007 for his work in understanding a magnetic effect in multilayer materials that led to new technology for reading data in hard drives, for example, and gave rise to a new field studying how to exploit and control a fundamental property known as “spin” in electrons to drive a new type of low-energy, high-speed computer memory and logic technology known as spintronics.

    2
    A view from above (top) and the side (bottom) of materials composed of a layer of graphene (top) with cobalt (bottom left) and with nickel (bottom right). The spin configurations are represented by arrows. (Credit: Nature Materials, May 28, 2018; DOI: 10.1038/s41563-018-0079-4)

    In this latest work, published online May 28 in the journal Nature Materials, the research team showed how that spin property – analogous to a compass needle that can be tuned to face either north or south – is affected by the interaction of graphene with the magnetic layers.

    The researchers found that the material’s electronic and magnetic properties create tiny swirling patterns where the layers meet, and this effect gives scientists hope for controlling the direction of these swirls and tapping this effect for a form of spintronics applications known as “spin-orbitronics” in ultrathin materials. The ultimate goal is to quickly and efficiently store and manipulate data at very small scales, and without the heat buildup that is a common hiccup for miniaturizing computing devices.

    Typically, researchers working to produce this behavior for electrons in materials have coupled heavy and expensive metals like platinum and tantalum with magnetic materials to achieve such effects, but graphene offers a potentially revolutionary alternative since it is ultrathin, lightweight, has very high electrical conductivity, and can also serve as a protective layer for corrosion-prone magnetic materials.

    “You could think about replacing computer hard disks with all solid state devices – no moving parts – using electrical signals alone,” said Andreas Schmid, a staff scientist at the Molecular Foundry who participated in the research. “Part of the goal is to get lower power-consumption and non-volatile data storage.”

    The latest research represents an early step toward this goal, Schmid noted, and a next step is to control nanoscale magnetic features, called skyrmions, which can exhibit a property known as chirality that makes them swirl in either a clockwise or counterclockwise direction.

    In more conventional layered materials, electrons traveling through the materials can act like an “electron wind” that changes magnetic structures like a pile of leaves blown by a strong wind, Schmid said.

    But with the new graphene-layered material, its strong electron spin effects can drive magnetic textures of opposite chirality in different directions as a result of the “spin Hall effect,” which explains how electrical currents can affect spin and vice versa. If that chirality can be universally aligned across a material and flipped in a controlled way, researchers could use it to process data.

    “Calculations by other team members show that if you take different magnetic materials and graphene and build a multilayer stack of many repeating structures, then this phenomenon and effect could possibly be very powerfully amplified,” Schmid said.

    3
    In these images developed using the SPLEEM instrument at Berkeley Lab, the orientation of magnetization in samples containing cobalt (Co) and ruthenium (Ru) is represented with white arrows. The image at left shows how the orientation is altered when a layer of graphene (“Gr”) is added. The scale bar at the lower right of both images is 1 micron, or one millionth of a meter. (Credit: Berkeley Lab)

    To measure the layered material, scientists applied spin-polarized low-energy electron microscopy (SPLEEM) using an instrument at the Molecular Foundry’s National Center for Electron Microscopy. It is one of just a handful of specialized devices around the world that allow scientists to combine different images to map the orientations of a sample’s 3-D magnetization profile (or vector), revealing its “spin textures.”
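
    In SPLEEM, images are recorded with the probing beam’s spin polarization set along three orthogonal directions, and the per-pixel contrast serves as the corresponding component of the local magnetization. The sketch below shows, with made-up arrays standing in for real asymmetry images, how such a triplet can be combined into a vector map; it illustrates the principle only and is not the Foundry’s analysis software.

```python
import numpy as np

rng = np.random.default_rng(0)
ny, nx = 64, 64
# Hypothetical asymmetry images for spin polarization along x, y, and z.
A_x = rng.normal(0.2, 0.05, (ny, nx))
A_y = rng.normal(-0.1, 0.05, (ny, nx))
A_z = rng.normal(0.0, 0.05, (ny, nx))

m = np.stack([A_x, A_y, A_z], axis=-1)
m /= np.linalg.norm(m, axis=-1, keepdims=True)   # unit magnetization vectors per pixel
in_plane_angle = np.degrees(np.arctan2(m[..., 1], m[..., 0]))
print("mean in-plane magnetization angle (deg):", round(in_plane_angle.mean(), 1))
```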

    The research team also created the samples using the same SPLEEM instrument through a precise process known as molecular beam epitaxy, and separately studied the samples using other forms of electron-beam probing techniques.

    Gong Chen, a co-lead author who participated in the study as a postdoctoral researcher at the Molecular Foundry and is now an assistant project scientist in the UC Davis Physics Department, said the collaboration sprang out of a discussion with French scientists at a conference in 2016 – both groups had independently been working on similar research and realized the synergy of working together.

    While the effects that researchers have now observed in the latest experiments had been discussed decades ago in previous journal articles, Chen noted that the idea of using an atomically thin material like graphene in place of heavy elements to generate those effects is new.

    “It has only recently become a hot topic,” Chen said. “This effect in thin films had been ignored for a long time. This type of multilayer stacking is really stable and robust.”

    Using skyrmions could be revolutionary for data processing, he said, because information can potentially be stored at much higher densities than is possible with conventional technologies, and with much lower power usage.

    Molecular Foundry researchers are now working to form the graphene-magnetic multilayer material on an insulator or semiconductor to bring it closer to potential applications, Schmid said.

    Researchers from Grenoble Alps University; Paris-Sud University; a joint center that includes the French National Center for Scientific Research, Thales Physics Lab, Paris-Sud University, and Paris-Saclay University in France; the University of California, Davis; the Chinese Academy of Sciences; and the Nuclear Technology Development Center (CDTN), the Federal University of Minas Gerais, and the Federal University of Lavras in Brazil participated in the study.

    The work was supported by the U.S. Department of Energy Office of Science; the European Union’s Horizon 2020 Research and Innovation Program; the U.S. National Science Foundation; the University of California Office of the President Multicampus Research Programs and Initiatives; Brazil’s CAPES, CNPq and FAPEMIG programs; and the 1000 Talents Program for Young Scientists of China and Ningbo Program.

    See the full article here .



    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     