Tagged: Biotechnology

  • richardmitnick 9:59 pm on January 15, 2022
    Tags: "Scientists use Summit supercomputer and deep learning to predict protein functions at genome scale", A grand challenge in biology: translating genetic code into meaningful functions., , “Structure determines function” is the adage when it comes to proteins., , Biotechnology, Deep learning is shifting the paradigm by quickly narrowing the vast field of candidate genes to the most interesting few for further study., , Geneticists are now dealing with the amount of data that astrophysicists deal with., High-performance computing is necessary to take that sequencing data and come up with useful inferences to narrow the field for experiments., One of the tools in the deep learning pipeline is called Sequence Alignments from deep-Learning of Structural Alignments or SAdLSA., , The research team is focusing on organisms critical to DOE missions., With advances in DNA sequencing technology data are available for about 350 million protein sequences — a number that continues to climb.   

    From DOE’s Oak Ridge National Laboratory (US): “Scientists use Summit supercomputer and deep learning to predict protein functions at genome scale” 

    January 10, 2022

    Kimberly A Askey
    askeyka@ornl.gov
    865.576.2841

    This protein drives key processes for sulfide use in many microorganisms that produce methane, including Thermosipho melanesiensis. Researchers used supercomputing and deep learning tools to predict its structure, which has eluded experimental methods such as crystallography. Credit: Ada Sedova/ORNL, Department of Energy (US).

    A team of scientists led by the Department of Energy’s Oak Ridge National Laboratory and The Georgia Institute of Technology (US) is using supercomputing and revolutionary deep learning tools to predict the structures and roles of thousands of proteins with unknown functions.

    Their deep learning-driven approaches infer protein structure and function from DNA sequences, accelerating new discoveries that could inform advances in biotechnology, biosecurity, bioenergy and solutions for environmental pollution and climate change.

    Researchers are using the Summit supercomputer [below] at ORNL and tools developed by Google’s DeepMind and Georgia Tech to speed the accurate identification of protein structures and functions across the entire genomes of organisms. The team recently published [IEEE Xplore] details of the high-performance computing toolkit and its deployment on Summit.

    These powerful computational tools are a significant leap toward resolving a grand challenge in biology: translating genetic code into meaningful functions.

    Proteins are a key component of solving this challenge. They are also central to resolving many scientific questions about the health of humans, ecosystems and the planet. As the workhorses of the cell, proteins drive nearly every process necessary for life — from metabolism to immune defense to communication between cells.

    “Structure determines function” is the adage when it comes to proteins; their complex 3D shapes guide how they interact with other proteins to do the work of the cell. Understanding a protein’s structure and function based on lengthy strings of nucleotides — written as the letters A, C, T and G — that make up DNA has long been a bottleneck in the life sciences as researchers relied on educated guesses and painstaking laboratory experiments to validate structures.

    With advances in DNA sequencing technology, data are available for about 350 million protein sequences — a number that continues to climb. Because of the extensive experimental work needed to determine three-dimensional structures, scientists have solved the structures of only about 170,000 of those proteins. This is a tremendous gap.

    “We’re now dealing with the amount of data that astrophysicists deal with, all because of the genome sequencing revolution,” said ORNL researcher Ada Sedova. “We want to be able to use high-performance computing to take that sequencing data and come up with useful inferences to narrow the field for experiments. We want to quickly answer questions such as ‘what does this protein do, and how does it affect the cell? How can we harness proteins to achieve goals such as making needed chemicals, medicines and sustainable fuels, or to engineer organisms that can help mitigate the effects of climate change?’”

    The research team is focusing on organisms critical to DOE missions. They have modeled the full proteomes — all the proteins coded in an organism’s genome — for four microbes, each with approximately 5,000 proteins. Two of these microbes have been found to generate important materials for manufacturing plastics. The other two are known to break down and transform metals. The structural data can inform new advances in synthetic biology and strategies to reduce the spread of contaminants such as mercury in the environment.

    The team also generated models of the 24,000 proteins at work in sphagnum moss. Sphagnum plays a critical role in storing vast amounts of carbon in peat bogs, which hold more carbon than all the world’s forests. These data can help scientists pinpoint which genes are most important in enhancing sphagnum’s ability to sequester carbon and withstand climate change.

    Speeding scientific discovery

    In search of the genes that enable sphagnum moss to tolerate rising temperatures, ORNL scientists start by comparing its DNA sequences to those of the model organism Arabidopsis, a thoroughly investigated plant species in the mustard family.

    “Sphagnum moss is about 515 million years diverged from that model,” said Bryan Piatkowski, a biologist and ORNL Liane B. Russell Fellow. “Even for plants more closely related to Arabidopsis, we don’t have a lot of empirical evidence for how these proteins behave. There is only so much we can infer about function from comparing the nucleotide sequences with the model.”

    Being able to see the structures of proteins adds another layer that can help scientists home in on the most promising gene candidates for experiments.

    Piatkowski, for instance, has been studying moss populations from Maine to Florida with the aim of identifying differences in their genes that could be adaptive to climate. He has a long list of genes that might regulate heat tolerance. Some of the gene sequences are only different by one nucleotide, or in the language of the genetic code, by a single letter.

    “These protein structures will help us look for whether these nucleotide changes cause changes to the protein function and if so, how? Do those protein changes end up helping plants survive in extreme temperatures?” Piatkowski said.

    Looking for similarities in sequences to determine function is only part of the challenge. DNA sequences are translated into the amino acids that make up proteins. Through evolution, some of the sequences can mutate over time, replacing one amino acid with another that has similar properties. These changes do not always cause differences in function.

    “You could have proteins with very different sequences — less than 20% sequence match — and get the same structure and possibly the same function,” Sedova said. “Computational tools that only compare sequences can fail to find two proteins with very similar structures.”
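
    To make that point concrete, here is a minimal, self-contained sketch (with made-up sequence fragments, not real proteins) of the percent-identity score a purely sequence-based comparison reports for two already-aligned amino-acid sequences. Tools that rely only on such scores can miss structurally similar proteins whose identity falls this low.

```python
# Minimal sketch with hypothetical sequence fragments: percent identity that a
# sequence-only comparison would report for two already-aligned amino-acid
# sequences ("-" marks an alignment gap).

def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percentage of aligned (non-gap) positions with identical residues."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    if not pairs:
        return 0.0
    matches = sum(a == b for a, b in pairs)
    return 100.0 * matches / len(pairs)

# Two short, made-up aligned fragments: most positions differ, so the
# sequence-level score is low (about 23%) even though the folded structures
# of two such proteins could still be very similar.
print(percent_identity("MKTAYIAKQR-LDS", "MRSPFLVKEGALTN"))
```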

    Until recently, scientists have not had tools that can reliably predict protein structure based on genetic sequences. Applying these new deep learning tools is a game changer.

    Though protein structure and function will still need confirmation via physical experiments and methods such as X-ray crystallography, deep learning is shifting the paradigm by quickly narrowing the vast field of candidate genes to the most interesting few for further study.

    Revolutionary tools

    One of the tools in the deep learning pipeline is called Sequence Alignments from deep-Learning of Structural Alignments, or SAdLSA. Developed by collaborators Mu Gao and Jeffrey Skolnick at Georgia Tech, the computational tool is trained in a similar way to other deep learning models that predict protein structure. SAdLSA can compare sequences by implicitly understanding the protein structure, even if the sequences share only 10% similarity.

    “SAdLSA can detect distantly related proteins that may or may not have the same function,” said Jerry Parks, ORNL computational chemist and group leader. “Combine that with AlphaFold, which provides a 3D structural model of the protein, and you can analyze the active site to determine which amino acids are doing the chemistry and how they contribute to the function.”

    DeepMind’s tool, AlphaFold 2, demonstrated accuracy approaching that of X-ray crystallography in determining the structures of unknown proteins in the 2020 Critical Assessment of protein Structure Prediction, or CASP, competition. In this worldwide biennial experiment, organizers use unpublished protein structures that have been solved and validated to gauge the success of state-of-the-art software programs in predicting protein structure.

    AlphaFold 2 is the first and only program to achieve this level of accuracy since CASP began in 1994. As a bonus, it can also predict protein-protein interactions. This is important as proteins rarely work in isolation.

    “I’ve used AlphaFold to generate models of protein complexes, and it works phenomenally well,” Parks said. “It predicts not only the structure of the individual proteins but also how they interact with each other.”

    With AlphaFold’s success, the European Bioinformatics Institute, or EBI, has partnered with DeepMind to model over 100 million proteins — starting with model organisms and those with applications for medicine and human health.

    ORNL researchers and their collaborators are complementing EBI’s efforts by focusing on organisms that are critical to DOE missions. They are working to make the toolkit available to other users on Summit and to share the thousands of protein structures they’ve modeled as downloadable datasets to facilitate science.

    “This is a technology that is difficult for many research groups to just spin up,” Sedova said. “We hope to make it more accessible now that we’ve formatted it for Summit.”

    Using AlphaFold 2, with its many software modules and 1.5 terabyte database, requires significant amounts of memory and many powerful parallel processing units. Running it on Summit was a multi-step process that required a team of experts at the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility.

    ORNL’s Ryan Prout, Subil Abraham, Nicholas Quentin Haas, Wael Elwasif and Mark Coletti were critical to the implementation process, which relied in part on Singularity, a container technology originally developed at DOE’s Lawrence Berkeley National Laboratory (US). Mu Gao contributed by deconstructing DeepMind’s AlphaFold 2 workflow so it could make efficient use of the OLCF resources, including Summit and the Andes system.
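
    For readers unfamiliar with containerized HPC workflows, here is a minimal sketch of how a structure-prediction job might be launched through a Singularity container from Python. The image name, entrypoint script, and its arguments are hypothetical placeholders, not the team's actual Summit deployment.

```python
# Minimal sketch (hypothetical names, not the actual Summit setup): launching a
# containerized structure-prediction job via Singularity from Python.
# "alphafold.sif" and "run_prediction.py" are placeholders.
import subprocess

def run_containerized_prediction(fasta_path: str, output_dir: str, db_dir: str) -> None:
    """Run one prediction inside a GPU-enabled Singularity container."""
    cmd = [
        "singularity", "exec",
        "--nv",                            # expose host NVIDIA GPUs inside the container
        "--bind", f"{db_dir}:/data",       # bind-mount the large reference database
        "alphafold.sif",                   # placeholder container image
        "python", "run_prediction.py",     # placeholder entrypoint inside the image
        "--fasta", fasta_path,
        "--out", output_dir,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    run_containerized_prediction("protein.fasta", "results/", "/scratch/databases")
```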

    The work will evolve as the tools change, including the advancement to exascale computing with the Frontier system being built at ORNL, expected to exceed a quintillion, or 10¹⁸, calculations per second.

    Depiction of ORNL Cray Frontier Shasta based Exascale supercomputer with Slingshot interconnect featuring high-performance AMD EPYC CPU and AMD Radeon Instinct GPU technology, being built at DOE’s Oak Ridge National Laboratory.

    Sedova is excited about the possibilities.

    “With these kinds of tools in our tool belt that are both structure-based and deep learning-based, this resource can help give us information about these proteins of unknown function — sequences that have no matches to other sequences in the entire repository of known proteins,” Sedova said. “This unlocks a lot of new knowledge and potential to address national priorities through bioengineering. For instance, there are potentially many enzymes with useful functions that have not yet been discovered.”

    The research team includes ORNL’s Ada Sedova and Jerry Parks; Georgia Tech’s Jeffrey Skolnick and Mu Gao; and Jianlin Cheng from The University of Missouri (US). Sedova virtually presented their work at the Machine Learning in HPC Environments workshop chaired by ORNL’s Seung-Hwan Lim as part of SC21, the International Conference for High Performance Computing, Networking, Storage and Analysis.

    The project is supported through the Biological and Environmental Research program in DOE’s Office of Science and through an award from the DOE Office of Advanced Scientific Computing Research’s Leadership Computing Challenge. Piatkowski’s research on sphagnum moss is supported through ORNL’s Laboratory Directed Research and Development funds.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition


    Established in 1943, DOE’s Oak Ridge National Laboratory (US) is the largest science and energy national laboratory in the Department of Energy system by size and the third largest by annual budget. It is located in the Roane County section of Oak Ridge, Tennessee. Its scientific programs focus on materials, neutron science, energy, high-performance computing, systems biology and national security, sometimes in partnership with the state of Tennessee, universities and industry.

    ORNL has several of the world’s top supercomputers, including Summit, ranked by the TOP500 as Earth’s second-most powerful.

    ORNL OLCF IBM AC922 Summit supercomputer, formerly No. 1 on the TOP500.

    The lab is a leading neutron and nuclear power research facility that includes the Spallation Neutron Source and High Flux Isotope Reactor.

    It hosts the Center for Nanophase Materials Sciences, the BioEnergy Science Center, and the Consortium for Advanced Simulation of Light Water Nuclear Reactors.

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

    Areas of research

    ORNL conducts research and development activities that span a wide range of scientific disciplines. Many research areas have a significant overlap with each other; researchers often work in two or more of the fields listed here. The laboratory’s major research areas are described briefly below.

    Chemical sciences – ORNL conducts both fundamental and applied research in a number of areas, including catalysis, surface science and interfacial chemistry; molecular transformations and fuel chemistry; heavy element chemistry and radioactive materials characterization; aqueous solution chemistry and geochemistry; mass spectrometry and laser spectroscopy; separations chemistry; materials chemistry including synthesis and characterization of polymers and other soft materials; chemical biosciences; and neutron science.
    Electron microscopy – ORNL’s electron microscopy program investigates key issues in condensed matter, materials, chemical and nanosciences.
    Nuclear medicine – The laboratory’s nuclear medicine research is focused on the development of improved reactor production and processing methods to provide medical radioisotopes, the development of new radionuclide generator systems, the design and evaluation of new radiopharmaceuticals for applications in nuclear medicine and oncology.
    Physics – Physics research at ORNL is focused primarily on studies of the fundamental properties of matter at the atomic, nuclear, and subnuclear levels and the development of experimental devices in support of these studies.
    Population – ORNL provides federal, state and international organizations with a gridded population database, called LandScan, for estimating ambient population. LandScan is a raster image, or grid, of population counts that provides human population estimates for every 30 x 30 arc-second cell, which translates roughly to 1-kilometer-square grid cells at the equator, with cell width decreasing at higher latitudes (a quick calculation of the cell size is sketched below). Though many population datasets exist, LandScan is regarded as the best spatial population dataset with global coverage. It is updated annually (although data releases generally lag the current year by one year), offering continuous, updated population values based on the most recent information. LandScan data are accessible through GIS applications and a USAID public domain application called Population Explorer.
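
    As a rough illustration of the grid geometry described above, the following sketch estimates the east-west width of a 30 x 30 arc-second cell at a few latitudes using a simple spherical-Earth approximation (an illustration only, not LandScan's own projection code).

```python
# Minimal sketch: approximate east-west width of one 30 x 30 arc-second
# LandScan grid cell at a given latitude, assuming a spherical Earth of mean
# radius ~6371 km.
import math

EARTH_RADIUS_KM = 6371.0
CELL_ARC_SECONDS = 30.0

def cell_width_km(latitude_deg: float) -> float:
    """East-west extent of one grid cell at the given latitude, in kilometers."""
    cell_degrees = CELL_ARC_SECONDS / 3600.0   # arc seconds -> degrees of longitude
    return math.radians(cell_degrees) * EARTH_RADIUS_KM * math.cos(math.radians(latitude_deg))

for lat in (0, 45, 60):
    print(f"{lat:>2} deg latitude: {cell_width_km(lat):.2f} km")
# Prints roughly 0.93 km at the equator, 0.66 km at 45 deg, 0.46 km at 60 deg.
```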

     
  • richardmitnick 11:09 am on June 28, 2021
    Tags: "One-of-a-Kind Course Aims to Build the Bioeconomy Workforce", , , Berkeley Lab’s Advanced Biofuels and Bioproducts Process Development Unit (ABPDU)., Bioeconomy: a rapidly growing sector of the global economy centered around reducing our dependence on fossil fuels., , Bioprocessing, Biotechnology, , , , We face a health crisis in the form of a global pandemic as well as an ongoing climate crisis that threatens our environment and ecosystem.   

    From DOE’s Lawrence Berkeley National Laboratory (US): “One-of-a-Kind Course Aims to Build the Bioeconomy Workforce” 

    June 28, 2021

    Media Relations
    media@lbl.gov
    (510) 486-5183

    By Emily Scott

    Tiffany Chen, a University of California-Berkeley (US) chemical engineering student, loads a sample into the AMBR 250 device as part of UC Berkeley’s “Advanced Bioprocess Engineering Laboratory” class, which introduces advanced concepts of bioprocessing to chemical engineering students, at Berkeley Lab’s Advanced Biofuels and Bioproducts Process Development Unit in Emeryville, California. Credit: Thor Swift/Berkeley Lab.

    Jason Ryder is the first to tell you that the world is facing several crises. Most notably, we face a health crisis in the form of a global pandemic as well as an ongoing climate crisis that threatens our environment and ecosystem. And as the world population continues to grow, we also face a food crisis as more and more people need access to better and more nutritious foods.

    But Ryder, an adjunct professor at the University of California-Berkeley (US) College of Chemistry, is an optimist. He knows we have the tools to solve these problems: biotechnology and bioprocessing — using the power of biology to generate sustainable, bio-based chemicals, fuels, materials, and food products. Scientists have harnessed this power by using microbes to produce almost anything imaginable, such as plant-based “meat” products or bio-derived clothing dye that replaces traditional petroleum-derived dye. Products like these are a key part of the bioeconomy, a rapidly growing sector of the global economy centered around reducing our dependence on fossil fuels.

    While the possibilities of biotechnology are staggering, the bioprocess industry is facing a surging demand for experienced people to develop and scale processes that bring these bio-based products to market.

    “There is a great need for bioprocess engineers in synthetic biology, biotechnology, and pharmaceutical companies and a dearth of trained bioprocess engineers,” said Jay Keasling, a senior faculty scientist at Lawrence Berkeley National Laboratory (Berkeley Lab) and professor in UC Berkeley’s College of Chemistry. “As such, companies must train them on the job, which is not ideal for the company or the individual.”

    That problem is the driving force behind UC Berkeley’s Master of Bioprocess Engineering (MBPE) program, which Ryder directs, as well as the Advanced Bioprocess Engineering Laboratory, a new capstone course in the program taught by Keasling that takes place at Berkeley Lab’s Advanced Biofuels and Bioproducts Process Development Unit (ABPDU). The course prepares students for careers in the biopharmaceutical, industrial biotech, or food tech industries by giving them much-needed hands-on experience with bioprocessing equipment.

    “As a bioprocess engineer, you need to be able to confidently say, ‘I understand living cells, the bio-based products they make, and how to build processes around them. I can design these systems, put them together with my own hands, clean and run them,’” Ryder said. “When you can say this, then you are ready to go do it.”

    The course arose from a discussion between Ryder and the College of Chemistry’s MBPE Industrial Advisory Board, made up of global leaders in the biotech and biopharmaceutical industries.

    “If you were to design a curriculum that would prepare someone to join your company,” he asked the board, “what would it look like?”

    The board identified five key pieces of bioprocessing equipment that are commonly found in the industrial biotech, food tech, and biopharmaceutical industries. These unit operations span the production of bio-based products via fermentation through their recovery, separations, and purification.

    Ryder then found a solution that got students’ hands on this equipment with the help of the ABPDU, a bioprocess scale-up facility at Berkeley Lab that collaborates with industry and academia to enable early-stage advanced biofuels and bioproducts, with funding support from the U.S. Department of Energy Bioenergy Technologies Office.

    The ABPDU, which houses these five key pieces of equipment, then developed a curriculum that eventually became UC Berkeley’s Advanced Bioprocess Engineering Laboratory course.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus


    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) (US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (US) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley (US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.


    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team put together during this period included two other young scientists who went on to establish large laboratories: J. Robert Oppenheimer founded DOE’s Los Alamos National Laboratory (US), and Robert Wilson founded Fermi National Accelerator Laboratory (US).

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory (US)) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy (US), with management from the University of California (US). Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science (US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS


    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of the ALS that would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute (US) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory (US), DOE’s Oak Ridge National Laboratory (US)(ORNL), DOE’s Pacific Northwest National Laboratory (US) (PNNL), and the HudsonAlpha Institute for Biotechnology (US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry (US) [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center (US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center(US) at Lawrence Berkeley National Laboratory

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Sciences Network (US), or ESnet, is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory (US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science (US), and DOE’s Lawrence Livermore National Laboratory (US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology (US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory (US) leads JCESR and Berkeley Lab is a major partner.

     
  • richardmitnick 8:49 am on January 15, 2021
    Tags: "Science Begins at Brookhaven Lab's New Cryo-EM Research Facility", , , , Biotechnology, , , ,   

    From DOE’s Brookhaven National Laboratory: “Science Begins at Brookhaven Lab’s New Cryo-EM Research Facility” 

    January 14, 2021
    Cara Laasch
    laasch@bnl.gov
    (631) 344-8458

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Brookhaven Lab’s Laboratory for BioMolecular Structure is now open for experiments with visiting researchers using two NY State-funded cryo-electron microscopes.

    Brookhaven Lab Scientist Guobin Hu loaded the samples sent from researchers at Baylor College of Medicine into the new cryo-EM at LBMS.

    On January 8, 2021, the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory welcomed the first virtually visiting researchers to the Laboratory for BioMolecular Structure (LBMS), a new cryo-electron microscopy facility. DOE’s Office of Science funds operations at this new national resource, while funding for the initial construction and instrument costs was provided by NY State. This state-of-the-art research center for life sciences imaging offers researchers access to advanced cryo-electron microscopes (cryo-EM) for studying complex proteins as well as the architecture of cells and tissues.

    Many modern advances in biology, medicine, and biotechnology were made possible by researchers learning how biological structures such as proteins, tissues, and cells interact with each other. But to truly reveal their function as well as the role they play in diseases, scientists need to visualize these structures at the atomic level. By creating high-resolution images of biological structures using cryo-EM, researchers can accelerate advances in many fields, including drug discovery, biofuel development, and medical treatments.

    This first group of researchers from Baylor College of Medicine used the high-end instruments at LBMS to investigate the structure of solute transporters. These transporters are proteins that help with many biological functions in humans, such as absorbing nutrients in the digestive system or maintaining excitability of neurons in the nervous system. This makes them critical for drug design since they are validated drug targets and many of them also mediate drug uptake or export. By revealing their structure, the researchers gain a better understanding of the functions and mechanisms of the transporters, which can improve drug design. The Baylor College researchers gained access to the cryo-EMs at LBMS through a simple proposal process.

    “Our experience at LBMS has been excellent. The facility has been very considerate in minimizing user effort in submission of the applications, scheduling of microscope time, and data collection,” said Ming Zhou, Professor in the Department of Biochemistry and Molecular Biology at Baylor College of Medicine.

    All researchers from academia and industry can request free access to the LBMS instruments and collaborate with the LBMS’ expert staff.

    During the measurement of the samples, the LBMS team interacted with the scientists from Baylor College of Medicine through Zoom to coordinate the research.

    “By allowing science-driven use of our instruments, we will meet the urgent need to advance the molecular understanding of biological processes, enabling deeper insight for bio-engineering the properties of plants and microbes or for understanding disease,” said Liguo Wang, Scientific Operations Director of the LBMS. “We are very excited to welcome our first visiting researchers for their remote experiment time. The researchers received time at our instruments through a call for general research proposals at the end of August 2020. Since September, we have been running the instruments only for COVID-19-related work and commissioning.”

    LBMS has two cryo-electron microscopes—funded by $15 million from NY State’s Empire State Development—and the facility has space for additional microscopes to enhance its capabilities in the future. In recognition of NY State’s partnership on the project and to bring the spirit of New York to the center, each laboratory room is associated with a different iconic New York State landmark, including the Statue of Liberty, the Empire State Building, the Stonewall National Monument, and the Adam Clayton Powell Jr. State Office Building.

    “By dedicating our different instruments to New York landmarks, we wanted to acknowledge the role the State played in this new national resource and its own unique identity within Brookhaven Lab,” said Sean McSweeney, LBMS Director. “Brookhaven Lab has a number of facilities offering scientific capabilities to researchers from both industry and academia. In our case, we purposefully built our center next to the National Synchrotron Light Source II, which also serves the life science research community. We hope that this co-location will promote interactions and synergy between scientists for exchanging ideas on improving performance of both facilities.”

    Brookhaven’s National Synchrotron Light Source II (NSLS-II) [below] is a DOE Office of Science User Facility and one of the most advanced synchrotron light sources in the world. NSLS-II enables scientists from academia and industry to tackle the most important challenges in quantum materials, energy storage and conversion, condensed matter and materials physics, chemistry, life sciences, and more by offering extremely bright light, ranging from infrared light to x-rays. The vibrant structural biology and bio-imaging community at NSLS-II offers many complementary techniques for studying a wide variety of biological samples.

    “At NSLS-II, we build strong partnership with our sister facilities, and we are looking forward to working closely with our colleagues at LBMS. For our users, this partnership will offer them access to expert staff at both facilities as well as to a versatile set of complementary techniques,” said NSLS-II Director John Hill. “NSLS-II has a suite of highly automated x-ray crystallography and solution scattering beamlines as well as imaging beamlines with world-leading spatial resolution. All these beamlines offer comprehensive techniques to further our understanding of biological system. Looking to the future, we expect to combine other x-ray techniques with the cryo-EM data to provide unprecedented information on the structure and dynamics of the engines of life.”

    LBMS operations are funded by the U.S. Department of Energy’s Office of Science. NSLS-II is a DOE Office of Science user facility.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Brookhaven Campus.


    BNL Center for Functional Nanomaterials.

    BNL NSLS-II.



    BNL RHIC Campus.

    BNL/RHIC Star Detector.

    BNL/RHIC Phenix.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 9:54 am on March 9, 2020
    Tags: "Can the US make bioweapons obsolete?", , , Biotechnology, Sandia National Lab   

    From Sandia Lab: “Can the US make bioweapons obsolete?” 

    March 9, 2020
    Paul Rhien
    prhien@sandia.gov
    925-294-6452

    Sandia experts help set vision to reach ambitious goal.

    “Making Bioweapons Obsolete: A Summary of Workshop Discussions,” released by Sandia National Laboratories and the Council on Strategic Risks, addresses recommendations for significantly reducing and ultimately eliminating biothreats.

    As the threats posed by bioterrorism and naturally occurring infectious disease grow and evolve in the modern era, there is a rising potential for broad negative impacts on human health, economic stability and global security. To protect the nation from these dangers, Sandia National Laboratories has partnered with the Council on Strategic Risks in taking on the ambitious goal of making bioweapons obsolete.

    In a report released this week, “Making Bioweapons Obsolete: A Summary of Workshop Discussions,” Sandia and the council outline the discussion and recommendations that came out of the Making Bioweapons Obsolete workshop hosted at Sandia. The one-day meeting brought together government, national laboratories, academia, industry, policy and entrepreneur communities to address the challenges of mitigating and eliminating the risks bioweapons present. The workshop was the first in a planned series.

    The report captures the strategic vision the working group laid out for achieving this ambitious goal more effectively and rapidly. According to Anup Singh, director of Biological and Engineering Sciences at Sandia, addressing the rising threats bioweapons present across the U.S. and around the world will require using strategy, technology advances, policy and other tools.

    “This is an extremely interesting time in biotechnology with the revolutionary advances in genome editing, synthetic biology and convergent technologies such as artificial intelligence and robotics,” Singh said. “Academia and the private sector are driving a variety of biotechnology innovations and it is imperative that we engage them in solving the problem together with the traditional national security partners.”

    Drawing on the cross-discipline expertise of the working group, organizers aim to better understand the threat and how technology can both increase and mitigate the risk. The report focuses on identifying solutions that offer the biggest return, and on influencing national leadership to provide attention and resources to the issue and to engage with academia and industry.

    “We need a moonshot-level, inspirational goal regarding biological threats,” said Andy Weber, senior fellow at the council. “When we convene top experts to explore the concept of making bioweapons obsolete, we are usually met with great enthusiasm and a feeling that the United States can really achieve this vision. Indeed, it is largely an expansion on the work the U.S. government has accomplished to date in addressing smallpox threats to America with an extensive vaccine stockpiling system and its development of vaccines for viruses such as Ebola.”

    The report highlights a wide range of considerations that must be addressed. The report:

    Provides insights on key technological trends.
    Raises questions of the data and information access required for rapidly characterizing and responding to biological attacks and outbreaks.
    Explores market and supply chain dynamics in depth.
    Points to significant U.S. government capacities that can be used and expanded, including its vast testing and evaluation infrastructure.
    Highlights the need for coordinated outreach and education to policymakers, in particular by academic and private sector experts.
    Drives home the critical importance of U.S. leadership.

    The workshop is the beginning of an important conversation in tackling the ambitious issue of eliminating or significantly reducing biothreats, explained Andy McIlroy, associate laboratory director of Integrated Security Solutions at Sandia.

    “With increased commitment, time, resources and leadership, we can make further strides in meeting this bold target,” McIlroy said. “I hope that we can continue this discussion to create a united, national vision that meets the urgency of the moment.”

    Future workshops will continue the wide-ranging discussion focused on engaging in a national dialogue and promoting better public-private collaboration in this grand mission. Sessions will focus on man-made threats from weapons of mass destruction, as well as the risks posed by advances in technology.

    The Council on Strategic Risks is a nonprofit, nonpartisan security policy institute devoted to anticipating, analyzing and addressing core systemic risks to security in the 21st century, with special examination of the ways in which these risks intersect and exacerbate one another. For more on the council’s program on making bioweapons obsolete, visit the Janne E. Nolan Center on Strategic Weapons.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.



     
  • richardmitnick 7:22 pm on November 9, 2018
    Tags: Biotechnology, Understanding our own backyard will be key in interpreting data from far-flung exoplanets

    From COSMOS Magazine: “The tech we’re going to need to detect ET” 

    09 November 2018
    Lauren Fuge

    Searching for biosignatures rather than examples of life itself is considered a prime strategy in the hunt for ET. smartboy10/Getty Images

    Move over Mars rovers, new technologies to detect alien life are on the horizon.

    A group of scientists from around the world, led by astrochemistry expert Chaitanya Giri from the Tokyo Institute of Technology in Japan, have put their heads together to plan the next 20 years’ worth of life-detection technologies. The study is currently awaiting peer review but is freely available on the preprint site arXiv.

    For decades, astrobiologists have scoured the skies and the sands of other planets for hints of extraterrestrial life. Not only are these researchers trying to find ET, but they’re also aiming to learn about the origin and evolution of life on Earth, the chemical composition of organic extraterrestrial objects, what makes a planet or satellite habitable, and more.

    But the answers to such questions are preceded by long years of planning, development, problem-solving and strategising.

    Late in 2017, 20 scientists from Japan, India, France, Germany and the USA – each with a special area of expertise – came together at a workshop run by the Earth-Life Science Institute (ELSI) at Giri’s Tokyo campus. There, they discussed the current progress and enticing possibilities of life-detection technologies.

    In particular, the boffins debated which ones should be a priority for research and development for missions within the local solar system – in other words, which instruments will be most feasible to put onto a space probe and send off to Mars or Enceladus during the next couple of decades.

    Of course, the planets and moons in the solar system are an extremely limited sample of the number of potentially habitable worlds in the universe, but understanding our own backyard will be key in interpreting data from far-flung exoplanets.

    So, according to these astrobiology experts, what’s the future plan for alien detection?

    The first step of any space mission is to study the planet or satellite from afar to determine whether it is habitable. Luckily, an array of next-generation telescopes is currently being built, from the ultra-sensitive James Webb Space Telescope, slated for launch in 2021, to the gargantuan Extremely Large Telescope in Chile, which will turn its 39-metre eye to the sky in 2024. The authors point out that observatories such as these will vastly expand our theoretical knowledge of planet habitability.

    NASA/ESA/CSA Webb Telescope annotated

    ESO/E-ELT, to be located at the summit of Cerro Armazones in the Atacama Desert of northern Chile, at an altitude of 3,060 metres (10,040 ft).

    Just because a world is deemed habitable doesn’t mean life will be found all over it, though. It may exist only in limited geographical niches. To reach these inaccessible sites, the paper argues that we will require “agile robotic probes that are robust, able to seamlessly communicate with orbiters and deep space communications networks, be operationally semi-autonomous, have high-performance energy supplies, and are sterilisable to avoid forward contamination”.

    But according to Elizabeth Tasker, associate professor at the Japan Aerospace Exploration Agency (JAXA), who was not involved in the study, getting there is only half the struggle.

    “In fact, it’s the most tractable half because we can picture the problems we will face,” she says.

    The second, more pressing issue is how to recognise life unlike anything we know on Earth.

    As Tasker explains: “We only have Earth life to compare to and this is the result of huge evolutionary history on a planet whose complex past is unlikely to be replicated closely. That’s a lot of baggage to separate out.”

    According to the paper, the way forward is to equip missions with a suite of life-detection instruments that don’t look for life as we know it, but are instead able to identify the kinds of features that make organisms function.

    The authors outline a huge variety of exciting technologies that could be used for this purpose, including spectroscopy techniques (to analyse potential biological materials), quantum tunnelling [Nature Nanotechnology] (to find DNA, RNA, peptides, and other small molecules), and fluorescence microscopy [ https://www.hou.usra.edu/meetings/lpsc2014/pdf/2744.pdf ] (to identify the presence of cell membranes).

    They also nominate different forms of gas chromatography (to spot amino acids and sugars formed by living organisms), plus checking to see whether molecules are “homochiral” [Space Science Reviews] (a suspected biosignature) using microfluidic devices and microscopes.

    High-resolution, miniaturised mass spectrometers would also be helpful for characterising biopolymers, which are created by living organisms, and for measuring the elemental composition of objects to aid isotopic dating.

    Giri and colleagues also stress that exciting developments in machine learning, artificial intelligence, and pattern recognition will be useful in determining whether chemical samples are biological in origin.

    Interestingly, researchers are also developing technologies that may allow the detection of life in more unconventional places. On Earth, for example, cryotubes were recently used [International Journal of Systematic and Evolutionary Microbiology] to discover several new species of bacteria in the upper atmosphere.

    The scientists also discuss how certain technologies – such as high-powered synchrotron radiation and magnetic field facilities – are not yet compact enough to fly to other planets, and so samples must continue to be brought back for analysis.

    Several sample-return missions are currently underway, including JAXA’s Martian Moons eXploration (MMX) mission to Phobos, Hayabusa2 to asteroid Ryugu, and NASA’s OSIRIS-REx to asteroid Bennu. What we learn from handling the organic-rich extraterrestrial materials brought back from these trips will be invaluable.

    JAXA MMX spacecraft

    JAXA/Hayabusa 2 Credit: JAXA/Akihiro Ikeshita

    NASA OSIRIS-REx Spacecraft

    The predictions and recommendations put forward by Giri and colleagues are the first steps in getting these technologies discussed in panel reviews, included in decadal surveys, and eventually funded.

    They complement several similar efforts, including a report prepared by the US National Academies of Sciences, Engineering, and Medicine (NASEM) calling for an expansion of the range of possible ET indicators, and a US-led exploration of how the next generation of radio telescopes will be utilised by SETI.

    Perhaps most importantly, these papers all highlight the need for collaborative work between scientists across disciplines.

    “A successful detection of life will need astrophysicists and geologists to examine possible environments on other planets, engineers and physicists to design the missions and instruments that can collect data, and chemists and biologists to determine how to classify life,” JAXA’s Tasker says.

    “But maybe that is appropriate: finding out what life really is and where it can flourish is the story of everyone on Earth. It should take all of us to unravel.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:37 am on August 17, 2017
    Tags: Biotechnology, SynBio FSP-SynBio Future Science Platform, SynBio-synthetic biology

    From CSIRO blog: “First steps toward a synthetic biology future” 

    17 August 2017
    Chris McKay

    The Industrial Revolution set off a wave of technological revolutions. Illustration: D O Hill/Wikimedia Commons

    It was steam power in 18th Century Britain that helped set off the Industrial Revolution, an evolution in technology that would change the course of human history. But that turned out to be only the first in a wave of technological revolutions to follow. From the late 1800s, electricity was being harnessed to allow for mass production, and then in the 1980s, electronics and information technology took the world by storm, heralding the third technological revolution and giving us the digital world we know today.

    Now, we’re in the midst of a fourth technological revolution. Building on the digital revolution that came before it, we’re seeing increasing digital connectedness (think Internet of Things) and a fusing of digital technology with biological systems and technologies. And there has been a step change in the speed at which progress is occurring.

    2
    The trend in the cost of sequencing a human-sized genome since 2001. Image: National Human Genome Research Institute.

    Consider the speed of progress during the IT revolution, which saw computing power double roughly every two years in accordance with Moore’s Law. Progress in biotechnology has been faster still: the cost of sequencing a genome has fallen even more steeply than Moore’s Law would predict. The Human Genome Project, which began in 1990, spent roughly $3 billion USD and more than 10 years sequencing the human genome for the first time; building on that work, a genome could be sequenced for about $100 million USD in 2001; today a genome can be sequenced for less than $1,000.
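    To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python, using only the numbers quoted above; the 2017 endpoint is an assumption for illustration:

```python
import math

# Figures quoted in the article; the exact end year is an assumption for illustration.
cost_2001 = 100_000_000   # USD to sequence a genome in 2001
cost_today = 1_000        # USD to sequence a genome today (taken here as 2017)
years = 2017 - 2001

# How many times did the cost halve, and how quickly?
halvings = math.log2(cost_2001 / cost_today)   # roughly 16.6 halvings
halving_time = years / halvings                # roughly 1 year per halving

# Moore's Law implies the cost per unit of computing halves about every 2 years,
# i.e. only about 8 halvings over the same period.
moore_halvings = years / 2.0

print(f"Sequencing cost halved ~{halvings:.1f} times, about once every {halving_time:.1f} years")
print(f"Moore's Law over the same period predicts only ~{moore_halvings:.0f} halvings")
```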

    It is in this context that the field of synthetic biology (SynBio) has emerged. SynBio is essentially the application of engineering principles to biology. It involves making things from biological components, such as genetic code, to carry out useful activities. These activities could include sustainable production of fuels, treatment and cure of diseases, controlling invasive pests, or sensing toxins in the environment. Indeed, recent advancements in writing DNA code, printing DNA, and gene editing technology have made SynBio one of the fastest growing areas of modern science. It is a rapidly expanding multi-billion dollar industry with significant potential for generating societal benefits and commercial opportunities.

    That’s why SynBio was among the six new Future Science Platforms we announced last year: a program of investment in areas of science that are set to drive innovation and have the potential to help reinvent existing industries and create new ones for Australia. The SynBio Future Science Platform (SynBio FSP) is also growing the capability of a new generation of researchers in partnership with some Australian universities—some of the newest recruits, 11 SynBio Future Science Fellows, will be undertaking work on a suite of innovative projects.

    3
    Future Science Fellow Dr Michele Fabris, based at the University of Technology Sydney’s Climate Change Cluster, will be exploring the potential for photosynthetic microalgae to be modified to carry out new functions, like the production of anti-cancer pharmaceutical compounds. Image: Anna Zhu/UTS

    The research projects cover a broad spectrum of activity. There will be environmental and biocontrol applications, such as the development of cell-tissue structures capable of sensing the environment and eliminating toxins, new tools for targeting antibiotic resistant biofilms, and biosensors providing real-time biological monitoring. Some projects will be exploring the potential to use yeast, microalgae or cyanobacteria cells for the production of valuable pharmaceuticals or fuels, driving innovation in chemical and fibre manufacturing. Other projects will be creating new tools and building blocks that will be fundamental in driving progress in SynBio.

    This work will complement other SynBio FSP research being undertaken at CSIRO that will help us position Australia to play a role in the latest technological revolution. It is research that will allow us to better understand global developments and, where appropriate, contribute responsibly to advances in areas as diverse as healthcare, industrial biotechnology, biosecurity, food and agriculture.

    ____________________________________________________________________________
    SynBio FSP’s Future Science Fellowships are co-funded partnerships between CSIRO and the host universities, with each partner contributing matching funding. The host universities are Australian National University, Macquarie University, University of Adelaide, University of Queensland, University of the Sunshine Coast, University of Technology Sydney and University of Western Australia.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    The CSIRO blog is designed to entertain, inform and inspire by generally digging around in the work being done by our terrific scientists, and leaving the techie speak and jargon for the experts.

    We aim to bring you stories from across the vast breadth and depth of our organisation: from the wild sea voyages of our Research Vessel Investigator to the mind-blowing astronomy of our Space teams, right through all the different ways our scientists solve national challenges in areas as diverse as Health, Farming, Tech, Manufacturing, Energy, Oceans, and our Environment.

    If you have any questions about anything you find on our blog, we’d love to hear from you. You can reach us at socialmedia@csiro.au.

    And if you’d like to find out more about us, our science, or how to work with us, head over to CSIRO.au

     
  • richardmitnick 11:33 am on March 5, 2017 Permalink | Reply
    Tags: , , , Biotechnology, , , Hamilton Smith, , Methylation, Restriction enzyme, The Man Who Kicked Off the Biotech Revolution   

    From Nautilus: “The Man Who Kicked Off the Biotech Revolution” Hamilton Smith 

    Nautilus


    3.5.17
    Carl Zimmer

    It’s hard to tell precisely how big a role biotechnology plays in our economy, because it infiltrates so many parts of it. Genetically modified organisms such as microbes and plants now create medicine, food, fuel, and even fabrics. Recently, Robert Carlson, of the biotech firm Biodesic and the investment firm Bioeconomy Capital, decided to run the numbers and ended up with an eye-popping estimate. He concluded that in 2012, the last year for which good data are available, revenues from biotechnology in the United States alone were over $324 billion.

    “If we talk about mining or several manufacturing sectors, biotech is bigger than those,” said Carlson. “I don’t think people appreciate that.”

    1
    Matchmaker Biotech pioneer Hamilton Smith chose to study recombination in a species of bacteria called Haemophilus influenzae (above), which can take up foreign DNA fragments and integrate them into its own DNA. Media for Medical/UIG via Getty Images

    What makes the scope of biotech so staggering is not just its size, but its youth. Manufacturing first exploded in the Industrial Revolution of the 19th century. But biotech is only about 40 years old. It burst into existence thanks largely to a discovery made in the late 1960s by Hamilton Smith, a microbiologist then at Johns Hopkins University, and his colleagues, that a protein called a restriction enzyme can slice DNA. Once Smith showed the world how restriction enzymes work, other scientists began using them as tools to alter genes.

    “And once you have the ability to start to manipulate the world with those tools,” said Carlson, “the world opens up.”

    The story of restriction enzymes is a textbook example of how basic research can ultimately have a huge impact on society. Smith had no such grand ambitions when he started his work. He just wanted to do some science. “I was just having a lot of fun, learning as I went,” Smith, now 85, said.

    In 1968, when Smith was a new assistant professor at Johns Hopkins University, he became curious about how cells cut DNA into pieces and shuffle them into new arrangements—a process known as recombination. “It’s a universal thing,” Smith said. “Every living thing has recombination systems. But at the time, no one was sure how it worked, mechanically.”

    Smith chose to study recombination in a species of bacteria called Haemophilus influenzae. Like many other species, H. influenzae can take up foreign DNA, either sucking in loose fragments from the environment or gaining them from microbial donors. Somehow, the bacterium can then integrate these fragments into its own DNA.

    Bacteria gain useful genes in this way, endowing them with new traits such as resistance to antibiotics. But recombination also has a dark side for H. influenzae. Invading viruses can hijack the recombination machinery in bacteria. They then insert their own genes into their host’s DNA, so that the microbes make new copies of the virus.

    To understand recombination, Smith produced radioactive viruses by introducing viruses into bacteria that had been fed radioactive phosphorus. New viruses produced inside the bacteria ended up with radioactive phosphorus in their DNA. Smith and his colleagues could then unleash these radioactive viruses on other bacteria. The scientists expected that during the infection, the bacteria’s genes would become radioactive as the viruses inserted their genetic material into their host’s DNA.

    At least, that was what they thought would happen. When Smith’s graduate student Kent Wilcox infected bacteria with the radioactive viruses, the radioactivity never ended up in the bacteria’s own genome.

    Trying to make sense of the failure, Wilcox suggested to Smith that the bacteria were destroying the viral DNA. He based his suggestion on a hypothesis proposed a few years earlier by Werner Arber, a microbiologist at the University of Geneva. Arber speculated that enzymes could restrict the growth of viruses by chopping up their DNA, and dubbed these hypothetical molecules “restriction enzymes.”

    Arber recognized that if restriction enzymes went on an unchecked rampage, they could kill the bacteria themselves by chopping up their own DNA. He speculated that bacteria were shielding their own DNA from assault, and thus avoiding suicide, by covering their genes with carbon and hydrogen atoms—a process known as methylation. The restriction enzymes couldn’t attack methylated DNA, Arber proposed, but they could attack the unprotected DNA of invading viruses.

    The week before Wilcox had carried out his baffling experiment, Smith had assigned his lab a provocative new paper supporting Arber’s hypothesis. Matthew Meselson and Robert Yuan at Harvard University reported in the paper how they had discovered a protein in E. coli that cut up foreign DNA—in other words, an actual restriction enzyme. With that paper fresh in his mind, Wilcox suggested to Smith that they had just stumbled across another restriction enzyme in Haemophilus influenzae.

    Smith tested the idea with an elegant experiment. He poured viral DNA into one test tube, and DNA from H. influenzae into another. To each of these tubes, he then added a soup of proteins from the bacteria. If indeed the bacteria made restriction enzymes, the enzymes in the soup would chop up the viral DNA into small pieces.

    Scientists were decades away from inventing the powerful sequencers that are used today to analyze DNA. But Smith came up with a simple way to investigate the DNA in his test tubes. A solution containing large pieces of DNA is more viscous—more syrupy, in effect—than one with small pieces. So Smith measured the solutions in his two test tubes with a device called a viscometer. As he had predicted, the solution containing the viral DNA quickly became far less viscous. Something—some H. influenzae protein, presumably—was cutting the viral DNA into little pieces.
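    The cutting behaviour Smith inferred from that viscosity drop can be mimicked in a few lines of code. The sketch below is purely illustrative: it uses EcoRI’s well-known GAATTC recognition site rather than the Haemophilus enzyme Smith actually isolated, and the sequences are invented for the example. It simply shows why unprotected viral DNA ends up as many short fragments while methylated host DNA stays in one piece:

```python
# Illustrative restriction digest: cut a DNA string at every occurrence of a
# recognition site, unless that site is marked as methylated (protected).
# EcoRI's site (GAATTC, cut after the G) is used for familiarity; the enzyme
# Smith found in H. influenzae recognises a different sequence.

SITE = "GAATTC"
CUT_OFFSET = 1  # EcoRI cuts between the G and the AATTC

def digest(dna, methylated_sites=frozenset()):
    """Return the fragments produced by cutting `dna` at each unprotected site."""
    fragments, start, pos = [], 0, dna.find(SITE)
    while pos != -1:
        if pos not in methylated_sites:        # methylated sites are skipped
            fragments.append(dna[start:pos + CUT_OFFSET])
            start = pos + CUT_OFFSET
        pos = dna.find(SITE, pos + 1)
    fragments.append(dna[start:])              # whatever remains after the last cut
    return fragments

# Toy sequences, invented for illustration.
viral_dna = "ATGCGAATTCTTAGGGAATTCCGTACGAATTCAA"
host_dna = "TTGAATTCGGCCGAATTCAT"

print(digest(viral_dna))                           # many short fragments: low viscosity
print(digest(host_dna, methylated_sites={2, 12}))  # both sites protected: stays intact
```

    Run as written, the first call returns four short fragments and the second returns the host sequence unchanged as a single fragment, which is the qualitative difference the viscometer detected.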

    “So I immediately knew this had to be a restriction enzyme,” Smith said. “It was a wonderful result—five minutes, and you know you have a discovery.”

    That instant gratification was followed by months of tedium, as Smith and his colleagues sorted through the proteins in their cell extracts until at last they identified a restriction enzyme. They also discovered a methylation enzyme that protected H. influenzae’s own DNA from destruction by shielding it with carbon and hydrogen.

    Once Smith and his colleagues published the remarkable details of their restriction enzymes, other scientists began to investigate them as well. They didn’t just study the enzymes, though—they began employing them as a tool. In 1972, Paul Berg, a biologist at Stanford University, used restriction enzymes to make cuts in the DNA of SV40 viruses, and then used other enzymes to attach the DNA from another virus to those loose ends. Berg thus created a single piece of DNA made up of genetic material from two species.
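    Berg’s cut-and-join trick can be sketched in the same toy style. This is only an illustration of the idea (a single recognition site is opened in a "vector" and a fragment with compatible ends is spliced in); the sequences are invented, and this is not a model of the actual SV40 experiment:

```python
# Toy "cut and paste": open a vector at its single GAATTC site and splice in an
# insert, mimicking how cutting two molecules with the same enzyme gives them
# compatible ends. Sequences are invented for illustration.

SITE = "GAATTC"
CUT_OFFSET = 1  # cut between the G and the AATTC, as EcoRI does

def insert_at_site(vector, insert):
    """Open `vector` at its first recognition site and splice `insert` into the gap."""
    pos = vector.find(SITE)
    if pos == -1:
        raise ValueError("vector has no recognition site to open")
    cut = pos + CUT_OFFSET
    return vector[:cut] + insert + vector[cut:]

vector = "CCATGAATTCGGTT"    # e.g. a plasmid backbone with one recognition site
gene = "AATTCATGAAAG"        # e.g. a fragment released by the same enzyme
print(insert_at_site(vector, gene))   # CCATG + AATTCATGAAAG + AATTCGGTT
```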

    A pack of scientists followed Berg’s lead. They realized that they could use restriction enzymes to insert genes from many different species into bacteria, which could then churn out proteins from those genes. In effect, bacteria could be transformed into biological factories.

    In 1978, Hamilton Smith got a call from Stockholm. He learned that he was sharing that year’s Nobel Prize in Physiology or Medicine with Werner Arber and Daniel Nathans, another Johns Hopkins scientist who had followed up on Smith’s enzyme research with experiments of his own. Smith was as flummoxed as he was delighted.

    “They caught me off-guard,” Smith said. “I always looked up to the Nobelists as being incredibly smart people who had accomplished some world-shaking thing. It just didn’t seem like I was in that league.”

    But already the full impact of his work was starting to become clear. Companies sprouted up that were dedicated to using restriction enzymes to modify DNA. The first commercial application of this technology came from Genentech, a company founded in 1976. Genentech scientists used restriction enzymes to create a strain of E. coli that carried the gene for human insulin. Previously, people with diabetes could only purchase insulin extracted from the pancreases of cows and pigs. Genentech sold insulin produced by swarms of bacteria reared in giant metal drums.

    Over the years, scientists have built on Smith’s initial successes by finding new tools for manipulating DNA. Yet even today, researchers make regular use of restriction enzymes to slice open genes. “They’re still absolutely crucial,” said Carlson. “If you want to put a specific sequence of DNA in another sequence, it’s still most often restriction enzymes that you use to do that.”

    And as Smith has watched restriction enzymes become powerful and versatile, he has slowly overcome his case of Nobel imposter syndrome. “It probably was okay to get it,” he admitted.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     