Tagged: Applied Research & Technology

  • richardmitnick 3:11 pm on September 28, 2014 Permalink | Reply
    Tags: Applied Research & Technology, , ,   

    From Scientific American: “Weak Nuclear Force Shown to Give Asymmetry to Biochemistry of Life” 

    Scientific American

    Sep 26, 2014
    Elizabeth Gibney and Nature magazine

    Physicists have found hints that the asymmetry of life — the fact that most biochemical molecules are ‘left-handed’ or ‘right-handed’ — could have been caused by electrons from nuclear decay in the early days of evolution. In an experiment that took 13 years to perfect, the researchers have found that these electrons tend to destroy certain organic molecules slightly more often than they destroy their mirror images.

    Life is made largely of molecules that are different than their mirror images.
    Credit: Brett Weinstein via Flickr

    Many organic molecules, including glucose and most biological amino acids, are ‘chiral’. This means that they are different than their mirror-image molecules, just like a left and a right glove are. Moreover, in such cases life tends to consistently use one of the possible versions — for example, the DNA double helix in its standard form always twists like a right-handed screw. But the reason for this preference has long remained a mystery.

    Many scientists think that the choice was simply down to chance. Perhaps, in one of the warm little ponds filled with organic chemicals where life arose, a statistical fluke generated a small imbalance in the relative amounts of the two versions of one chemical. This small imbalance could have then amplified over time.

    But an asymmetry in the laws of nature has led others to wonder whether some physical phenomenon could have tipped the balance during the early stages of life. The weak nuclear force, which is involved in nuclear decay, is the only force of nature known to have a handedness preference: electrons created in the subatomic process known as β decay are always ‘left-handed’. This means that their spin — a quantum property analogous to the magnetization of a bar magnet — is always opposite in direction to the electron’s motion.

    In 1967, biochemist Frederic Vester and environmental scientist Tilo Ulbricht proposed that photons generated by these so-called spin-polarized electrons — which are produced in the decay of radioactive materials or of cosmic-ray particles in the atmosphere — could have destroyed more of one kind of molecule than another, creating the imbalance. Some physicists have since suggested that the electrons themselves might be the source of the asymmetry.

    But the hunt to find chemical processes through which electrons or photons could preferentially destroy one version of a molecule over its mirror image has seen little success. Many claims have proven impossible to reproduce. The few experiments in which electron handedness produced a chiral imbalance could not identify the chemical process behind it, says Timothy Gay, a chemical physicist at the University of Nebraska–Lincoln and a co-author of the latest study. But pinpointing a chemical reaction would help scientists to rule out some candidate causes of the process and to better understand the physics that underlie it, he adds.

    Taking it slow

    Gay and Joan Dreiling, a physicist also at the University of Nebraska–Lincoln, fired low-energy, spin-polarized electrons at a gas of bromocamphor, an organic compound used in some parts of the world as a sedative. In the resulting reaction, some electrons were captured by the molecules, which then were kicked into an excited state. The molecules then fell apart, producing bromide ions and other highly reactive compounds. By measuring the flow of ions produced, the researchers could see how often the reaction occurred for each handedness of electron.

    The researchers found that left-handed bromocamphor was just slightly more likely to react with right-handed electrons than with left-handed ones. The converse was true when they used right-handed bromocamphor molecules. At the lowest energies, the direction of the preference flipped, causing an opposite asymmetry.

    In all cases the asymmetry was tiny, but consistent, like flipping a not-quite-fair coin. “The scale of the asymmetry is as though we flip 20,000 coins again and again, and on average, 10,003 of them land on heads while 9,997 land on tails,” says Dreiling.
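
    To get a feel for that number, here is a minimal Python sketch (an illustration, not code from the study) of Dreiling's coin analogy: a coin biased to land heads with probability 10,003/20,000, flipped in ever larger batches. The statistical noise on the measured asymmetry shrinks roughly as one over the square root of the number of flips, which is why resolving a 3-in-10,000 effect takes an enormous number of events.

    ```python
    import numpy as np

    # Dreiling's analogy: about 10,003 heads and 9,997 tails per 20,000 flips,
    # i.e. an asymmetry of (10_003 - 9_997) / 20_000 = 3e-4. Illustrative numbers only.
    p_heads = 10_003 / 20_000
    true_asymmetry = 2 * p_heads - 1

    rng = np.random.default_rng(0)
    for n in (20_000, 2_000_000, 200_000_000):
        heads = rng.binomial(n, p_heads)      # one "experiment" of n biased coin flips
        measured = (2 * heads - n) / n        # (heads - tails) / total
        noise = 1 / np.sqrt(n)                # rough 1-sigma statistical uncertainty
        print(f"n={n:>11,}  measured={measured:+.2e}  true={true_asymmetry:.1e}  noise~{noise:.1e}")
    ```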

    The low speed of the electrons was the key to why the experiment finally worked after so many years, Dreiling says. “The interaction takes longer, and it was that insight, I think, that led to our success,” she says.

    The test offers an explanation for how a chiral excess could — at least in principle — arise, Gay says. The research was published in Physical Review Letters on 12 September.

    The idea that spin-polarized electrons could transmit their asymmetry to organic molecules is attractive, says Uwe Meierhenrich, an analytical chemist at the University of Nice Sophia Antipolis in France. The tiny effect that Gay and Dreiling observed would have to be amplified to affect the chemistry of life as a whole — but there are known mechanisms for such amplification, he says. “From my point of view, the main question does not concern the amplification processes, but the first chiral-symmetry breaking,” he says.

    Meierhenrich says that he would like to see the experiment repeated with chiral molecules that are relevant to the origin of life, such as amino acids, to see whether the left-handed electrons produce the same effect.

    Primordial cause

    Even if spin-polarized electrons caused life to become chirally selective, it is still unclear what would have produced those electrons in the first place. Sources of β particles include phosphorus-32 decaying into sulphur-32, or the decay of muons, elementary particles produced at the end of a chain of decays that begin when cosmic ray particles hit the atmosphere. In both cases, the electrons would have been travelling much faster than in Gay’s reaction, but he says that it is possible for electrons to slow down without losing their chirality.

    Slower-moving, left-handed electrons are produced in other ways than via β decay, says Richard Rosenberg, a chemist at the Argonne National Laboratory in Illinois. In 2008 he and his team showed that irradiating a layer of magnetized iron with X-rays could also produce a chirality preference. Chirality could therefore also have been created in molecules stuck to magnetized particles in a dust cloud or comet, he says.

    Gay and his colleagues plan to look at similar reactions with other varieties of camphor molecules to understand how the spin of an electron dictates which of two chiral molecules it prefers.

    The interaction of left-handed electrons with organic molecules is not the only potential explanation for the chiral asymmetry of life. Meierhenrich favors an alternative — the circularly polarized light that is produced by the scattering of light in the atmosphere and near neutron stars. In 2011, Meierhenrich and colleagues showed that such light could transfer its handedness to amino acids.

    But even demonstrating how a common physical phenomenon would have favoured left-handed amino acids over right-handed ones would not tell us that this was how life evolved, adds Laurence Barron, a chemist at the University of Glasgow, UK. “There are no clinchers. We may never know.”

    See the full article here.

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

  • richardmitnick 4:10 pm on September 27, 2014 Permalink | Reply
    Tags: Applied Research & Technology, , , Semiconductors   

    From physicsworld.com: “Nuclear spins control electrical currents” 

    physicsworld.com

    Sep 23, 2014
    Katherine Kornei

    An international team of physicists has shown that information stored in the nuclear spins of hydrogen isotopes in an organic LED (OLED) can be read out by measuring the electrical current through the device. Unlike previous schemes that only work at ultracold temperatures, this is the first to operate at room temperature, and therefore could be used to create extremely dense and highly energy-efficient memory devices.

    Spin doctor: Christoph Boehme inserts an OLED into a spectrometer

    With the growing demand for ever smaller, more powerful electronic devices, physicists are trying to develop more efficient semiconductors and higher-density data-storage devices. Motivated by the fact that traditional silicon semiconductors are susceptible to significant energy losses via waste heat, scientists are investigating the use of organic semiconductors. These are organic thin films placed between two conductors and they promise to be more energy efficient than silicon semiconductors. Furthermore, the availability of many different types of organic thin film could help physicists to optimize the efficiency of these devices.

    Chip and spin

    Conventional memory chips store data in the form of electrical charge. Moving this charge around the chip generates a lot of waste heat that must be dissipated, which makes it difficult to miniaturize components and also reduces battery life. An alternative approach is to store information in the spins of electrons or atomic nuclei – with spin-up corresponding to “1” and spin-down to “0”, for example. This could result in memories that are much denser and more energy efficient than the devices used today.

    Atomic nuclei are particularly attractive for storing data because their spins tend to be well shielded from the surrounding environment. This means that they could achieve storage times of several minutes, which is billions of times longer than is possible with electrons. The challenge, however, is how to read and write data to these tiny elements.

    Now, Christoph Boehme and colleagues at the University of Utah, along with John Lupton of the University of Regensburg and researchers at the University of Queensland, have shown that the flow of electrical current in an OLED can be modulated by controlling the spins of hydrogen isotopes in the device. “Electrical current in an organic semiconductor device is strongly influenced by the nuclear spins of hydrogen, which is abundant in organic materials,” explains Lupton. The team has shown that the current flowing through a plastic polymer OLED can be tuned precisely, suggesting that inexpensive OLEDs can be used as efficient semiconductors.

    Just like MRI

    Boehme and his team applied a small magnetic field to their test OLED, which creates an energy difference between the orientations of the nuclear spins of protons and deuterium (both hydrogen isotopes). The researchers then used radio-frequency signals to alter the directions of the spins of the protons and deuterium nuclei – a process that is also done during a nuclear magnetic resonance (NMR) experiment.
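
    The two isotopes can be addressed separately because each nucleus responds to a different radio frequency, set by its gyromagnetic ratio and the applied field (the Larmor relation f = γB). The sketch below uses standard gyromagnetic ratios and a hypothetical 0.1-tesla field, chosen only for illustration, to show how far apart the proton and deuteron resonances sit.

    ```python
    # Larmor frequencies f = gamma * B for the two hydrogen isotopes.
    # Gyromagnetic ratios (gamma / 2*pi) are standard textbook values; the 0.1 T
    # field is a hypothetical choice for illustration, not the field used in the study.
    GAMMA_MHZ_PER_TESLA = {
        "proton (1H)":   42.577,
        "deuteron (2H)":  6.536,
    }

    B_FIELD_TESLA = 0.1

    for nucleus, gamma in GAMMA_MHZ_PER_TESLA.items():
        print(f"{nucleus:>13}: {gamma * B_FIELD_TESLA:6.3f} MHz resonance at {B_FIELD_TESLA} T")
    ```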

    The changes to the nuclear spins affect the spins of nearby electrons, and this results in changes to the electrical current. The magnetic forces between the nuclear and electron spins are millions of times smaller than the electrical forces needed to cause a similar change in current. This suggests that the effect could be used to create energy-efficient semiconductor memories.

    This recent work follows on from research done in 2010, when Boehme and colleagues showed that the technique could be used to control current in a device made from phosphorus-doped silicon. However, this was only possible in the presence of strong magnetic fields and at temperatures within a few degrees of absolute zero. Such conditions are impractical for commercial devices, but the OLED-based device needs neither ultracold temperatures nor high magnetic fields.

    Time to relax

    “In organic semiconductors, the spin-relaxation time does not change significantly with temperature,” explains Lupton. “In contrast, the spin-relaxation time in phosphorus-doped silicon increases significantly when the temperature is lowered; so in phosphorus-doped silicon, the experiments had to be carried out at low temperatures and high magnetic fields.”

    The team believes that its technique should also work with other nuclei with non-zero spin, with some limitations. “Since protons and deuterium are both hydrogen isotopes, they can be interchanged in the synthesis without changing the chemical structure of the polymer, which may not be possible with other types of nuclei,” Lupton explains. “Tritium, the third hydrogen isotope, is radioactive, so would not be much good in experiments.”

    The research is described in Science.

    See the full article here.

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

  • richardmitnick 3:41 pm on September 27, 2014 Permalink | Reply
    Tags: Applied Research & Technology, CO2 studies, , ,   

    From LBL: “Pore models track reactions in underground carbon capture” 

    Berkeley Lab

    September 25, 2014

    Using tailor-made software running on top-tier supercomputers, a Lawrence Berkeley National Laboratory team is creating microscopic pore-scale simulations that complement or push beyond laboratory findings.

    Computed pH on calcite grains at 1 micron resolution. The iridescent grains mimic crushed material geoscientists extract from saline aquifers deep underground to study with microscopes. Researchers want to model what happens to the crystals’ geochemistry when the greenhouse gas carbon dioxide is injected underground for sequestration. Image courtesy of David Trebotich, Lawrence Berkeley National Laboratory.

    The models of microscopic underground pores could help scientists evaluate ways to store carbon dioxide produced by power plants, keeping it from contributing to global climate change.

    The models could be a first, says David Trebotich, the project’s principal investigator. “I’m not aware of any other group that can do this, not at the scale at which we are doing it, both in size and computational resources, as well as the geochemistry.” His evidence is a colorful portrayal of jumbled calcite crystals derived solely from mathematical equations.

    The iridescent menagerie is intended to act just like the real thing: minerals geoscientists extract from saline aquifers deep underground. The goal is to learn what will happen when fluids pass through the material should power plants inject carbon dioxide underground.

    Lab experiments can only measure what enters and exits the model system. Now modelers would like to identify more of what happens within the tiny pores that exist in underground materials, as chemicals are dissolved in some places but precipitate in others, potentially resulting in preferential flow paths or even clogs.

    Geoscientists give Trebotich’s group of modelers microscopic computerized tomography (CT, similar to the scans done in hospitals) images of their field samples. That lets both camps probe an anomaly: reactions in the tiny pores happen much more slowly in real aquifers than they do in laboratories.

    Going deep

    Deep saline aquifers are underground formations of salty water found in sedimentary basins all over the planet. Scientists think they’re the best deep geological feature to store carbon dioxide from power plants.

    But experts need to know whether the greenhouse gas will stay bottled up as more and more of it is injected, spreading a fluid plume and building up pressure. “If it’s not going to stay there (geoscientists) will want to know where it is going to go and how long that is going to take,” says Trebotich, who is a computational scientist in Berkeley Lab’s Applied Numerical Algorithms Group.

    He hopes their simulation results ultimately will translate to field scale, where “you’re going to be able to model a CO2 plume over a hundred years’ time and kilometers in distance.” But for now his group’s focus is at the microscale, with attention toward the even smaller nanoscale.

    At such tiny dimensions, flow, chemical transport, mineral dissolution and mineral precipitation occur within the pores where individual grains and fluids commingle, says a 2013 paper Trebotich coauthored with geoscientists Carl Steefel (also of Berkeley Lab) and Sergi Molins in the journal Reviews in Mineralogy and Geochemistry.

    These dynamics, the paper added, create uneven conditions that can produce new structures and self-organized materials – nonlinear behavior that can be hard to describe mathematically.

    Modeling at 1 micron resolution, his group has achieved “the largest pore-scale reactive flow simulation ever attempted” as well as “the first-ever large-scale simulation of pore-scale reactive transport processes on real-pore-space geometry as obtained from experimental data,” says the 2012 annual report of the lab’s National Energy Research Scientific Computing Center (NERSC).

    The simulation required about 20 million processor hours using 49,152 of the 153,216 computing cores in Hopper, a Cray XE6 that at the time was NERSC’s flagship supercomputer.
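
    A quick back-of-envelope conversion (assuming the quoted cores ran continuously, which real batch scheduling never quite achieves) shows what those figures mean in wall-clock terms.

    ```python
    # Wall-clock estimate from the quoted figures: 20 million processor-hours
    # spread over 49,152 Hopper cores. Assumes continuous use of every core.
    core_hours = 20_000_000
    cores_used = 49_152

    wall_clock_hours = core_hours / cores_used
    print(f"~{wall_clock_hours:.0f} hours of continuous running "
          f"(about {wall_clock_hours / 24:.0f} days on 49,152 cores)")
    ```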

    Cray Hopper at NERSC

    “As CO2 is pumped underground, it can react chemically with underground minerals and brine in various ways, sometimes resulting in mineral dissolution and precipitation, which can change the porous structure of the aquifer,” the NERSC report says. “But predicting these changes is difficult because these processes take place at the pore scale and cannot be calculated using macroscopic models.

    “The dissolution rates of many minerals have been found to be slower in the field than those measured in the laboratory. Understanding this discrepancy requires modeling the pore-scale interactions between reaction and transport processes, then scaling them up to reservoir dimensions. The new high-resolution model demonstrated that the mineral dissolution rate depends on the pore structure of the aquifer.”

    Trebotich says “it was the hardest problem that we could do for the first run.” But the group redid the simulation about 2½ times faster in an early trial of Edison, a Cray XC-30 that succeeded Hopper. Edison, Trebotich says, has larger memory bandwidth.

    Cray Edison at NERSC

    Rapid changes

    Generating 1-terabyte data sets for each microsecond time step, the Edison run demonstrated how quickly conditions can change inside each pore. It also provided a good workout for the combination of interrelated software packages the Trebotich team uses.

    The first, Chombo, takes its name from a Swahili word meaning “toolbox” or “container” and was developed by a different Applied Numerical Algorithms Group team. Chombo is a supercomputer-friendly platform that’s scalable: “You can run it on multiple processor cores, and scale it up to do high-resolution, large-scale simulations,” he says.

    Trebotich modified Chombo to add flow and reactive transport solvers. The group also incorporated the geochemistry components of CrunchFlow, a package Steefel developed, to create Chombo-Crunch, the code used for their modeling work. The simulations produce resolutions “very close to imaging experiments,” the NERSC report said, combining simulation and experiment to achieve a key goal of the Department of Energy’s Energy Frontier Research Center for Nanoscale Control of Geologic CO2.

    Now Trebotich’s team has three huge allocations on DOE supercomputers to make their simulations even more detailed. The Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program is providing 80 million processor hours on Mira, an IBM Blue Gene/Q at Argonne National Laboratory. Through the Advanced Scientific Computing Research Leadership Computing Challenge (ALCC), the group has another 50 million hours on NERSC computers and 50 million on Titan, a Cray XK7 at the Oak Ridge Leadership Computing Facility. The team also held an ALCC award last year for 80 million hours at Argonne and 25 million at NERSC.

    MIRA at Argonne

    TITAN at Oak Ridge

    With the computer time, the group wants to refine their image resolutions to half a micron (half of a millionth of a meter). “This is what’s known as the mesoscale: an intermediate scale that could make it possible to incorporate atomistic-scale processes involving mineral growth at precipitation sites into the pore scale flow and transport dynamics,” Trebotich says.

    Meanwhile, he thinks their micron-scale simulations already are good enough to provide “ground-truthing” in themselves for the lab experiments geoscientists do.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 2:17 pm on September 26, 2014 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From NASA/JPL at Caltech: “Cold Atom Laboratory Chills Atoms to New Lows” 

    JPL

    September 26, 2014
    Elizabeth Landau
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-6425
    elizabeth.landau@jpl.nasa.gov

    NASA’s Cold Atom Laboratory (CAL) mission has succeeded in producing a state of matter known as a Bose-Einstein condensate, a key breakthrough for the instrument leading up to its debut on the International Space Station in late 2016.

    NASA’s Cold Atom Laboratory

    A Bose-Einstein condensate (BEC) is a collection of atoms in a dilute gas that have been lowered to extremely cold temperatures and all occupy the same quantum state, in which all of the atoms have the same energy levels. At a critical temperature, atoms begin to coalesce, overlap and become synchronized like dancers in a chorus line. The resulting condensate is a new state of matter that behaves like a giant — by atomic standards — wave.

    “It’s official. CAL’s ground testbed is the coolest spot at NASA’s Jet Propulsion Laboratory at 200 nano-Kelvin [200 billionths of 1 Kelvin],” said Cold Atom Laboratory Project Scientist Rob Thompson at JPL in Pasadena, California. “Achieving Bose-Einstein condensation in our prototype hardware is a crucial step for the mission.”

    Although these quantum gases had been created before elsewhere on Earth, the Cold Atom Laboratory will explore the condensates in an entirely new regime: The microgravity environment of the space station. It will enable groundbreaking research in temperatures colder than any found on Earth.

    CAL will be a facility for studying ultra-cold quantum gases on the space station. In the station’s microgravity environment, interaction times and temperatures as low as one picokelvin (one trillionth of one Kelvin, or 293 trillion times below room temperature) should be achievable. That’s colder than anything known in nature, and the experiments with CAL could potentially create the coldest matter ever observed in the universe. These breakthrough temperatures unlock the potential to observe new quantum phenomena and test some of the most fundamental laws of physics.

    First observed in 1995, Bose-Einstein condensation has been one of the “hottest” topics in physics ever since. The condensates are different from normal gases; they represent a distinct state of matter that starts to form typically below a millionth of a degree above absolute zero, the temperature at which atoms have the least energy and are close to motionless. Familiar concepts of “solid,” “liquid” and “gas” no longer apply at such cold temperatures; instead, atoms do bizarre things governed by quantum mechanics, such as behaving as waves and particles at the same time.
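
    That critical temperature can be estimated from the standard ideal-Bose-gas formula, T_c = (2πħ²/mk_B)(n/ζ(3/2))^(2/3). The sketch below is a textbook estimate, not a CAL calculation: it evaluates the formula for rubidium-87 at an assumed density of 10¹³ atoms per cubic centimetre, a typical value for dilute-gas experiments, and lands in the same nanokelvin ballpark as the 200 nano-Kelvin figure quoted above.

    ```python
    import math

    # Ideal-gas BEC critical temperature: T_c = (2*pi*hbar^2 / (m*k_B)) * (n / zeta(3/2))**(2/3)
    # Physical constants are standard values; the atom density is an assumed,
    # typical figure for dilute-gas experiments, not a number from the CAL work.
    hbar = 1.054_571_8e-34            # J*s
    k_B = 1.380_649e-23               # J/K
    m_rb87 = 86.909 * 1.660_539e-27   # kg, mass of one rubidium-87 atom
    zeta_3_2 = 2.612                  # Riemann zeta(3/2)

    n = 1e13 * 1e6                    # atoms per m^3 (assumed 1e13 per cm^3)

    T_c = (2 * math.pi * hbar**2 / (m_rb87 * k_B)) * (n / zeta_3_2) ** (2 / 3)
    print(f"T_c ~ {T_c * 1e9:.0f} nanokelvin")
    ```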

    Cold Atom Laboratory researchers used lasers to optically cool rubidium atoms to temperatures almost a million times colder than that of the depths of space. The atoms were then magnetically trapped, and radio waves were used to cool the atoms 100 times lower. The radiofrequency radiation acts like a knife, slicing away the hottest atoms from the trap so that only the coldest remain.

    The research is at the point where this process can reliably create a Bose-Einstein condensate in just seconds.

    “This was a tremendous accomplishment for the CAL team. It confirms the fidelity of the instrument system design and provides us a facility to perform science and hardware verifications before we get to the space station,” said CAL Project Manager Anita Sengupta of JPL.

    While so far, the Cold Atom Laboratory researchers have created Bose-Einstein condensates with rubidium atoms, eventually they will also add in potassium. The behavior of two condensates mixing together will be fascinating for physicists to observe, especially in space.

    Besides merely creating Bose-Einstein condensates, CAL provides a suite of tools to manipulate and probe these quantum gases in a variety of ways. It has a unique role as a facility for the atomic, molecular and optical physics community to study cold atomic physics in microgravity, said David Aveline of JPL, CAL ground testbed lead.

    “Instead of a state-of-the-art telescope looking outward into the cosmos, CAL will look inward, exploring physics at the atomic scale,” Aveline said.

    JPL is developing the Cold Atom Laboratory sponsored by the International Space Station Program at NASA’s Johnson Space Center in Houston.

    The Space Life and Physical Sciences Division of NASA’s Human Exploration and Operations Mission Directorate at NASA Headquarters in Washington manages the Fundamental Physics Program.

    See the full article here.

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge [1], on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

  • richardmitnick 1:59 pm on September 26, 2014 Permalink | Reply
    Tags: Applied Research & Technology, , ,   

    From PNNL: “Off-shore Power Potential Floating in the Wind” 


    PNNL Lab

    September 2014
    Web Publishing Services

    Results: Two bright yellow buoys – each worth $1.3 million – are being deployed by Pacific Northwest National Laboratory in Washington State’s Sequim Bay. The massive, 20,000-pound buoys are decked out with the latest in meteorological and oceanographic equipment to enable more accurate predictions of the power-producing potential of winds that blow off U.S. shores. Starting in November, they will be commissioned for up to a year at two offshore wind demonstration projects: one near Coos Bay, Oregon, and another near Virginia Beach, Virginia.

    PNNL staff conduct tests in Sequim Bay, Washington, while aboard one of two new research buoys being commissioned to more accurately predict offshore wind’s power-producing potential.

    “We know offshore winds are powerful, but these buoys will allow us to better understand exactly how strong they really are at the heights of wind turbines,” said PNNL atmospheric scientist Dr. William J. Shaw. “Data provided by the buoys will give us a much clearer picture of how much power can be generated at specific sites along the American coastline – and enable us to generate that clean, renewable power sooner.”

    Why It Matters: Offshore wind is a new frontier for U.S. renewable energy developers. There’s tremendous power-producing potential, but limited information is available about ocean-based wind resources. A recent report estimated the U.S. could power nearly 17 million homes by generating more than 54 gigawatts of offshore wind energy, but more information is needed.
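
    Those two numbers are roughly consistent, as a back-of-envelope check shows. The capacity factor and average household draw below are assumptions chosen for illustration, not values from the report the article cites.

    ```python
    # Sanity check: does 54 GW of offshore wind plausibly power ~17 million homes?
    installed_capacity_gw = 54
    capacity_factor = 0.40        # assumed average output as a fraction of nameplate
    avg_home_demand_kw = 1.2      # assumed average U.S. household draw (~10,500 kWh/yr)

    average_output_kw = installed_capacity_gw * 1e6 * capacity_factor
    homes_powered = average_output_kw / avg_home_demand_kw
    print(f"~{homes_powered / 1e6:.0f} million homes")   # lands in the same ballpark
    ```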

    See the full article here.

    Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

  • richardmitnick 11:31 am on September 26, 2014 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From BNL: “New NIH/DOE Grant for Life Science Studies at NSLS-II” 

    Brookhaven Lab

    September 23, 2014
    Karen McNulty Walsh, (631) 344-8350 or Peter Genzer, (631) 344-3174

    Funding will support operation of three powerful experimental stations designed to reveal detailed structures of proteins, viruses, and more.

    Much of our understanding of how living things function comes from knowledge of structures—atomic details of enzymes that catalyze the processes of life, the receptors that are docking stations for viruses and messenger chemicals, and the nucleic acids DNA and RNA that carry genetic blueprints for building cellular machinery, to name a few. To give scientists unprecedented access to these structural details, a new grant just awarded by the National Institutes of Health (NIH) and the U.S. Department of Energy (DOE) will fund the operation of a suite of powerful experimental tools at DOE’s Brookhaven National Laboratory.

    “With the enhanced capabilities of NSLS-II, researchers will be able to look at smaller specimens of bigger biomolecular complexes and to learn more about them.”
    — Robert Sweet

    Brookhaven’s NSLS-II

    The tools, being built with a previous award to Brookhaven Lab of $45 million (M) from NIH in 2010, are being installed at three beamlines at the National Synchrotron Light Source II (NSLS-II), the nation’s newest and most advanced state-of-the-art synchrotron research facility and a DOE Office of Science User Facility, nearing completion at Brookhaven Lab. The new five-year grant—an estimated $17.5 M from DOE’s Office of Science, and $15.6 M from the NIH’s National Institute of General Medical Sciences—will create the Life Science and Biomedical Technology Research Resource (LSBR) to operate these new stations at NSLS-II and to develop new and improved technologies that will enable researchers to address challenging biological questions more effectively.

    The new light source—an electron storage ring that produces intense beams of x rays and other forms of light for studies across a wide array of sciences—is set to open in late 2014, replacing its predecessor, NSLS, which is shutting down at the end of September. Compared to the older facility, which has been one of the most productive scientific facilities in the world, NSLS-II will produce light beams that are 10,000 times brighter. Like NSLS, NSLS-II will serve researchers from academia, industry, other National Labs, and Brookhaven Lab.

    “With the enhanced capabilities of NSLS-II, researchers will be able to look at smaller specimens of bigger biomolecular complexes and to learn more about them,” said Brookhaven’s Robert Sweet, the Principal Investigator of the new grant. He will manage the LSBR along with Sean McSweeney, who recently joined the Lab to serve as Photon Science Associate Division Director for Structural Biology and LSBR Director.

    “Over the course of its thirty years of operation, NSLS served as a resource for thousands of scientists studying the structure and properties of matter,” Sweet said. “Roughly 40 percent of those scientists have studied life science in some form, with many making x-ray diffraction measurements on crystals of proteins, nucleic acids, viruses, and other substances critical to understanding life. We expect that the same will be true at NSLS-II, and designed these new powerful experimental stations specifically to meet these needs.”

    LSBR will operate the three new NSLS-II beamlines—two for macromolecular crystallography and one for general x-ray scattering studies, plus smaller programs in macromolecular crystallography correlated with optical spectroscopy, and x-ray fluorescence imaging. Between now and 2016 the staff of LSBR, numbering about twelve scientists and ten software specialists, engineers, and technical staff, will finish NSLS operations, help build and commission the new facilities, and then operate them for an extensive user program and for internal research.

    The team also plans to foster collaboration among multiple complementary disciplines in an environment where staff scientists and their instruments will share office space, labs, and computational facilities.

    “In effect, we are creating a Biology Village,” said Sweet.

    McSweeney is already expanding on this idea. “We plan interaction with other biology researchers at Brookhaven and beyond,” he said. “For example, we will enable studies of plants in their environment, allowing us to follow individual organisms over a wide range of scales with methods including electron microscopy, positron emission tomography, and bioinformatics, to track the appearance, disappearance, and flow of metabolites in the organism. We have a new $4M pilot project funded by the DOE Office of Science (BER) to develop new capabilities for this research.

    “The combined investments of DOE and NIH will provide firstly a world-leading synchrotron light source of unparalleled brightness and secondly the suite of tools to exploit this opportunity. Synergistic instrumentation developments will, we expect, lead to great opportunities for new biological science.”

    The new NIH grant described in this press release was awarded by the National Institute of General Medical Sciences of the National Institutes of Health under award number P41 GM111244-01. The DOE Office of Science project number is PAR-10-225. NSLS-II was constructed and its overall operation will be funded by the DOE Office of Science.

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 10:11 pm on September 25, 2014 Permalink | Reply
    Tags: Applied Research & Technology, , ,   

    From NOVA: “Genetically Engineering Almost Anything” 

    NOVA

    Thu, 17 Jul 2014
    Tim De Chant and Eleanor Nelsen

    When it comes to genetic engineering, we’re amateurs. Sure, we’ve known about DNA’s structure for more than 60 years, we first sequenced every A, T, C, and G in our bodies more than a decade ago, and we’re becoming increasingly adept at modifying the genes of a growing number of organisms.

    But compared with what’s coming next, all that will seem like child’s play. A new technology just announced today has the potential to wipe out diseases, turn back evolutionary clocks, and reengineer entire ecosystems, for better or worse. Because of how deeply this could affect us all, the scientists behind it want to start a discussion now, before all the pieces come together over the next few months or years. This is a scientific discovery being played out in real time.

    Scientists have figured out how to use a cell’s DNA repair mechanisms to spread traits throughout a population.

    Today, researchers aren’t just dropping in new genes, they’re deftly adding, subtracting, and rewriting them using a series of tools that have become ever more versatile and easier to use. In the last few years, our ability to edit genomes has improved at a shockingly rapid clip. So rapid, in fact, that one of the easiest and most popular tools, known as CRISPR-Cas9, is just two years old. Researchers once spent months, even years, attempting to rewrite an organism’s DNA. Now they spend days.

    Soon, though, scientists will begin combining gene editing with gene drives, so-called selfish genes that appear more frequently in offspring than normal genes, which have about a 50-50 chance of being passed on. With gene drives—so named because they drive a gene through a population—researchers just have to slip a new gene into a drive system and let nature take care of the rest. Subsequent generations of whatever species we choose to modify—frogs, weeds, mosquitoes—will have more and more individuals with that gene until, eventually, it’s everywhere.

    Cas9-based gene drives could be one of the most powerful technologies ever discovered by humankind. “This is one of the most exciting confluences of different theoretical approaches in science I’ve ever seen,” says Arthur Caplan, a bioethicist at New York University. “It merges population genetics, genetic engineering, molecular genetics, into an unbelievably powerful tool.”

    We’re not there yet, but we’re extraordinarily close. “Essentially, we have done all of the pieces, sometimes in the same relevant species,” says Kevin Esvelt, a postdoc at Harvard University and the wunderkind behind the new technology. “It’s just no one has put it all together.”

    It’s only a matter of time, though. The field is progressing rapidly. “We could easily have laboratory tests within the next few months and then field tests not long after that,” says George Church, a professor at Harvard University and Esvelt’s advisor. “That’s if everybody thinks it’s a good idea.”

    It’s likely not everyone will think this is a good idea. “There are clearly people who will object,” Caplan says. “I think the technique will be incredibly controversial.” Which is why Esvelt, Church, and their collaborators are publishing papers now, before the different parts of the puzzle have been assembled into a working whole.

    “If we’re going to talk about it at all in advance, rather than in the past tense,” Church says, “now is the time.”

    “Deleterious Genes”

    The first organism Esvelt wants to modify is the malaria-carrying mosquito Anopheles gambiae. While his approach is novel, the idea of controlling mosquito populations through genetic modification has actually been around since the late 1970s. Then, Edward F. Knipling, an entomologist with the U.S. Department of Agriculture, published a substantial handbook with a chapter titled Use of Insects for Their Own Destruction. One technique, he wrote, would be to modify certain individuals to carry “deleterious genes” that could be passed on generation after generation until they pervaded the entire population. It was an idea before its time. Knipling was on the right track, but he and his contemporaries lacked the tools to see it through.

    The concept surfaced a few more times before being picked up by Austin Burt, an evolutionary biologist and population geneticist at Imperial College London. It was the late 1990s, and Burt was busy with his yeast cells, studying their so-called homing endonucleases, enzymes that facilitate the copying of genes that code for themselves. Self-perpetuating genes, if you will. “Through those studies, gradually, I became more and more familiar with endonucleases, and I came across the idea that you might be able to change them to recognize new sequences,” Burt recalls.

    Other scientists were investigating endonucleases, too, but not in the way Burt was. “The people who were thinking along those lines, molecular biologists, were thinking about using these things for gene therapy,” Burt says. “My background in population biology led me to think about how they could be used to control populations that were particularly harmful.”

    In 2003, Burt penned an influential article that set the course for an entire field: We should be using homing endonucleases, a type of gene drive, to modify malaria-carrying mosquitoes, he said, not ourselves. Burt saw two ways of going about it—one, modify a mosquito’s genome to make it less hospitable to malaria, and two, skew the sex ratio of mosquito populations so there are no females for the males to reproduce with. In the following years, Burt and his collaborators tested both in the lab and with computer models before they settled on sex ratio distortion. (Making mosquitoes less hospitable to malaria would likely be a stopgap measure at best; the Plasmodium protozoans could evolve to cope with the genetic changes, just like they have evolved resistance to drugs.)

    Burt has spent the last 11 years refining various endonucleases, playing with different scenarios of inheritance, and surveying people in malaria-infested regions. Now, he finally feels like he is closing in on his ultimate goal. “There’s a lot to be done still,” he says. “But on the scale of years, not months or decades.”

    Cheating Natural Selection

    Cas9-based gene drives could compress that timeline even further. One half of the equation—gene drives—are the literal driving force behind proposed population-scale genetic engineering projects. They essentially let us exploit evolution to force a desired gene into every individual of a species. “To anthropomorphize horribly, the goal of a gene is to spread itself as much as possible,” Esvelt says. “And in order to do that, it wants to cheat inheritance as thoroughly as it can.” Gene drives are that cheat.

    Without gene drives, traits in genetically-engineered organisms released into the wild are vulnerable to dilution through natural selection. For organisms that have two parents and two sets of chromosomes (which includes humans, many plants, and most animals), traits typically have only a 50-50 chance of being inherited, give or take a few percent. Genes inserted by humans face those odds when it comes time to being passed on. But when it comes to survival in the wild, a genetically modified organism’s odds are often less than 50-50. Engineered traits may be beneficial to humans, but ultimately they tend to be detrimental to the organism without human assistance. Even some of the most painstakingly engineered transgenes will be gradually but inexorably eroded by natural selection.

    Some naturally occurring genes, though, have over millions of years learned how to cheat the system, inflating their odds of being inherited. Burt’s “selfish” endonucleases are one example. They take advantage of the cell’s own repair machinery to ensure that they show up on both chromosomes in a pair, giving them better than 50-50 odds when it comes time to reproduce.

    A gene drive (blue) always ends up in all offspring, even if only one parent has it. That means that, given enough generations, it will eventually spread through the entire population.

    Here’s how it generally works. The term “gene drive” is fairly generic, describing a number of different systems, but one example involves genes that code for an endonuclease—an enzyme which acts like a pair of molecular scissors—sitting in the middle of a longer sequence of DNA that the endonuclease is programmed to recognize. If one chromosome in a pair contains a gene drive but the other doesn’t, the endonuclease cuts the second chromosome’s DNA where the endonuclease code appears in the first.

    The broken strands of DNA trigger the cell’s repair mechanisms. In certain species and circumstances, the cell unwittingly uses the first chromosome as a template to repair the second. The repair machinery, seeing the loose ends that bookend the gene drive sequence, thinks the middle part—the code for the endonuclease—is missing and copies it onto the broken chromosome. Now both chromosomes have the complete gene drive. The next time the cell divides, splitting its chromosomes between the two new cells, both new cells will end up with a copy of the gene drive, too. If the entire process works properly, the gene drive’s odds of inheritance aren’t 50%, but 100%.
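
    A toy calculation makes the consequence of that 100% inheritance concrete. The sketch below (an illustration, not a model from any of the papers discussed here) tracks an allele's frequency across generations of random mating, once under ordinary 50-50 inheritance and once with an idealized, perfectly efficient drive in which every heterozygote transmits the allele.

    ```python
    def next_generation(p: float, drive: bool) -> float:
        """Allele frequency after one generation of random mating.

        Ordinary inheritance: heterozygotes pass the allele on half the time,
        so the frequency stays put. Idealized gene drive: the allele copies
        itself onto the partner chromosome, so heterozygotes always pass it on.
        """
        homozygotes = p * p
        heterozygotes = 2 * p * (1 - p)
        transmission = 1.0 if drive else 0.5
        return homozygotes + transmission * heterozygotes

    p_mendel = p_drive = 0.01   # released modified individuals start at 1% of the gene pool
    for generation in range(1, 13):
        p_mendel = next_generation(p_mendel, drive=False)
        p_drive = next_generation(p_drive, drive=True)
        print(f"generation {generation:2d}:  ordinary {p_mendel:.3f}   gene drive {p_drive:.3f}")
    ```

    Under these idealized assumptions the drive allele climbs from 1% to essentially the whole population within about ten generations, while the ordinary allele stays at 1%; real drives are less efficient, and fitness costs and resistance would slow the spread.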

    Here, a mosquito with a gene drive (blue) mates with a mosquito without one (grey). In the offspring, one chromosome will have the drive. The endonuclease then slices into the drive-free DNA. When the strand gets repaired, the cell’s machinery uses the drive chromosome as a template, unwittingly copying the drive into the break.

    Most natural gene drives are picky about where on a strand of DNA they’ll cut, so they need to be modified if they’re to be useful for genetic engineering. For the last few years, geneticists have tried using genome-editing tools to build custom gene drives, but the process was laborious and expensive. With the discovery of CRISPR-Cas9 as a genome editing tool in 2012, though, that barrier evaporated. CRISPR is an ancient bacterial immune system which identifies the DNA of invading viruses and sends in an endonuclease, like Cas9, to chew it up. Researchers quickly realized that Cas9 could easily be reprogrammed to recognize nearly any sequence of DNA. All that’s needed is the right RNA sequence—easily ordered and shipped overnight—which Cas9 uses to search a strand of DNA for where to cut. This flexibility, Esvelt says, “lets us target, and therefore edit, pretty much anything we want.” And quickly.
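
    In software terms, that targeting step is essentially a string search: find the stretch of DNA matching the guide sequence, immediately followed by the short protospacer-adjacent motif (an "NGG" for the commonly used Cas9). The sketch below is a toy illustration; the genome and guide strings are invented, and real target selection also weighs mismatches, the opposite strand, and off-target scores.

    ```python
    import re

    def find_cas9_sites(dna: str, guide: str) -> list[int]:
        """Return start positions where `guide` is followed by an NGG PAM."""
        pattern = re.compile(re.escape(guide) + r"[ACGT]GG")
        return [match.start() for match in pattern.finditer(dna)]

    # Invented sequences, purely for illustration (real guides are ~20 nt long).
    guide = "GCTTAAGCCGTAGCT"
    genome = "AATT" + guide + "AGG" + "CCGGTTAACC" + guide + "TGG" + "AATC"

    print(find_cas9_sites(genome, guide))   # two candidate cut sites
    ```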

    Gene drives and Cas9 are each powerful on their own, but together they could significantly change biology. CRISPR-Cas9 allows researchers to edit genomes with unprecedented speed, and gene drives allow engineered genes to cheat the system, even if the altered gene weakens the organism. Simply by being coupled to a gene drive, an engineered gene can race throughout a population before it is weeded out. “Eventually, natural selection will win,” Esvelt says, but “gene drives just let us get ahead of the game.”

    Beyond Mosquitoes

    If there’s anywhere we could use a jump start, it’s in the fight against malaria. Each year, the disease kills over 200,000 people and sickens over 200 million more, most of whom are in Africa. The best new drugs we have to fight it are losing ground; the Plasmodium parasite is evolving resistance too quickly. And we’re nowhere close to releasing an effective vaccine. The direct costs of treating the disease are estimated at $12 billion, and the economies of affected countries grew 1.3% less per year, a substantial amount.

    Which is why Esvelt and Burt are both so intently focused on the disease. “If we target the mosquito, we don’t have to face resistance on the parasite itself. The idea is, we can just take out the vector and stop all transmission. It might even lead to eradication,” Esvelt says.

    Esvelt initially mulled over the idea of building Cas9-based gene drives in mosquitoes to do just that. He took the idea to Flaminia Catteruccia, a professor who studies malaria at the Harvard School of Public Health, and the two grew increasingly certain that such a system would not only work, but work well. As their discussions progressed, though, Esvelt realized they were “missing the forest for the trees.” Controlling malaria-carrying mosquitoes was just the start. Cas9-based gene drives were the real breakthrough. “If it lets us do this for mosquitoes, what is to stop us from potentially doing it for almost anything that is sexually reproducing?” he realized.

    In theory, nothing. But in reality, the system works best on fast-reproducing species, Esvelt says. Short generation times allow the trait to spread throughout a population more quickly. Mosquitoes are a perfect test case. If everything were to work perfectly, deleterious traits could sweep through populations of malaria-carrying mosquitoes in as few as five years, wiping them off the map.

    Other noxious species could be candidates, too. Certain invasive species, like mosquitoes in Hawaii or Asian carp in the Great Lakes, could be targeted with Cas9-based gene drives to either reduce their numbers or eliminate them completely. Agricultural weeds like horseweed that have evolved resistance to glyphosate, a herbicide that is broken down quickly in the soil, could have their susceptibility to the compound reintroduced, enabling more farmers to adopt no-till practices, which help conserve topsoil. And in the more distant future, Esvelt says, weeds could even be engineered to introduce vulnerabilities to completely benign substances, eliminating the need for toxic pesticides. The possibilities seem endless.

    The Decision

    Before any of that can happen, though, Esvelt and Church are adamant that the public help decide whether the research should move forward. “What we have here is potentially a general tool for altering wild populations,” Esvelt says. “We really want to make sure that we proceed down this path—if we decide to proceed down this path—as safely and responsibly as possible.”

    To kickstart the conversation, they partnered with the MIT political scientist Kenneth Oye and others to convene a series of workshops on the technology. “I thought it might be useful to get into the room people with slightly different material interests,” Oye says, so they invited regulators, nonprofits, companies, and environmental groups. The idea, he says, was to get people to meet several times, to gain trust before “decisions harden.” Despite the diverse viewpoints, Oye says there was surprising agreement among participants about what the important outstanding questions were.

    As the discussion enters the public sphere, tensions are certain to intensify. “I don’t care if it’s a weed or a blight, people still are going to say this is way too massive a genetic engineering project,” Caplan says. “Secondly, it’s altering things that are inherited, and that’s always been a bright line for genetic engineering.” Safety, too, will undoubtedly be a concern. As the power of a tool increases, so does its potential for catastrophe, and Cas9-based gene drives could be extraordinarily powerful.

    There’s also little in the way of precedent that we can use as a guide. Our experience with genetically modified foods would seem to be a good place to start, but they are relatively niche organisms that are heavily dependent on water and fertilizer. It’s pretty easy to keep them contained to a field. Not so with wild organisms; their potential to spread isn’t as limited.

    Aware of this, Esvelt and his colleagues are proposing a number of safeguards, including reversal drives that can undo earlier engineered genes. “We need to really make sure those work if we’re proposing to build a drive that is intended to modify a wild population,” Esvelt says.

    There are still other possible hurdles to surmount—lab-grown mosquitoes may not interbreed with wild ones, for example—but given how close this technology is to prime time, Caplan suggests researchers hew to a few initial ethical guidelines. One, use species that are detrimental to human health and don’t appear to fill a unique niche in the wild. (Malaria-carrying mosquitoes seem to fit that description.) Two, do as much work as possible using computer models. And three, researchers should continue to be transparent about their progress, as they have been. “I think the whole thing is hugely exciting,” Caplan says. “But the time to really get cracking on the legal/ethical infrastructure for this technology is right now.”

    Church agrees, though he’s also optimistic about the potential for Cas9-based gene drives. “I think we need to be cautious with all new technologies, especially all new technologies that are messing with nature in some way or another. But there’s also a risk of doing nothing,” Church says. “We have a population of 7 billion people. You have to deal with the environmental consequences of that.”

    See the full article here.

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 2:23 pm on September 25, 2014 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From MIT: “How to make stronger, ‘greener’ cement” 


    MIT News

    September 25, 2014
    David L. Chandler | MIT News Office

    Concrete is the world’s most-used construction material, and a leading contributor to global warming, producing as much as one-tenth of industry-generated greenhouse-gas emissions. Now a new study suggests a way in which those emissions could be reduced by more than half — and the result would be a stronger, more durable material.

    The findings come from the most detailed molecular analysis yet of the complex structure of concrete, which is a mixture of sand, gravel, water, and cement. Cement is made by cooking calcium-rich material, usually limestone, with silica-rich material — typically clay — at temperatures of 1,500 degrees Celsius, yielding a hard mass called “clinker.” This is then ground up into a powder. The decarbonation of limestone, and the heating of cement, are responsible for most of the material’s greenhouse-gas output.
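
    The limestone contribution is simple stoichiometry: calcination releases CO2 directly via CaCO3 → CaO + CO2, so the process emissions per tonne of material follow from the molar masses alone. The figures below are a generic textbook check, not numbers from the MIT study.

    ```python
    # Process CO2 released by limestone calcination: CaCO3 -> CaO + CO2.
    # Molar masses in g/mol (standard values).
    M_CACO3 = 100.09
    M_CAO = 56.08
    M_CO2 = 44.01

    co2_per_tonne_limestone = M_CO2 / M_CACO3   # tonnes CO2 per tonne CaCO3 decomposed
    co2_per_tonne_lime = M_CO2 / M_CAO          # tonnes CO2 per tonne CaO produced

    print(f"{co2_per_tonne_limestone:.2f} t CO2 per t limestone calcined")
    print(f"{co2_per_tonne_lime:.2f} t CO2 per t lime (CaO) produced")
    # About 0.44 t and 0.78 t respectively, before counting the fuel burned to
    # reach the roughly 1,500 degree Celsius kiln temperature mentioned above.
    ```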

    The new analysis suggests that reducing the ratio of calcium to silicate would not only cut those emissions, but would actually produce better, stronger concrete. These findings are described in the journal Nature Communications by MIT senior research scientist Roland Pellenq; professors Krystyn Van Vliet, Franz-Josef Ulm, Sidney Yip, and Markus Buehler; and eight co-authors at MIT and at CNRS in Marseille, France.

    “Cement is the most-used material on the planet,” Pellenq says, noting that its present usage is estimated to be three times that of steel. “There’s no other solution to sheltering mankind in a durable way — turning liquid into stone in 10 hours, easily, at room temperature. That’s the magic of cement.”

    In conventional cements, Pellenq explains, the calcium-to-silica ratio ranges anywhere from about 1.2 to 2.2, with 1.7 accepted as the standard. But the resulting molecular structures have never been compared in detail. Pellenq and his colleagues built a database of all these chemical formulations, finding that the optimum mixture was not the one typically used today, but rather a ratio of about 1.5.

    As the ratio varies, he says, the molecular structure of the hardened material progresses from a tightly ordered crystalline structure to a disordered glassy structure. They found the ratio of 1.5 parts calcium for every one part silica to be “a magical ratio,” Pellenq says, because at that point the material can achieve “two times the resistance of normal cement, in mechanical resistance to fracture, with some molecular-scale design.”

    The findings, Pellenq adds, were “validated against a large body of experimental data.” Since emissions related to concrete production are estimated to represent 5 to 10 percent of industrial greenhouse-gas emissions, he says, “any reduction in calcium content in the cement mix will have an impact on the CO2.” In fact, he says, the reduction in carbon emissions could be as much as 60 percent.
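
    Taking those two figures together gives a rough sense of scale. The short sketch below is a back-of-the-envelope check using only the numbers quoted in this article, not a result from the study itself: a cut of up to 60 percent in emissions that account for 5 to 10 percent of industrial greenhouse-gas output would amount to roughly 3 to 6 percent of all industrial emissions.

        # Back-of-the-envelope arithmetic from the figures quoted above;
        # the percentages come from the article, nothing here is from the study itself.
        cement_share_low, cement_share_high = 0.05, 0.10  # cement's share of industrial GHG emissions
        max_reduction = 0.60                              # claimed cut in cement-related emissions

        saving_low = cement_share_low * max_reduction
        saving_high = cement_share_high * max_reduction
        print(f"Potential saving: {saving_low:.0%} to {saving_high:.0%} of industrial emissions")
        # -> Potential saving: 3% to 6% of industrial emissions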

    In addition to the overall improvement in mechanical strength, Pellenq says, because the material would be more glassy and less crystalline, there would be “no residual stresses in the material, so it would be more fracture-resistant.”

    The work is the culmination of five years of research by a collaborative team from MIT and CNRS, where Pellenq is research director. The two institutions have a joint laboratory at MIT called the Multi-Scale Materials Science for Energy and Environment, run by Pellenq and Ulm, who is director of MIT’s Concrete Sustainability Hub, and hosted by the MIT Energy Initiative.

    Because of its improved resistance to mechanical stress, Pellenq says the revised formulation could be of particular interest to the oil and gas industries, where cement around well casings is crucial to preventing leakage and blowouts. “More resistant cement certainly is something they would consider,” Pellenq says.

    So far, the work has remained at the molecular level of analysis, he says. “Next, we have to make sure these nanoscale properties translate to the mesoscale” — that is, to the engineering scale of applications for infrastructure, housing, and other uses.

    Zdeněk Bažant, a professor of civil and environmental engineering, mechanical engineering, and materials science and engineering at Northwestern University who was not involved in this research, says, “Roland Pellenq, with his group at MIT, is doing cutting-edge research, clarifying the nanostructure and properties of cement hydrates.”

    The Concrete Sustainability Hub is supported by the Portland Cement Association and the Ready Mixed Concrete Research and Education Foundation.

    See the full article here.

  • richardmitnick 1:32 pm on September 25, 2014 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From LLNL: “From RAGS to riches” 


    Lawrence Livermore National Laboratory

    09/25/2014
    Breanna Bishop, LLNL, (925) 423-9802, bishop33@llnl.gov

    The Radiochemical Analysis of Gaseous Samples (RAGS) diagnostic is a true trash-to-treasure story, turning debris from the National Ignition Facility’s (NIF) target chamber into valuable data that helps shape future experiments.

    NIF at LLNL

    The RAGS diagnostic, developed for NIF by Sandia National Laboratories and commissioned in 2012, is a cryogenic system designed to collect the gaseous debris from the NIF target chamber after a laser shot, then concentrate, purify and analyze the debris for radioactive gas products. Radiation detectors on the apparatus produce rapid, real-time measurements of the radioactivity content of the gas. Based on the results of the counting, the total number of radioactive atoms that were produced via nuclear reactions during a NIF shot can be determined.
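
    The step from a measured activity to a number of atoms is standard decay arithmetic. As an illustration only (not the actual RAGS analysis software): for a single radioactive species with half-life t½, the decay constant is λ = ln 2 / t½, the measured activity is A(t) = λ·N(t), and correcting for the decay between the shot and the measurement gives the number of atoms produced at shot time, N₀ = (A(t)/λ)·e^(λt). The sketch below applies this with purely hypothetical example values.

        import math

        # Minimal sketch of the decay correction described above
        # (illustrative only; not the NIF/RAGS analysis code).
        def atoms_at_shot_time(activity_bq, half_life_s, elapsed_s):
            """Atoms produced at shot time, from an activity measured elapsed_s later."""
            decay_const = math.log(2) / half_life_s               # lambda, in 1/s
            atoms_now = activity_bq / decay_const                 # N(t) = A(t) / lambda
            return atoms_now * math.exp(decay_const * elapsed_s)  # undo the decay back to t = 0

        # Hypothetical numbers, chosen only for illustration:
        n0 = atoms_at_shot_time(activity_bq=2.5e4, half_life_s=9.1 * 3600, elapsed_s=1800)
        print(f"Atoms produced at shot time: {n0:.3e}")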

    Members of the RAGS team with the apparatus. From left to right: Bill Cassata and Carol Velsko, primary RAGS operators and data analysts; Wolfgang Stoeffl, RAGS designer; and Dawn Shaughnessy, principal investigator for the project. Photo by Julie Russell/LLNL

    If the number of target atoms in the fuel capsule and/or hohlraum (cylinder surrounding the fuel capsule) was known prior to the shot, then the results from RAGS determine the number of reactions that occurred, which in turn is used to determine the flux of particles that was produced by the capsule as it underwent fusion. This information is used to validate models of NIF capsule performance under certain shot conditions.
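
    As a sketch of that inference (a simple thin-target approximation, not the actual analysis, with purely hypothetical numbers): if N_target atoms with reaction cross-section σ are exposed to a particle fluence Φ, the number of reactions is roughly R = N_target·σ·Φ, so the fluence follows as Φ = R / (N_target·σ).

        # Illustrative thin-target estimate (hypothetical values; not NIF analysis code):
        # reactions = n_target * sigma * fluence  =>  fluence = reactions / (n_target * sigma)
        BARN_IN_CM2 = 1e-24  # 1 barn expressed in cm^2

        def particle_fluence(reactions, n_target_atoms, sigma_barns):
            """Particle fluence (per cm^2) implied by a measured number of reactions."""
            return reactions / (n_target_atoms * sigma_barns * BARN_IN_CM2)

        phi = particle_fluence(reactions=1e9, n_target_atoms=1e17, sigma_barns=0.5)
        print(f"Implied fluence: {phi:.2e} particles/cm^2")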

    If specific materials are added to the capsule or hohlraum prior to the shot, then reactions related to a particular experiment can be measured. For instance, there have been gas-based experiments designed to measure areal density (a measure of the combined thickness and density of the imploding frozen fuel shell) and mix (a potentially undesirable condition during which spikes of the plastic rocket shell penetrate to the core of the hot fuel and cool it, decreasing the probability of igniting a sustained fusion reaction with energy gain).

    “Radiochemical diagnostics probe reactions that occur within the capsule or hohlraum material. By adding materials into the capsule ablator and subsequently measuring the resulting products, we can explore certain capsule parameters such as fuel-ablator mix,” said radiochemist and RAGS principal investigator Dawn Shaughnessy. “There are plans to add isotopes of xenon gas into capsules specifically for this purpose – to quantify the amount of mix that occurs during a NIF implosion.”

    RAGS can also be used to perform basic nuclear science experiments. Recently, the diagnostic has been employed during shots where the hohlraum contained small amounts of depleted uranium. Gaseous fission fragments were collected by RAGS, including very short-lived species with half-lives on the order of a few seconds. Based on these observations, there are plans to use RAGS in the future to measure independent fission product yields of gaseous species, which is difficult to do at traditional neutron sources.

    “This opens up the possibility of also using RAGS for fundamental science experiments, such as measuring reaction rates of species relevant to nuclear astrophysics, and measuring independent fission yields,” Shaughnessy said.

    In addition to Shaughnessy as PI, other contributors to RAGS include: Tony Golod and Jay Rouse, who wrote the NIF control software that collects data and operates the diagnostic; Wolfgang Stoeffl, who designed the RAGS apparatus, and Allen Riddle, who built it; Don Jedlovec, who serves as the responsible system engineer; and Carol Velsko and Bill Cassata, who are the primary RAGS operators and data analysts.

    See the full article here.

    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration.

     
  • richardmitnick 7:58 pm on September 24, 2014 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From phys.org: “New dinosaur from New Mexico has relatives in Alberta” 

    phys.org

    9/24/2014
    No Writer Credit

    A newly discovered armoured dinosaur from New Mexico has close ties to the dinosaurs of Alberta, say University of Alberta paleontologists involved in the research.

    From 76 to 66 million years ago, Alberta was home to at least five species of ankylosaurid dinosaurs, the group that includes club-tailed giants like Ankylosaurus. But fewer ankylosaurids are known from the southern parts of North America. The new species, Ziapelta sanjuanensis, was discovered in 2011 in the Bisti/De-na-zin Wilderness area of New Mexico by a team from the New Mexico Museum of Natural History and Science and the State Museum of Pennsylvania.

    Bisti Badlands

    Sphinx(?) in Bisti Badlands

    The U of A researchers in the Faculty of Science, including recent PhD graduate Victoria Arbour and current doctoral student Michael Burns, were asked to be part of the project because of their expertise in the diversity of ankylosaurs from Alberta.

    “Bob Sullivan, who discovered the specimen, showed us pictures, and we were really excited by both its familiarity and its distinctiveness—we were pretty sure right away we were dealing with a new species that was closely related to the ankylosaurs we find in Alberta,” says Arbour.

    Ziapelta is described in a new paper in PLOS ONE. It stands out from other ankylosaurs because of unusually tall spikes on the cervical half ring, a structure like a yoke of bone sitting over the neck. Ziapelta’s skull also differentiates it from other known ankylosaurs.

    “The horns on the back of the skull are thick and curve downwards, and the snout has a mixture of flat and bumpy scales—an unusual feature for an ankylosaurid,” notes Arbour. “There’s also a distinctive large triangular scale on the snout, where many other ankylosaurids have a hexagonal scale.”

    University of Alberta researchers Michael Burns and Victoria Arbour display a fossil from a newly discovered armoured dinosaur called Ziapelta sanjuanensis. Credit: Supplied

    Ziapelta hails from the Late Cretaceous, when a vast inland sea divided North America in two, and Alberta and New Mexico each boasted beachfront property. Ankylosaur fossils are common in several of the rocky formations of Southern Alberta, but none have yet been found in the lower part of an area called the Horseshoe Canyon Formation—a gap in Alberta’s ankylosaur fossil record.

    Horseshoe Canyon Formation at its type locality in Horseshoe Canyon, near Drumheller. The dark bands are coal seams.

    “The rocks in New Mexico fill in this gap in time, and that’s where Ziapelta occurs,” says Arbour. “Could Ziapelta have lived in Alberta, in the gap where we haven’t found any ankylosaur fossils yet? It’s possible, but in recent years there has also been increasing evidence that the dinosaurs from the southern part of North America—New Mexico, Texas, and Utah, for example—are distinct from their northern neighbours in Alberta.”

    Arbour says Ziapelta may have belonged to this group of southern dinosaurs—but more fossils could be waiting to be unearthed in Alberta’s Badlands. “We should be on the lookout for Ziapelta fossils in the Horseshoe Canyon Formation in the future.”

    See the full article here.

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service that covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.
