Tagged: Applied Research & Technology

  • richardmitnick 3:24 pm on October 25, 2014
    Tags: Applied Research & Technology

    From livescience: “Telltale Signs of Life Could Be Deepest Yet” 

    Livescience

    October 24, 2014
    Becky Oskin

    Telltale signs of life have been discovered in rocks that were once 12 miles (20 kilometers) below Earth’s surface — some of the deepest chemical evidence for life ever found.

    White aragonite veins on Washington’s San Juan Islands may contain evidence of deep microbial life.
    Credit: Philippa Stoddard

    Researchers found carbon isotopes in rocks on Washington state’s South Lopez Island that suggest the minerals grew from fluids flush with microbial methane. Methane from living creatures has distinct levels of carbon isotopes that distinguish it from methane gas that arises from rocks. (Isotopes are atoms of the same element with different numbers of neutrons in their nuclei.)

    In a calcium carbonate mineral called aragonite, the standard mix of carbon isotopes was radically shifted toward lighter carbon isotopes (by about 50 per mil, or parts per thousand). This ratio is characteristic of methane gas made by microorganisms, said Philippa Stoddard, an undergraduate student at Yale University who presented the research Tuesday (Oct. 21) at the Geological Society of America’s annual meeting in Vancouver, British Columbia. “These really light signals are only observed when you have biological processes,” she told Live Science.
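
    As an aside, the "per mil" shift quoted above follows the standard delta notation for carbon isotope ratios, which the article does not spell out. A minimal sketch of that conventional definition (not given in the article) is:

    ```latex
    \delta^{13}\mathrm{C} \;=\; \left( \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1 \right) \times 1000 \ \text{per mil}
    ```

    Strongly negative (isotopically light) values of this ratio are the biological signature Stoddard describes; methane that arises from rocks is far less depleted in the lighter isotope.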

    The pale aragonite veins cut through basalt rocks that sat offshore North America millions of years ago. The veins formed after the basalt was sucked into an ancient subduction zone, one that predated today’s Cascadia subduction zone. Two tectonic plates smash together at subduction zones, and one plate descends under the other, creating deep trenches.

    Methane gas supplied the carbon as aragonite crystallized in cracks in the basalt, and replaced pre-existing limestone. The researchers think that microbes produced the methane gas as a waste product.

    “We reason that you could have life deeper in subduction zones, because you have a lot of water embedded in those rocks, and the rocks stay cold longer as the [plate] comes down,” Stoddard said.

    But the South Lopez Island aragonite suggests the minerals formed under extreme conditions that push the limits of life on Earth. For example, temperatures reached more than 250 degrees Fahrenheit (122 degrees Celsius), above the stability limit for DNA, Stoddard said. However, the researchers think the higher pressures at these depths may have counterbalanced the effects of the heat. The rocks are now visible thanks to faulting, which pushed them back up to the surface.

    Stoddard and her collaborators plan to sample more of the aragonite and other rocks nearby, to gain a better understanding of where the fluids came from and pin down the temperatures at which the rocks formed.

    Methane seeps teeming with millions of microbes are found on the seafloor offshore Washington and Oregon along the Cascadia subduction zone. And multicellular life has been documented in the Mariana Trench, the deepest spot on Earth, and in South African mines 0.8 miles (1.3 km) deep. Researchers also have discovered microbes feasting on rocks within the oceanic crust itself.

    See the full article here.

  • richardmitnick 7:38 pm on October 23, 2014
    Tags: Applied Research & Technology

    From BNL: “National Synchrotron Light Source II Achieves ‘First Light'” 

    Brookhaven Lab

    October 23, 2014
    Chelsea Whyte, (631) 344-8671 or Peter Genzer, (631) 344-3174

    The National Synchrotron Light Source II detects its first photons, beginning a new phase of the facility’s operations. Scientific experiments at NSLS-II are expected to begin before the end of the year.

    A crowd gathered on the experimental floor of the National Synchrotron Light Source II to witness “first light,” when the x-ray beam entered a beamline for the first time at the facility.

    The brightest synchrotron light source in the world has delivered its first x-ray beams. The National Synchrotron Light Source II (NSLS-II) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory achieved “first light” on October 23, 2014, when operators opened the shutter to begin commissioning the first experimental station (called a beamline), allowing powerful x-rays to travel to a phosphor detector and capture the facility’s first photons. While considerable work remains to realize the full potential of the new facility, first light counts as an important step on the road to facility commissioning.

    BNL NSLS II
    BNL NSLS-II Interior
    NSLS-II at BNL

    “This is a significant milestone for Brookhaven Lab, for the Department of Energy, and for the nation,” said Harriet Kung, DOE Associate Director of Science for Basic Energy Sciences. “The National Synchrotron Light Source II will foster new discoveries and create breakthroughs in crucial areas of national need, including energy security and the environment. This new U.S. user facility will advance the Department’s mission and play a leadership role in enabling and producing high-impact research for many years to come.”

    At 10:32 a.m. on October 23, a crowd of scientists, engineers, and technicians gathered around the Coherent Soft X-ray Scattering (CSX) beamline at NSLS-II, expectantly watching the video feed from inside a lead-lined hutch where the x-ray beam eventually struck the detector. As the x-rays hit the detector, cheers and applause rang out across the experimental hall for a milestone many years in the making.

    The team of scientists, engineers, and technicians at the Coherent Soft X-ray Scattering (CSX) beamline gathered around the control station to watch as group leader Stuart Wilkins (seated, front) opened the shutter between the beamline and the storage ring, allowing x-rays to enter the first optical enclosure for the first time.

    “This achievement begins an exciting new chapter of synchrotron science at Brookhaven, building on the remarkable legacy of NSLS, and leading us in new directions we could not have imagined before,” said Laboratory Director Doon Gibbs. “It’s a great illustration of the ways that national labs continually evolve and grow to meet national needs, and it’s a wonderful time for all of us. Everyone at the Lab, in every role, supports our science, so we can all share in the sense of excitement and take pride in this accomplishment.”

    NSLS-II first x-rays
    Inside the beamline enclosure, a phosphor detector (the rectangle at right) captured the first x-rays (in white) which hit the mark dead center.

    In the heart of the 590,000-square-foot facility, an electron gun emits packets of the negatively charged particles, which travel down a linear accelerator into a booster ring. There, the electrons are brought to nearly the speed of light and then steered into the storage ring, where powerful magnets guide the beam on a half-mile circuit. As the electrons travel around the ring, they emit extremely intense x-rays, which are guided down beamlines into experimental end stations where scientists will carry out research. NSLS-II accelerator operators had previously stored beam in the storage ring, but had not opened the shutters to let x-ray light reach a detector until today’s celebrated achievement.

    “We have been eagerly anticipating this culmination of nearly a decade of design, construction, and testing and the sustained effort and dedication of hundreds of individuals who made it possible,” said Steve Dierker, Associate Laboratory Director for Photon Sciences. “We have more work to do, but soon researchers from around the world will start using NSLS-II to advance their research on everything from new energy storage materials to developing new drugs to fight disease. I’m very much looking forward to the discoveries that NSLS-II will enable, and to the continuing legacy of groundbreaking synchrotron research at Brookhaven.”

    NSLS-II, a third-generation synchrotron light source, will be the newest and most advanced synchrotron facility in the world, enabling research not possible anywhere else. As a DOE Office of Science User Facility, it will offer researchers from academia, industry, and national laboratories new ways to study material properties and functions with nanoscale resolution and exquisite sensitivity by providing state-of-the-art capabilities for x-ray imaging, scattering, and spectroscopy.

    Currently 30 beamlines are under development to take advantage of the high brightness of the x-rays at NSLS-II. Commissioning of the first group of seven beamlines will begin in the coming months, with first experiments beginning at the CSX beamline before the end of 2014.

    At the NSLS-II beamlines, scientists will be able to generate images of the structure of materials such as lithium-ion batteries or biological proteins at the nanoscale level—research expected to advance many fields of science and impact people’s quality of life in the years to come.

    NSLS-II will support the Department of Energy’s scientific mission by providing the most advanced tools for discovery-class science in condensed matter and materials science, physics, chemistry, and biology—science that ultimately will enhance national and energy security and help drive abundant, safe, and clean energy technologies.

    Media Contacts:
    Karen McNulty Walsh, 631 344-8350 or kmcnulty@bnl.gov
    Chelsea Whyte, 631 344-8671 or cwhyte@bnl.gov

    See the full article here.

    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 5:28 pm on October 23, 2014
    Tags: Applied Research & Technology

    From CEP at WCG: “A productive summer for the Clean Energy Project” 

    New WCG Logo

    23 Oct 2014
    By: The Clean Energy Project team
    Harvard University

    Summary
    The Clean Energy Project team has an end-of-summer update for all the World Community Grid volunteers. Several changes to the database and work units were put in place over the summer. The team sends a big thank-you to the volunteers who make this work possible, as well as to the lab’s summer students and the departing CEP web developer.

    Hi all!

    The time has come for another update on The Clean Energy Project – Phase 2 (CEP) on World Community Grid.

    Wow, it has been a busy and productive summer! Our redesign of the database is complete, and all new jobs are being created from, and their results being stored into, the new design. This will give us a much more quickly searchable database, capable of storing a wider variety of data – very exciting! The data that has been produced so far is being parsed into this structure as well, and is also being recompressed using a more efficient algorithm. We estimate that this recompression will save us a significant amount of storage space, meaning we can now store more results than ever!

    We were very lucky to have three brilliant students work on the CEP over the summer: Kewei, Trevor and Daniel. They were mainly focused on harnessing the power of machine learning techniques to improve how we generate molecules. Their research was very promising, and we hope to write it up into a paper or two in the near future – well done, guys! In fact, two of them (Kewei and Trevor) have agreed to continue working with us during term time, and we hope to get many more exciting projects done. We will keep you all posted on those as details emerge.

    As you have probably seen in the forums, we have had a redesign of the structure of the work units. We want to thank everyone for their patience while we sorted out all the “teething” problems, but they now seem to be working well. The reason for these changes was to allow us to try and move onto slightly different families of molecules which we have identified as being particularly interesting. It is important for the CEP to be constantly updating the molecular libraries so we can really live on the cutting edge, and hopefully discover the next “blockbuster” Organic Photovoltaic molecule (the type of molecule the CEP is looking for). To do this, we have to push up against the limits of what is possible on the grid, and we really appreciate the patience of the crunchers when we occasionally push too hard!

    We have also changed the way that we build the molecules for these libraries. This was done in order to prioritize molecules that are more synthesizable (i.e. easier for our experimental friends to make in a lab). This is a win-win, because we are also able to sample a more diverse area of chemical space.

    Thanks to all the crunchers and our friends at IBM; without you the project literally would not happen!

    We would also like to take a moment to give a big thank you to Carolina Roman-Salgado, our awesome web developer. She is moving to California at the end of September, and so will be leaving the CEP. Carolina has been absolutely fantastic in working with the CEP database and molecularspace.org (where our results are all hosted for public access), and has recently been working on an update, which we hope to release soon. Aside from her brilliant work, we will really miss having Carolina around the office – please don’t wait too long before you come visit, Carolina; you will always be welcome here!

    Your Harvard CEP Team

    See the full article here.

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.

    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC stands for the Berkeley Open Infrastructure for Network Computing.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BETCHA!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    World Community Grid is a social initiative of IBM Corporation
    IBM Corporation

    IBM – Smarter Planet

  • richardmitnick 6:24 pm on October 21, 2014
    Tags: Applied Research & Technology, Brain Studies

    From Princeton: “Immune proteins moonlight to regulate brain-cell connections” 

    Princeton University

    October 21, 2014
    Morgan Kelly, Office of Communications

    When it comes to the brain, “more is better” seems like an obvious assumption. But in the case of synapses, which are the connections between brain cells, too many or too few can both disrupt brain function.

    Researchers from Princeton University and the University of California-San Diego (UCSD) recently found that an immune-system protein called MHCI, or major histocompatibility complex class I, moonlights in the nervous system to help regulate the number of synapses, which transmit chemical and electrical signals between neurons. The researchers report in the Journal of Neuroscience that in the brain MHCI could play an unexpected role in conditions such as Alzheimer’s disease, type II diabetes and autism.

    MHCI proteins are known for their role in the immune system where they present protein fragments from pathogens and cancerous cells to T cells, which are white blood cells with a central role in the body’s response to infection. This presentation allows T cells to recognize and kill infected and cancerous cells.

    In the brain, however, the researchers found that MHCI immune molecules are one of the only known factors that limit the density of synapses, ensuring that synapses form in the appropriate numbers necessary to support healthy brain function. MHCI limits synapse density by inhibiting insulin receptors, which regulate the body’s sugar metabolism and, in the brain, promote synapse formation.

    Tangled web

    Researchers from Princeton University and the University of California-San Diego recently found that an immune-system protein called MHCI, or major histocompatibility complex class I, moonlights in the nervous system to help regulate the number of synapses, which transmit chemical and electrical signals between neurons. Pictured is a mouse hippocampal neuron studded with thousands of synaptic connections (yellow). The number and location of synapses — not too many or too few — is critical to healthy brain function. The researchers found that MHCI proteins, known for their role in the immune system, also are one of the only known factors that ensure synapse density is not too high. The protein does so by inhibiting insulin receptors, which promote synapse formation. (Image courtesy of Lisa Boulanger, Department of Molecular Biology)

    Senior author Lisa Boulanger, an assistant professor in the Department of Molecular Biology and the Princeton Neuroscience Institute (PNI), said that MHCI’s role in ensuring appropriate insulin signaling and synapse density raises the possibility that changes in the protein’s activity could contribute to conditions such as Alzheimer’s disease, type II diabetes and autism. These conditions have all been associated with a complex combination of disrupted insulin-signaling pathways, changes in synapse density, and inflammation, which activates immune-system molecules such as MHCI.

    Patients with type II diabetes develop “insulin resistance” in which insulin receptors become incapable of responding to insulin, the reason for which is unknown, Boulanger said. Similarly, patients with Alzheimer’s disease develop insulin resistance in the brain that is so pronounced some have dubbed the disease “type III diabetes,” Boulanger said.

    “Our results suggest that changes in MHCI immune proteins could contribute to disorders of insulin resistance,” Boulanger said. “For example, chronic inflammation is associated with type II diabetes, but the reason for this link has remained a mystery. Our results suggest that inflammation-induced changes in MHCI could have consequences for insulin signaling in neurons and maybe elsewhere.”

    This image of a neuron from a mouse hippocampus shows insulin receptors (green) and the protein calbindin (red). In this area of the brain, calbindin is present in dentate granule cells, which form synapses on MHCI-expressing cells. The extensive overlap (yellow) suggests that this neuron, which expresses insulin receptors, is a dentate granule cell neuron. (Image courtesy of Lisa Boulanger, Department of Molecular Biology)

    MHCI levels also are “dramatically altered” in the brains of people with Alzheimer’s disease, Boulanger said. Normal memory depends on appropriate levels of MHCI. Boulanger was senior author on a 2013 paper in the journal Learning and Memory that found that mice bred to produce less functional MHCI proteins exhibited striking changes in the function of the hippocampus, a part of the brain where some memories are formed, and had severe memory impairments.

    “MHCI levels are altered in the Alzheimer’s brain, and altering MHCI levels in mice disrupts memory, reduces synapse number and causes neuronal insulin resistance, all of which are core features of Alzheimer’s disease,” Boulanger said.

    Links between MHCI and autism also are emerging, Boulanger said. People with autism have more synapses than usual in specific brain regions. In addition, several autism-associated genes regulate synapse number, often via a signaling protein known as mTOR (mammalian target of rapamycin). In their study, Boulanger and her co-authors found that mice with reduced levels of MHCI had increased insulin-receptor signaling via the mTOR pathway, and, consequently, more synapses. When elevated mTOR signaling was reduced in MHCI-deficient mice, normal synapse density was restored.

    Thus, Boulanger said, MHCI and autism-associated genes appear to converge on the mTOR-synapse regulation pathway. This is intriguing given that inflammation during pregnancy, which alters MHCI levels in the fetal brain, may slightly increase the risk of autism in genetically predisposed individuals, she said.

    “Up-regulating MHCI is essential for the maternal immune response, but changing MHCI activity in the fetal brain when synaptic connections are being formed could potentially affect synapse density,” Boulanger said.

    Ben Barres, a professor of neurobiology, developmental biology and neurology at the Stanford University School of Medicine, said that while it is known that insulin-receptor signaling increases synapse density and that MHCI signaling decreases it, the researchers are the first to show that MHCI actually affects insulin receptors to control synapse density.

    “The idea that there could be a direct interaction between these two signaling systems comes as a great surprise,” said Barres, who was not involved in the research. “This discovery not only will lead to new insight into how brain circuitry develops but to new insight into declining brain function that occurs with aging.”

    This section of adult mouse cerebellum shows insulin receptors (green) and calbindin (red), which in this case is present in the cerebellar neurons known as Purkinje cells. Insulin receptors are highly expressed in fibers that form synapses onto Purkinje cells, which express MHCI. Thus both in the cerebellum and hippocampus (previous image), insulin receptors are highly expressed in cells that form synapses onto MHCI-expressing neurons, which suggests MHCI and insulin receptors could interact, either directly or indirectly, in the living brain. (Image courtesy of Lisa Boulanger, Department of Molecular Biology)

    Particularly, the research suggests a possible functional connection between type II diabetes and Alzheimer’s disease, Barres said.

    “Type II diabetes has recently emerged as a risk factor for Alzheimer’s disease but it has not been clear what the connection is to the synapse loss experienced with Alzheimer’s disease,” he said. “Given that type II diabetes is accompanied by decreased insulin responsiveness, it may be that the MHCI signaling becomes able to overcome normal insulin signaling and contribute to synapse decline in this disease.”

    Research during the past 15 years has shown that MHCI lives a prolific double-life in the brain, Boulanger said. The brain is “immune privileged,” meaning the immune system doesn’t respond as rapidly or effectively to perceived threats in the brain. Dozens of studies have shown, however, that MHCI is not only present throughout the healthy brain, but is essential for normal brain development and function, Boulanger said. A 2013 paper from her lab published in the journal Molecular and Cellular Neuroscience showed that MHCI is even present in the fetal-mouse brain, at a stage when the immune system is not yet mature.

    “Many people thought that immune molecules like MHCI must be missing from the brain,” Boulanger said. “It turns out that MHCI immune proteins do operate in the brain — they just do something completely different. The dual roles of these proteins in the immune system and nervous system may allow them to mediate both harmful and beneficial interactions between the two systems.”

    The paper, MHC Class I Limits Hippocampal Synapse Density by Inhibiting Neuronal Insulin Receptor Signaling, was published Aug. 27 in the Journal of Neuroscience. Boulanger worked with Carolyn Tyler, a postdoctoral research fellow in PNI; Julianna Poole, who received her master’s degree in molecular biology from Princeton in 2014; Princeton senior Joseph Park; and Lawrence Fourgeaud and Tracy Dixon-Salazar, both at UCSD. The work was supported by the Whitehall Foundation; the Sloan Foundation; Cure Autism Now; the Princeton Neuroscience Institute Innovation Fund; the Silvio Varon Chair in Neuroregeneration at UCSD; Autism Speaks; and the National Science Foundation.

    See the full article here.

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

    Princeton Shield
  • richardmitnick 4:19 pm on October 21, 2014
    Tags: Applied Research & Technology, USGS

    From livescience: “Earthquake Forecast: 4 California Faults Are Ready to Rupture” 

    Livescience

    October 13, 2014
    Becky Oskin

    With several faults slicing through the San Francisco Bay Area, forecasting the next deadly earthquake becomes a question of when and where, not if.

    Now researchers propose that four faults have built up enough seismic strain (stored energy) to unleash destructive earthquakes, according to a study published today (Oct. 13) in the Bulletin of the Seismological Society of America.

    The quartet includes the Hayward Fault, the Rodgers Creek Fault, the Green Valley Fault and the Calaveras Fault. While all are smaller pieces of California’s San Andreas Fault system, which is more than 800 miles (1,300 kilometers) long, the four faults are a serious threat because they directly underlie cities. [Photo Journal: The Gorgeous San Andreas Fault]

    San Francisco Bay Area earthquake faults are drawn in red.

    USGS diagram of the San Andreas Fault, 14 March 2006

    “The Hayward Fault is just right in the heart of where people live, and the most buildings and the most infrastructure,” said Jim Lienkaemper, lead study author and a research geophysicist at the U.S. Geological Survey’s Earthquake Science Center in Menlo Park, California. “But it’s not just one fault, it’s the whole shopping basket. If you are in the middle of the Bay Area, you are near a whole lot of faults, and I’m concerned about all of them.”

    Lienkaemper and his colleagues gauged the potential for destructive earthquakes by monitoring tiny surface shifts along California faults. Certain faults are in constant motion, creeping steadily by less than 0.4 inches (1 centimeter) each year. These slow movements add up over time, cracking sidewalk curbs and buildings. They also serve as clues to what’s happening deep below ground, where earthquakes strike.

    “If you figure out where faults are creeping, it tells you where they’re locked and how much they’re locked,” Lienkaemper told Live Science.

    Fault creep varies, with some faults sliding at a snail’s pace and others barely budging. Models suggest that the diversity comes from locked zones that are 3 to 6 miles (5 to 10 km) below the surface, where the fault is stuck instead of sliding. For example, the relatively fast-creeping southern Hayward Fault is only about 40 percent locked, on average, while the slow-creeping Rodgers Creek Fault is 89 percent locked, the study reports. When these locked areas build up a critical amount of strain, they break apart in an earthquake.

    Map of Bay Area earthquake faults and creep measurement sites.
    Credit: USGS

    Lienkaemper and his co-author estimated a fault’s future earthquake potential by combining creep measurements with mathematical fault models and other regional data, such as the time since the last earthquake.

    The Hayward Fault has banked enough energy for a magnitude-6.8 earthquake, according to the study. The Rodgers Creek Fault could trigger a magnitude-7.1 earthquake, and the Green Valley Fault also has the potential to unleash a magnitude-7.1 shaker. The Northern Calaveras Fault is set for a magnitude-6.8 temblor.
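
    Those magnitude figures come from the authors’ fault models, but the underlying arithmetic is essentially a seismic-moment budget: slip deficit accumulates on the locked portion of a fault at the long-term slip rate, and the stored moment converts to a moment magnitude. The sketch below is only illustrative, not the study’s code, and the Hayward-like numbers in the example are assumptions for demonstration.

    ```python
    import math

    def moment_magnitude(fault_length_km, fault_depth_km, locked_fraction,
                         slip_rate_mm_per_yr, years_since_last_quake,
                         shear_modulus_pa=3.0e10):
        """Rough seismic-moment budget for a locked fault patch.

        Slip deficit builds on the locked part of the fault at the long-term
        slip rate; seismic moment M0 = mu * area * slip (N*m), and
        Mw = (2/3) * (log10(M0) - 9.1) (Hanks-Kanamori, SI units).
        """
        area_m2 = (fault_length_km * 1e3) * (fault_depth_km * 1e3)
        slip_deficit_m = (locked_fraction * slip_rate_mm_per_yr * 1e-3
                          * years_since_last_quake)
        m0 = shear_modulus_pa * area_m2 * slip_deficit_m
        return (2.0 / 3.0) * (math.log10(m0) - 9.1)

    # Hypothetical inputs loosely in the range discussed for the Hayward Fault:
    # ~70 km of fault, ~12 km seismogenic depth, ~40% locked on average,
    # ~9 mm/yr long-term slip rate, 146 years since the 1868 earthquake.
    print(round(moment_magnitude(70, 12, 0.4, 9.0, 146), 1))
    ```

    With those assumed inputs the budget lands near magnitude 6.7, in the same range as the estimates quoted above; the real analysis folds in creep measurements, fault geometry and paleoseismic data rather than single representative values.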

    Of all Bay Area faults, the Hayward Fault is most likely to spawn a damaging earthquake in the next 30 years, scientists think. Its 1868 earthquake was called the Big One until the great 1906 San Francisco quake came along. The Hayward Fault has ruptured about every 140 years for its previous five large earthquakes. The probability of a magnitude-6.7 earthquake on the Hayward Fault is 30 percent in the next 30 years.

    Though 146 years have now passed since the last Hayward earthquake, that doesn’t mean the fault is overdue for another quake, Lienkaemper said. “The average is 160 years, but the uncertainty is plus or minus 100 years, which is almost as big as the time [interval] itself.” The 160-year average comes from an analysis of data collected from trenches dug across the fault that revealed evidence of earthquakes over thousands of years.

    The Rodgers Creek and Green Valley Faults are also closing in on their average repeat times between earthquakes.

    See the full article here.

  • richardmitnick 6:22 am on October 21, 2014
    Tags: Applied Research & Technology

    From SLAC: “Puzzling New Behavior Found in High-Temperature Superconductors” 


    SLAC Lab

    October 20, 2014

    Ultimate Goal: A Super-efficient Way to Conduct Electricity at Room Temperature

    Research by an international team led by SLAC and Stanford scientists has uncovered a new, unpredicted behavior in a copper oxide material that becomes superconducting – conducting electricity without any loss – at relatively high temperatures.

    This new phenomenon – an unforeseen collective motion of electric charges coursing through the material – presents a challenge to scientists seeking to understand its origin and connection with high-temperature superconductivity. Their ultimate goal is to design a superconducting material that works at room temperature.

    “Making a room-temperature superconductor would save the world enormous amounts of energy,” said Thomas Devereaux, leader of the research team and director of the Stanford Institute for Materials and Energy Sciences (SIMES), which is jointly run with SLAC. “But to do that we must understand what’s happening inside the materials as they become superconducting. This result adds a new piece to this long-standing puzzle.”

    The results are published Oct. 19 in Nature Physics.

    Delving Into Doping Differences

    The researchers used an emerging X-ray technique called resonant inelastic X-ray scattering, or RIXS, to measure how the properties of a copper oxide change as extra electrons are added in a process known as doping. The team used the Swiss Light Source’s RIXS instrument, which currently has the world’s highest resolution and can reveal atomic-scale excitations – rapid changes in magnetism, electrical charge and other properties – as they move through the material.

    Copper oxide, a ceramic that normally doesn’t conduct electricity at all, becomes superconducting only when doped with other elements to add or remove electrons and chilled to low temperatures. Intriguingly, the electron-rich version loses its superconductivity when warmed to about 30 degrees above absolute zero (30 kelvins) while the electron-poor one remains superconducting up to 120 kelvins (minus 244 degrees Fahrenheit). One of the goals of the new research is to understand why they behave so differently.

    The experiments revealed a surprising increase of magnetic energy and the emergence of a new collective excitation in the electron-rich compounds, said Wei-sheng Lee, a SLAC staff scientist and lead author on the Nature Physics paper. “It’s very puzzling that these new electronic phenomena are not seen in the electron-poor material,” he said.

    SLAC Staff Scientist Wei-sheng Lee (SLAC National Accelerator Laboratory)

    Lee added that it’s unclear whether the new collective excitation is related to the ability of electrons to pair up and effortlessly conduct electricity – the hallmark of superconductivity – or whether it promotes or limits high-temperature superconductivity. Further insight can be provided by additional experiments using next-generation RIXS instruments that will become available in a few years at synchrotron light sources worldwide.

    A Long, Tortuous Path

    This discovery is the latest step in the long and tortuous path toward understanding high-temperature superconductivity.

    Scientists have known since the late 1950s why certain metals and simple alloys become superconducting when chilled within a few degrees of absolute zero: Their electrons pair up and ride waves of atomic vibrations that act like a virtual glue to hold the pairs together. Above a certain temperature, however, the glue fails as thermal vibrations increase, the electron pairs split up and superconductivity disappears.

    Starting in 1986, researchers discovered a number of materials that are superconducting at higher temperatures. By understanding and optimizing how these materials work, they hope to develop superconductors that work at room temperature and above.

    Until recently, the most likely glue holding superconducting electron pairs together at higher temperatures seemed to be strong magnetic excitations created by interactions between electron spins. But a recent theoretical simulation by SLAC and Stanford researchers concluded that these high-energy magnetic interactions are not the sole factor in copper oxide’s high-temperature superconductivity. The new results confirm that prediction, and also complement a 2012 report on the behavior of electron-poor copper oxides by a team that included Lee, Devereaux and several other SLAC/Stanford scientists.

    “Theorists must now incorporate this new ingredient into their explanations of how high-temperature superconductivity works,” said Thorsten Schmitt, leader of the RIXS team at the Paul Scherrer Institute in Switzerland, who collaborated on the study.

    Other researchers involved in the study were from Columbia University, University of Minnesota, AGH University of Science and Technology in Poland, National Synchrotron Radiation Research Center and National Tsing Hua University in Taiwan, and the Chinese Academy of Sciences. Funding for the research came from the DOE Office of Science, U.S. National Science Foundation and Swiss National Science Foundation.

    See the full article, with animation video, here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 6:49 pm on October 20, 2014
    Tags: Applied Research & Technology

    From NYT: “25 Years Ago, NASA Envisioned Its Own ‘Orient Express’” 

    The New York Times

    OCT. 20, 2014
    KENNETH CHANG

    The National Aero-Space Plane was to be a revolutionary advance beyond the space shuttle.


    In his 1986 State of the Union address, President Ronald Reagan promised “a new Orient Express that could, by the end of the next decade, take off from Dulles Airport and accelerate up to 25 times the speed of sound, attaining low-earth orbit or flying to Tokyo within two hours.”

    On Oct. 3, 1989, an article in Science Times, Designing a Plane for the Leap of Space (and Back), reported frenetic activity at NASA and the Defense Department.

    “Scientists and engineers are making rapid progress in developing technologies needed to build a 17,000-mile-an-hour ‘space plane’ that could escape earth’s gravity and circle the globe in 90 minutes,” the article began.

    “Their goal,” it continued, “is a space plane that could take off and land from virtually any airport in the world, carry satellites and other space cargo into orbit cheaply, shuttle between the earth and an orbiting space station, or carry a load of bombs deep into enemy territory as fast as an intercontinental missile.”

    Proponents contended the space plane would be far cheaper to operate than the shuttle.

    Others were dubious. The Air Force, which was providing most of the financing, had already tried to back out, but the National Space Council, headed by Vice President Dan Quayle, recommended continuing work at a slower pace.

    The target for the first flight of the first experimental version, known as the X-30, was originally 1993 but was pushed back to 1997.

    25 YEARS LATER The space plane, able to fly by itself to orbit, never took off. The X-30 died in 1994. Smaller-scale hypersonic programs came and went.

    Was the X-30 technologically feasible?

    “No, and it’s still not,” said Jess Sponable, a program manager in the tactical technology office at Darpa, the Defense Advanced Research Projects Agency. For the X-30 to succeed, infant ideas would have had to be developed into robust, reliable technologies — materials that could survive intense temperatures, air-breathing engines that could fly faster and higher.

    Nonetheless, “absolutely, it was worthwhile,” Mr. Sponable said, although he added perhaps not worth the more than $1.6 billion spent. “We learned a lot.”

    The pendulum for spacecraft design has since swung away from the cutting edge to the tried and true. The Orion craft, which NASA is building for deep-space missions, is a capsule, just like the one used for the Apollo moon missions but bigger. The two private company designs that NASA chose to take future astronauts to the space station are also capsules. (The loser in that competition was a mini-shuttle offering.)

    NASA Orion Spacecraft
    NASA/Orion

    But the dream of hypersonic space planes continues.

    At Darpa, Mr. Sponable heads the XS-1 space plane project. It is not a do-it-all-at-once effort like the 1980s space plane but a much simpler, unmanned vehicle that would serve as a reusable first stage.

    Mr. Sponable is eager to figure out how to send it up many times, quickly and cheaply; the goal is 10 flights in 10 days.

    “We want operability No. 1,” he said. With the quick launches, the issue of cost “just disappears, because we can’t spend a lot of money from Day 1 to Day 2 to Day 3.”

    Darpa has awarded contracts to three industry teams to develop preliminary designs. Mr. Sponable said a decision on the next step would come next spring.

    The space plane episode illustrates the recurring money woes that have bedeviled NASA for decades: A grandiose plan is announced with fanfare and a burst of financing that fades as delays and cost overruns undercut the optimistic plans. Then a new president or a new NASA administrator changes course.

    Most recently, the Obama administration canceled plans started under President George W. Bush to send astronauts back to the moon and told NASA to consider an asteroid instead.

    If the pattern continues, NASA priorities could zig again after the next president moves into the White House in 2017.

    See the full article here.

  • richardmitnick 3:04 pm on October 20, 2014
    Tags: Applied Research & Technology

    From LLNL: “Supercomputers link proteins to drug side effects” 


    Lawrence Livermore National Laboratory

    10/20/2014
    Kenneth K Ma, LLNL, (925) 423-7602, ma28@llnl.gov

    New medications created by pharmaceutical companies have helped millions of Americans alleviate pain and suffering from their medical conditions. However, the drug creation process often misses many side effects that kill at least 100,000 patients a year, according to the journal Nature.

    Lawrence Livermore National Laboratory researchers have discovered a high-tech method of using supercomputers to identify proteins that cause medications to have certain adverse drug reactions (ADR) or side effects. They are using high-performance computers (HPC) to process proteins and drug compounds in an algorithm that produces reliable data outside of a laboratory setting for drug discovery.

    The team recently published its findings in the journal PLOS ONE, titled Adverse Drug Reaction Prediction Using Scores Produced by Large-Scale Drug-Protein Target Docking on High-Performance Computer Machines.

    “We need to do something to identify these side effects earlier in the drug development cycle to save lives and reduce costs,” said Monte LaBute, a researcher from LLNL’s Computational Engineering Division and the paper’s lead author.

    It takes pharmaceutical companies roughly 15 years to bring a new drug to the market, at an average cost of $2 billion. A new drug compound entering Phase I (early stage) testing is estimated to have an 8 percent chance of reaching the market, according to the Food and Drug Administration (FDA).

    A typical drug discovery process begins with identifying which proteins are associated with a specific disease. Candidate drug compounds are combined with target proteins in a process known as binding to determine the drug’s effectiveness (efficacy) and/or harmful side effects (toxicity). Target proteins are proteins known to bind with drug compounds in order for the pharmaceutical to work.

    While this method is able to identify side effects with many target proteins, there are myriad unknown “off-target” proteins that may bind to the candidate drug and could cause unanticipated side effects.

    Because it is cost prohibitive to experimentally test a drug candidate against a potentially large set of proteins — and the list of possible off-targets is not known ahead of time — pharmaceutical companies usually only test a minimal set of off-target proteins during the early stages of drug discovery. This results in ADRs remaining undetected through the later stages of drug development, such as clinical trials, and possibly making it to the marketplace.

    There have been several highly publicized medications with off-target protein side effects that have reached the marketplace. For example, Avandia, an anti-diabetic drug, caused heart attacks in some patients; and Vioxx, an anti-inflammatory medication, caused heart attacks and strokes among certain patient populations. Both therapeutics were recalled because of their side effects.

    “There were no indications of side effects of these medications in early testing or clinical trials,” LaBute said. “We need a way to determine the safety of such therapeutics before they reach patients. Our work can help direct such drugs to patients who will benefit the most from them with the least amount of side effects.”

    LaBute and the LLNL research team tackled the problem by using supercomputers and information from public databases of drug compounds and proteins. The latter included protein databases of DrugBank, UniProt and Protein Data Bank (PDB), along with drug databases from the FDA and SIDER, which contain FDA-approved drugs with ADRs.

    The team examined 4,020 off-target proteins from DrugBank and UniProt. Those proteins were indexed against the PDB, which whittled the number down to 409 off-target proteins that have high-quality 3D crystallographic X-ray diffraction structures essential for analysis in a computational setting.


    The 409 off-target proteins were fed into Livermore HPC software known as VinaLC along with 906 FDA-approved drug compounds. VinaLC used a molecular docking matrix that bound the drugs to the proteins. A score was given to each combination to assess whether effective binding occurred.

    The binding scores were fed into another computer program and combined with 560 FDA-approved drugs with known side effects. An algorithm was used to determine which proteins were associated with certain side effects.
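
    The article does not say which statistical algorithm this second program used. As a rough, hedged sketch of the general approach — one classifier per side-effect category, trained on a drug-by-protein matrix of docking scores — something like the following could be written; the arrays here are synthetic stand-ins, not the LLNL data.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins for the real inputs:
    # X: docking scores for 560 labeled drugs x 409 off-target proteins (from VinaLC)
    # y: 1 if a drug is listed with a given side effect (e.g. a vascular disorder), else 0
    X = rng.normal(loc=-7.0, scale=1.5, size=(560, 409))  # toy binding scores
    y = rng.integers(0, 2, size=560)                       # toy ADR labels (FDA/SIDER-like)

    # One regularized classifier per ADR category; the L1 penalty zeroes out
    # proteins whose binding scores carry no signal for this side effect.
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    model.fit(X, y)

    # Nonzero coefficients point to off-target proteins implicated in the ADR.
    implicated = np.flatnonzero(model.coef_[0])
    print(f"{implicated.size} proteins associated with this side effect (toy data)")
    ```

    An L1 penalty is one plausible way to end up with a short list of candidate off-target proteins per side-effect category; the feature selection actually used in the PLOS ONE paper may differ.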

    The Lab team showed that in two categories of disorders — vascular disorders and neoplasms — their computational model of predicting side effects in the early stages of drug discovery using off-target proteins was more predictive than current statistical methods that do not include binding scores.

    In addition to performing better than current prediction methods, the team’s calculations also predicted new potential side effects. For example, they predicted a connection between a protein normally associated with cancer metastasis and vascular disorders such as aneurysms. Their ADR predictions were validated by a thorough review of existing scientific data.

    “We have discovered a very viable way to find off-target proteins that are important for side effects,” LaBute said. “This approach using HPC and molecular docking to find ADRs never really existed before.”

    The team’s findings provide drug companies with a cost-effective and reliable method to screen for side effects, according to LaBute. Their goal is to expand their computational pharmaceutical research to include more off-target proteins for testing and eventually screen every protein in the body.

    “If we can do that, the drugs of tomorrow will have less side effects that can potentially lead to fatalities,” LaBute said. “Optimistically, we could be a decade away from our ultimate goal. However, we need help from pharmaceutical companies, health care providers and the FDA to provide us with patient and therapeutic data.”

    LLNL researchers Monte LaBute (left) and Felice Lightstone (right) were part of a Lab team that recently published an article in PLOS ONE detailing the use of supercomputers to link proteins to drug side effects. Photo by Julie Russell/LLNL

    The LLNL team also includes Felice Lightstone, Xiaohua Zhang, Jason Lenderman, Brian Bennion and Sergio Wong.

    See the full article here.

    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration
  • richardmitnick 1:41 pm on October 19, 2014
    Tags: Applied Research & Technology, livescience

    From livescience: “Bronze Warrior Chariot Discovery Is ‘Find of a Lifetime'” 

    Livescience

    October 14, 2014
    Stephanie Pappas

    More than 2,000 years ago, pieces of an Iron Age chariot were burnt and buried, perhaps as a religious offering. Now, archaeologists have discovered the bronze remains of this sacrifice.

    Digging near Melton Mowbray in Leicestershire, England, an archaeology team discovered a trove of bronze chariot fittings dating back to the second or third century B.C. The remains were discovered at the Burrough Hill Iron Age Hillfort, a fortified hilltop structure that was once surrounded by farms and settlements. Though humans lived in the area beginning around 4000 B.C., it was used most heavily between about 100 B.C. and A.D. 50, according to the University of Leicester.

    A linch pin (shown from three angles) from an Iron Age chariot, discovered at the Burrough Hill Iron Age Hillfort in Leicestershire, England
    Credit: University of Leicester

    Maiden Castle in England is one of the largest hill forts in Europe. Photograph taken in 1935 by Major George Allen (1891–1940).

    “This is the most remarkable discovery of material we made at Burrough Hill in the five years we worked on the site,” University of Leicester archaeologist Jeremy Taylor said in a statement. “This is a very rare discovery and a strong sign of the prestige of the site.” [See Images of the Iron Age Chariot's Remains]

    Burnt offering

    Taylor co-directs the field project at Burrough Hill, which is used to train archaeology students. It was four of these archaeology students who first found a piece of bronze near an Iron Age house within the Burrough Hill fort. More bronze pieces were found nearby.

    The pieces are the metal remains of a chariot that once belonged to a warrior or noble, according to university archaeologists. They include linchpins with decorated end caps, as well as rings and fittings that would have held harnesses. One linchpin is decorated with three wavy lines radiating from a single point, almost like the modern flag for the Isle of Man, a British dependency in the Irish Sea. The Isle of Man’s flag is decorated with an odd symbol called a triskelion, or three half-bent legs converging at the thigh.

    The flag of the Isle of Man is composed solely of a triskele against a red background.

    “The atmosphere at the dig on the day was a mix of ‘tremendously excited’ and ‘slightly shell-shocked,'” Taylor said. “I have been excavating for 25 years, and I have never found one of these pieces — let alone a whole set. It is a once-in-a-career discovery.”

    The pieces were found upon a layer of chaff, which may have provided fuel for the burning ritual. The chariot pieces were put into a box and then covered with cinder and slag after being set on fire. This may have been a ritual marking the dismantling or closing of a home at the fort, or it could have honored the change of seasons, University of Leicester archaeologists suspect.

    Bronze and iron

    Alongside the chariot pieces, the researchers found a set of iron tools, which were placed around the parts before they were burned.

    “The function of the iron tools is a bit of a mystery, but given the equestrian nature of the hoard, it is possible that they were associated with horse grooming,” Burrough Hill project co-director John Thomas said in a statement. “One piece, in particular, has characteristics of a modern curry comb, while two curved blades may have been used to maintain horses’ hooves or manufacture harness parts.”

    The pieces will be on display temporarily at the Melton Carnegie Museum in Melton Mowbray from Oct. 18 to Dec. 13.

    “Realizing that I was actually uncovering a hoard that was carefully placed there hundreds of years ago made it the find of a lifetime,” University of Leicester student Nora Battermann, who was one of the four students to make the find, said in a statement. “Looking at the objects now that they have been cleaned makes me even more proud, and I can’t wait for them to go on display.”

    See the full article here.

  • richardmitnick 1:18 pm on October 19, 2014
    Tags: Applied Research & Technology

    From Huff Post: “Ancient Cult Complex Discovered In Israel Dates Back 3,300 Years, May Be Temple Of Baal” 

    The Huffington Post

    10/15/2014
    Dominique Mosbergen

    Archaeologists working in Israel have discovered an “ancient cult complex,” where people who lived thousands of years ago might have worshipped a Canaanite “storm god” known as Baal.

    Bronze figurine of a Baal, ca. 14th–12th century BC, found at Ras Shamra (ancient Ugarit) near the Phoenician coast. Musée du Louvre.

    Inside the massive cult complex, archaeologists found facemasks, human-size containers and burnt animal bones possibly related to sacrificial practices. The Canaanite storm god may have been worshiped here.

    The complex was unearthed at the archaeological site of Tel Burna, located near the Israeli city of Kiryat Gat. It’s believed to date back 3,300 years.

    Aerial view of Tel Burna


    Tel Burna Archaeological Project

    Though more excavation needs to be conducted, the archaeologists said the site is believed to be quite large, with the courtyard of the complex measuring more than 50 feet on one side.

    Researchers said the site has already yielded artifacts that seem to confirm the complex’s cultic past. These include enormous jars that may have been used to store tithes, masks that might have been used in ceremonial processions, and burnt animal bones that hint at sacrificial rituals.

    Itzhaq Shai, director of the Tel Burna Excavation Project, told Live Science that it wasn’t entirely clear which god the complex was dedicated to. But he called Baal — which ancient Middle Eastern cultures worshipped as a fertility god — the “most likely candidate.” Another possibility, according to UPI, is that members of the cult worshipped a female god, like the ancient war goddess Anat.


    Excavation work at Tel Burna has been going on since 2009, and members of the public have a standing invitation to help out.
    “Unlike most excavations, we are looking for people to come and participate for even just a few hours,” Shai told Fox News in 2013. “Hopefully they will be captivated and come back.”

    See the full article here.
