Tagged: Applied Research & Technology

  • richardmitnick 10:40 am on May 5, 2016
    Tags: Applied Research & Technology, San Andreas Fault

    From Science Alert: “Scientist says the San Andreas fault is ‘locked, loaded, and ready to roll’ “ 

    Science Alert

    5 MAY 2016
    FIONA MACDONALD

    That can’t be good.

    Southern California Earthquake Centre

    California’s San Andreas fault has been quiet for far too long and is overdue for a major earthquake, a leading geoscientist has announced. At a conference this week, the state was warned to prepare for a potential earthquake as strong as magnitude 8.0.

    “The springs on the San Andreas system have been wound very, very tight. And the southern San Andreas fault, in particular, looks like it’s locked, loaded and ready to go,” said Thomas Jordan, director of the Southern California Earthquake Centre.

    Jordan gave his warning in the keynote talk of the annual National Earthquake Conference in Long Beach, the Los Angeles Times reports.

    Here’s why he’s so worried: research has shown that the Pacific plate is moving northwest relative to the North American plate at a rate of around 5 metres (16 feet) every 100 years – and that’s building up a whole lot of tension along the San Andreas fault line that needs to be relieved regularly.

    But the last time southern California experienced a major shake-up was in 1857, when a magnitude 7.9 quake ruptured almost 300 km (185 miles) of the fault between Monterey County and the San Gabriel Mountains.

    Further south, areas of the fault line have been quiet even longer, with San Bernardino County not moving substantially since 1812, and the region near the Salton Sea remaining still since the late 1600s.

    All of this means that there’s a lot of tension underneath California right now. Last year, Jordan’s team found there’s a 7 percent chance the state will experience a magnitude 8.0 quake in the next three decades.
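
    To put those numbers in perspective, here is a back-of-envelope sketch in Python. It assumes a constant slip rate and, for the 30-year figure, a simple Poisson model; both are assumptions of this sketch rather than statements from the article.

    ```python
    # Rough arithmetic on the figures quoted above (illustrative only).
    import math

    slip_rate_m_per_yr = 5.0 / 100.0       # "5 metres every 100 years"
    years_since_rupture = 2016 - 1857      # last major southern rupture

    deficit_m = slip_rate_m_per_yr * years_since_rupture
    print(f"Accumulated slip deficit: {deficit_m:.1f} m")   # ~8 m

    p_30yr = 0.07                          # 7% chance of M8.0 within 30 years
    annual_rate = -math.log(1.0 - p_30yr) / 30.0            # Poisson assumption
    print(f"Equivalent annual rate: {annual_rate:.2%} per year")   # ~0.24%
    ```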

    And that’s a big problem. Back in 2008, a US Geological Survey report* found that a magnitude 7.8 earthquake on the southern San Andreas fault could cause more than 1,800 deaths, 50,000 injuries, US$200 billion in damage, and long-lasting infrastructure disruptions – such as six months of compromised sewer systems and ongoing wildfires.

    Even though Los Angeles isn’t on the San Andreas fault line, simulations by the Southern California Earthquake Centre show that the shaking would quickly spread there:


    Access mp4 video here.

    According to their modelling, that size earthquake could cause shaking for nearly 2 minutes, said Jordan, with the strongest activity in the Coachella Valley, Inland Empire and Antelope Valley.

    The reason Los Angeles is at so much risk is that it’s built over a sedimentary basin: seismic waves spread into the basin and get trapped there, causing more extreme and longer-lasting shaking, as you can see in the magnitude 8.0 simulation:


    Access mp4 video here.

    While Jordan praised recent initiatives to earthquake retrofit buildings in LA, he warned that the rest of the state needs to get ready for the next big one, by making residents more aware of ways to stay safe during an earthquake and when and how to evacuate.

    “We are fortunate that seismic activity in California has been relatively low over the past century,” Jordan explained last year. “But we know that tectonic forces are continually tightening the springs of the San Andreas fault system, making big quakes inevitable.”

    *Science paper:
    The ShakeOut Scenario

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 4:10 am on May 4, 2016
    Tags: Applied Research & Technology

    From PPPL: “Scientists challenge conventional wisdom to improve predictions of the bootstrap current at the edge of fusion plasmas” 


    PPPL

    May 3, 2016
    John Greenwald

    Simulation shows trapped electrons at left and a passing electron at right that are carried in the bootstrap current of a tokamak. Credit: Kwan-Liu Ma, University of California, Davis.

    Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have challenged understanding of a key element in fusion plasmas. At issue has been an accurate prediction of the size of the “bootstrap current” — a self-generating electric current — and an understanding of what carries the current at the edge of plasmas in doughnut-shaped facilities called tokamaks. This bootstrap-generated current combines with the current in the core of the plasma to produce a magnetic field to hold the hot gas together during experiments, and can produce stability at the edge of the plasma.

    The recent work, published* in the April issue of the journal Physics of Plasmas, focuses on the region at the edge in which the temperature and density drop off sharply. In this steep gradient region — or pedestal — the bootstrap current is large, enhancing the confining magnetic field but also triggering instability in some conditions.

    The bootstrap current appears in a plasma when the pressure is raised. It was first discovered at the University of Wisconsin by Stewart Prager, now director of PPPL, and Michael Zarnstorff, now deputy director for research at PPPL. Prager was Zarnstorff’s thesis advisor at the time.

    Essential for predicting instabilities

    Physics understanding and accurate prediction of the size of the current at the edge of the plasma are essential for predicting its effect on instabilities that can diminish the performance of fusion reactors. Such understanding will be vital for ITER, the international tokamak under construction in France to demonstrate the feasibility of fusion power.

    ITER Tokamak

    This work was supported by the DOE Office of Science.

    The new paper, by physicists Robert Hager and C.S. Chang, leader of the Scientific Discovery through Advanced Computing project’s Center for Edge Physics Simulation headquartered at PPPL, reports that the bootstrap current in the tokamak edge is mostly carried by the “magnetically trapped” electrons that cannot travel as freely as the “passing” electrons in plasma. The trapped particles bounce between two points in the tokamak while the passing particles swirl all the way around it.

    Challenge to conventional understanding

    The discovery challenges conventional understanding and provides an explanation of how the bootstrap current can be so large at the tokamak edge, where the passing electron population is small. Previously, physicists thought that only the passing electrons carry the bootstrap current. “Correct modeling of the current enables accurate prediction of the instabilities,” said Hager, the lead author of the paper.
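
    A rough textbook estimate (not the paper's calculation) helps show why trapped electrons can dominate near the edge: the trapped-particle fraction grows with the inverse aspect ratio eps = r/R, which is largest at the plasma edge. The ITER-like radii below are illustrative values only.

    ```python
    # Textbook scaling of the trapped-particle fraction in a tokamak:
    # f_t ~ sqrt(2*eps/(1+eps)), with eps = r/R the inverse aspect ratio.
    import math

    R = 6.2                          # ITER-like major radius, metres
    for r in (0.5, 1.0, 1.5, 2.0):   # positions along the minor radius, metres
        eps = r / R
        f_trapped = math.sqrt(2 * eps / (1 + eps))
        print(f"r = {r:.1f} m: eps = {eps:.2f}, trapped fraction ~ {f_trapped:.2f}")
    ```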

    The researchers performed the study by running an advanced global code called “XGCa” on the Mira supercomputer at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility located at the Department’s Argonne National Laboratory.

    MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    Researchers turned to the new global code, which models the entire plasma volume, because simpler local computer codes can become inadequate and inaccurate in the pedestal region.

    Numerous XGCa simulations led Hager and Chang to construct a new formula that greatly improves the accuracy of bootstrap current predictions. The new formula was found to fit well with all the XGCa cases studied and could easily be implemented into modeling or analysis codes.

    *Science paper:
    Gyrokinetic neoclassical study of the bootstrap current in the tokamak edge pedestal with fully non-linear Coulomb collisions

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 3:47 am on May 4, 2016
    Tags: Applied Research & Technology

    From EPFL: “Your brain suppresses perception of heartbeat, for your own good” 

    EPFL bloc

    École Polytechnique Fédérale de Lausanne

    04.05.16
    Lionel Pousaz

    Researchers have discovered that the human brain suppresses the sensory effects of the heartbeat. They believe that this mechanism prevents internal sensations from interfering with the brain’s perception of the external world. This mechanism could also have something to do with anxiety disorders.

    Our heart is constantly beating yet we normally do not feel it. It turns out that our brain is capable of filtering out the cardiac sensation so that it doesn’t interfere with the brain’s ability to perceive external sensations. For the first time, researchers from the Center for Neuroprosthetics at EPFL have identified this mechanism. They discovered that a certain region in the brain determines where internal and external sensations interact. Their work appears in The Journal of Neuroscience.

    EPFL’s neuroscientists noted that the brain perceives visual stimuli less effectively if they occur in time with the heartbeat. It seems as if the brain wants to avoid processing information that is synchronized with the body’s heartbeat.

    “We don’t see the same way as a video camera does”

    “We are not objective, and we don’t see everything that hits our retina like a video camera does,” said Roy Salomon from the Laboratory of Cognitive Neuroscience, one of the study’s co-authors. “The brain itself decides which information to bring to awareness. But what’s surprising is that our heart also affects what we see!”

    The researchers carried out an initial series of experiments with more than 150 volunteers. The volunteers were subjected to a visual stimulus – an octagonal shape flashing on a screen. When this geometric shape flashed in sync with the subject’s heartbeat, the subject had more difficulty perceiving it.
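
    A minimal sketch of the timing logic behind such a paradigm (hypothetical code, not the EPFL team's actual software): flashes are scheduled at a fixed delay after each simulated R-peak for the “in sync” condition, and at a random point in the cardiac cycle for the control.

    ```python
    # Hypothetical stimulus-timing sketch: flash either phase-locked to the
    # heartbeat ("in sync") or at a random cardiac phase (control).
    import random

    def r_peak_times(n_beats, mean_rr=0.85, jitter=0.05):
        """Simulated R-peak times in seconds, with beat-to-beat variability."""
        t, peaks = 0.0, []
        for _ in range(n_beats):
            t += random.gauss(mean_rr, jitter)
            peaks.append(t)
        return peaks

    peaks = r_peak_times(10)
    synced = [t + 0.10 for t in peaks]                        # fixed 100 ms delay (assumed)
    control = [t + random.uniform(0.0, 0.85) for t in peaks]  # random phase
    print("In-sync flash onsets (s):", [round(t, 2) for t in synced])
    print("Control flash onsets (s):", [round(t, 2) for t in control])
    ```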

    What’s happening in the brain – a first insight

    The researchers just needed to figure out what was happening in the brain. They were able to show that a specific region, the insular cortex, acts as a filter and intercepts the sensations coming from the body’s beating heart.

    They did this by running the experiment again in an MRI scanner. When the visual stimuli were not in sync with the subject’s heartbeat, the insular cortex functioned normally and the subject perceived the flashing octagon easily. But when the stimuli occurred in time with the heart rate, the level of activity in the insular cortex dropped noticeably: the subject was less aware – or totally unaware – of the flashing shape being shown.

    It did not take long for Roy to get over his initial surprise at his discovery. “You don’t want your internal sensations to interfere with your external ones. It’s in your interest to be aware of what’s outside you. Since our heart was already beating while our brain was still forming, we’ve been exposed to it since the very start of our existence. So it’s not surprising that the brain acts to suppress it and make it less apparent.”

    Is feeling one’s heartbeat related to anxiety?

    Awareness of one’s heartbeat is known to be correlated with a number of psychological problems, including anxiety disorders. Patients typically perceive their heart rate more clearly than most people. “But someone who does not suffer from this type of disorder can also be aware of their heartbeat,” said Roy. “This can happen at times of intense excitement or fear, for example.”

    Could anxiety disorders be, at least in part, the cause or effect of someone’s inability to silence their heartbeat? “We don’t know that yet. What we do know now is that, under most conditions, we are not aware of our own heartbeat and that there is a specific region of the brain whose task is to suppress it.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    EPFL is Europe’s most cosmopolitan technical university. It welcomes students, professors and staff from over 120 nationalities. With both a Swiss and international calling, it is therefore guided by a constant wish to open up; its missions of teaching, research and partnership impact various circles: universities and engineering schools, developing and emerging countries, secondary schools and gymnasiums, industry and economy, political circles and the general public.

     
  • richardmitnick 3:18 pm on May 3, 2016
    Tags: Applied Research & Technology

    From AAAS: “The gene editor CRISPR won’t fully fix sick people anytime soon. Here’s why” 

    AAAS

    May 3, 2016
    Jocelyn Kaiser

    Researchers still have a ways to go before using CRISPR to repair genes in patients. iStock

    This week, scientists will gather in Washington, D.C., for an annual meeting devoted to gene therapy—a long-struggling field that has clawed its way back to respectability with a string of promising results in small clinical trials. Now, many believe the powerful new gene-editing technology known as CRISPR will add to gene therapy’s newfound momentum. But is CRISPR really ready for prime time? Science explores the promise—and peril—of the new technology.

    How does CRISPR work?

    Traditional gene therapy works via a relatively brute-force method of gene transfer. A harmless virus, or some other form of so-called vector, ferries a good copy of a gene into cells that can compensate for a defective gene that is causing disease. But CRISPR can fix the flawed gene directly, by snipping out bad DNA and replacing it with the correct sequence. In principle, that should work much better than adding a new gene because it eliminates the risk that a foreign gene will land in the wrong place and turn on a cancer gene. And a CRISPR-repaired gene will be under the control of that gene’s natural promoter, so the cell won’t make too much or too little of its protein product.

    What has CRISPR accomplished so far?

    Researchers have published successes with using CRISPR to treat animals with an inherited liver disease and muscular dystrophy, and there will be more such preclinical reports at this week’s annual meeting of the American Society of Gene and Cell Therapy (ASGCT). The buzz around CRISPR is growing. This year’s meeting includes 93 abstracts on CRISPR (of 768 total), compared with only 33 last year. What’s more, investors are flocking to CRISPR. Three startups, Editas Medicine, Intellia Therapeutics, and CRISPR Therapeutics, have already attracted hundreds of millions of dollars.

    So why isn’t CRISPR ready for prime time?

    CRISPR still has a long way to go before it can be used safely and effectively to repair—not just disrupt—genes in people. That is particularly true for most diseases, such as muscular dystrophy and cystic fibrosis, which require correcting genes in a living person: if the affected cells were first removed and repaired, then put back, too few would survive. And the need to treat cells inside the body means gene editing faces many of the same delivery challenges as gene transfer—researchers must devise efficient ways to get a working CRISPR into specific tissues in a person, for example.

    CRISPR also poses its own safety risks. Most often mentioned is that the Cas9 enzyme that CRISPR uses to cleave DNA at a specific location could also make cuts where it’s not intended to, potentially causing cancer.

    With these caveats, do you even need CRISPR?

    Conventional gene addition treatments for some diseases are so far along that it may not make sense to start over with CRISPR. In Europe, where one gene therapy is already approved for use for a rare metabolic disorder, regulators are poised to approve a second for an immune disorder known as adenosine deaminase–severe combined immunodeficiency (SCID). And in the United States, a company this year expects to seek approval for a gene transfer treatment for a childhood blindness disease called Leber congenital amaurosis (LCA).

    At the ASGCT meeting, researchers working with the company Bluebird Bio will present interim data for a late-stage trial showing that gene addition can halt the progression of cerebral adrenoleukodystrophy, a devastating childhood neurological disease. Final results could help pave the way for regulatory approval. Bluebird will also report on trials using gene transfer for two blood disorders, sickle cell disease and β-thalassemia, bringing these treatments closer to the clinic.

    Except for LCA, in which gene-carrying viruses are injected directly into eyes, these diseases are treated by removing bone marrow cells from patients, adding a gene to the cells, and reinfusing the cells back into the patient. New, safer viral vectors have reduced risks of leukemia seen in a few patients in some early trials for immunodeficiency diseases. Researchers are seeing “excellent clinical responses,” says Donald Kohn of the University of California, Los Angeles.

    Although Kohn and other researchers have used an older gene-editing tool known as zinc finger nucleases to repair defective genes causing sickle cell disease and a type of SCID in cells in a dish, only a tiny fraction of immature blood cells needed for the therapy to work end up with the gene corrected—far below the fraction altered by now standard gene transfer methods. One reason is that the primitive blood cells aren’t dividing much (more on this below). Because gene-editing methods such as CRISPR are so much less efficient than gene addition, for several diseases, “I don’t think there will be a strong rationale for switching to editing,” says Luigi Naldini of the San Raffaele Telethon Institute for Gene Therapy in Milan, Italy.

    CRISPR also has other issues

    Using CRISPR to cut out part of a gene—not correct the sequence—is relatively easy to do. In fact, this strategy is already being tested with zinc finger nucleases in a clinical effort to stop HIV infection. In this treatment, the nucleases are used to knock out a gene for a receptor called CCR5 in blood cells that HIV uses to get into cells.

    But when CRISPR is used to correct a gene using a strand of DNA that scientists supply to cells, not just to snip out some DNA, it doesn’t work very well. That’s because the cells must edit the DNA using a process called homology-directed repair, or HDR, that is only active in dividing cells. And unfortunately, most cells in the body—liver, neurons, muscle, eye, blood stem cells—are not normally dividing. For this reason, “knocking out a gene is a lot simpler than knocking in a gene and correcting a mutation,” says Cynthia Dunbar, president-elect of ASGCT and a gene therapy researcher at the National Heart, Lung, and Blood Institute in Bethesda, Maryland.

    Researchers are working on ways to get around this limitation. The genes for HDR are present in all cells, and it’s a matter of turning them on, perhaps by adding certain drugs to the cells, says CRISPR researcher Feng Zhang of the Broad Institute in Cambridge, Massachusetts. Another avenue is to find alternatives to the Cas9 system that don’t rely on the HDR process, Zhang says.

    But the low rate of HDR in most cells is one reason why the first use of CRISPR in the clinic will likely involve disrupting genes, not fixing them. For example, several labs have shown in mice that CRISPR can remove a portion of the defective gene that causes Duchenne muscular dystrophy, so that the remaining portion will produce a functional, albeit truncated protein. Editas hopes to start a clinical trial next year to treat a form of LCA blindness by chopping out part of the defective gene. One proposed gene-editing treatment for sickle cell disease would similarly snip out some DNA, so that blood cells produce a fetal form of the oxygen-carrying protein hemoglobin.

    And CRISPR still has big safety risks

    The most-discussed safety risk with CRISPR is that the Cas9 enzyme, which is supposed to slice a specific DNA sequence, will also make cuts in other parts of the genome that could result in mutations that raise cancer risk. Researchers are moving quickly to make CRISPR more specific. For example, in January, one lab described a tweak to Cas9 that dramatically reduces off-target effects. And in April in Nature, another team showed how to make the enzyme more efficient at swapping out single DNA bases.

    But immediate off-target cuts aren’t the only worry. Although it’s possible to deliver CRISPR’s components into cells in a dish as proteins or RNA, so far researchers can usually only get it working in tissue inside the body by using a viral vector to deliver the DNA for Cas9 into cells. This means that even after Cas9 has made the desired cuts, cells will keep cranking it out. “The enzyme will still hang around over 10, 20 years,” Zhang says. That raises the chances that even a very specific Cas9 will still make off-target cuts and that the body will mount an immune response to the enzyme.

    This may not truly be a problem, Zhang suggests. His team created a mouse strain that is born with the gene for Cas9 turned on all the time, so it expresses the enzyme in all cells for the animal’s entire life. Even after interbreeding these mice for about 20 generations, the mice “seem to be fine” with no obvious abnormal health effects, Zhang says. All the same, “the most ideal case is, we want to shut off the enzyme.” And that may mean finding nonviral methods for getting Cas9 into cells, such as ferrying the protein with lipids or nanoparticles—delivery methods that biologists have long struggled to make work in living animals.

    Other long-standing obstacles to gene therapy will confront efforts using CRISPR, too. Depending on the disease, any gene-edited cells may eventually die and patients could have to be treated multiple times. Researchers using gene transfer and editing approaches are also both hindered by limits on how much DNA a viral vector can carry. Right now CRISPR researchers often must use two different viruses to get CRISPR’s components into cells, which is less efficient than a single vector.

    So what’s the bottom line?

    Gene therapists remain excited by CRISPR, in part because it could tackle many more inherited diseases than can be treated with gene transfer. Among them are certain immune diseases where the amount of the repaired protein must be precisely controlled. In other cases, such as sickle cell disease, patients won’t get completely well unless a defective protein is no longer made by their cells, so just adding a gene isn’t enough. “It opens up a lot of diseases to gene therapy because gene addition wasn’t going to work,” Dunbar says.

    After more than 2 decades of seeing their field through ups and downs, veterans of the gene therapy field are wary of raising expectations about CRISPR for treating diseases. “Whenever there’s a new technology, there’s a huge amount of excitement and everybody thinks it will be ready tomorrow to cure patients,” says gene therapy researcher Mark Kay of Stanford University in Palo Alto, California. “It’s going to take some time.”

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 2:45 pm on May 3, 2016
    Tags: Applied Research & Technology, Columbia University Medical Center, Hi-Res Images Reveal How a “SuperBug” Hides from Antibiotics

    From Columbia: “Hi-Res Images Reveal How a “SuperBug” Hides from Antibiotics” 

    Columbia U bloc

    Columbia University

    CUMC bloc

    March 7, 2016 [just appeared in social media]

    Multidrug-resistant Klebsiella pneumoniae gram-negative bacteria are known to cause severe hospital-acquired infections. Image: David Dorward, PhD, National Institute of Allergy and Infectious Diseases

    “The force” is not just for Jedi knights.

    Bacteria have developed their own “force” to hide from our antibiotics, and they are increasingly using this strategy to chip away at the effectiveness of polymyxins, our last line of defense against some “superbug” infections.

    Biologists at Columbia are now peering inside these bacteria with super high-resolution imaging techniques and have found places where drugs could disrupt the bugs’ defense and restore their susceptibility to these powerful antibiotics.

    To evade detection by polymyxin antibiotics, bugs like E. coli, Salmonella, and Klebsiella pneumoniae–all gram-negative bacteria–are known to alter their electrostatic charge.

    “Polymyxins find bacteria via electrostatic attraction,” says Vasileios Petrou, PhD, a postdoc in the lab of Filippo Mancia, PhD, assistant professor of physiology & cellular biophysics. “Polymyxins are positively charged, so they are attracted to negatively charged parts of the bacteria.”

    Bacteria become resistant to polymyxins by placing a cap, made from a sugar molecule, over the negative charge. This trick alters the electrostatic forces between the bacteria and antibiotics.

    “It’s like the bacteria become invisible to polymyxins,” Dr. Mancia says. “The antibiotics can’t stick to the bacteria or kill them.”

    An enzyme called ArnT in the membrane of these bacteria is responsible for the capping. First, ArnT grabs a sugar from a lipid carrier; then it plants the sugar on the negative charge.


    Access mp4 video here.

    The Columbia researchers were able to visualize the precise details of this process by using X-ray crystallography to reveal the location of each individual atom in the ArnT enzyme before and after it grabs the sugar [see video above].

    These images reveal places where the enzyme could be disabled. “To grab the sugar, the ArnT enzyme must first bind to the lipid that carries it, and this binding happens in a large ‘pocket’ in the enzyme’s side,” says Jérémie Vendome, PhD, a research associate scientist in the lab of Barry Honig.

    Filling the pocket with a drug could prevent the binding. “Essentially, that would sensitize the bacteria to the antibiotic again,” Dr. Petrou says.

    Dr. Vendome is now using computerized techniques to virtually screen millions of potential drug candidates to detect those that fit in the pocket. Hits generated from the virtual screening will be tested with polymyxins to see if the combination can eliminate antibiotic-resistant bacteria.

    “We are not pharma, but we can do some initial development in the lab,” Dr. Mancia says. “We hope that this work will lead to the development of a co-drug that will allow us to extend the lives of already available antibiotics.”

    Details of the research were published* Feb. 5 in the journal Science.

    The research performed at Columbia University was supported by grants from the NIH (U54GM095315, R01GM111980) and a Charles H. Revson Senior Fellowship. The New York Consortium on Membrane Protein Structure, led by Wayne Hendrickson, PhD, contributed valuable support.

    *Science paper:
    Structures of aminoarabinose transferase ArnT suggest a molecular basis for lipid A glycosylation

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Columbia U Campus

    Columbia University was founded in 1754 as King’s College by royal charter of King George II of England. It is the oldest institution of higher learning in the state of New York and the fifth oldest in the United States.

     
  • richardmitnick 2:10 pm on May 3, 2016
    Tags: Applied Research & Technology

    From AGU: “Scientists find likely cause for recent southeast U.S. earthquakes” 

    AGU bloc

    American Geophysical Union


    3 May 2016
    Lauren Lipuma

    Shaking from the magnitude 5.8 earthquake near Mineral, Virginia on August 23, 2011 was felt by more people than any other earthquake in U.S. history, according to the U.S. Geological Survey. Credit: USGS.

    The southeastern United States should, by all accounts, be relatively quiet in terms of seismic activity. It’s located in the interior of the North American Plate, far away from plate boundaries where earthquakes usually occur. But the area has seen some notable seismic events – most recently, the 2011 magnitude-5.8 earthquake near Mineral, Virginia that shook the nation’s capital.

    Now, scientists report in a new study* a likely explanation for this unusual activity: pieces of the mantle under this region have been periodically breaking off and sinking down into the Earth. This thins and weakens the remaining plate, making it more prone to slipping that causes earthquakes. The study authors conclude this process is ongoing and likely to produce more earthquakes in the future.

    “Our idea supports the view that this seismicity will continue due to unbalanced stresses in the plate,” said Berk Biryol, a seismologist at the University of North Carolina Chapel Hill and lead author of the new study. “The [seismic] zones that are active will continue to be active for some time.”

    The study* was published today in the Journal of Geophysical Research – Solid Earth, a journal of the American Geophysical Union.

    Compared to earthquakes near plate boundaries, earthquakes in the middle of plates are not well understood and the hazards they pose are difficult to quantify. The new findings could help scientists better understand the dangers these earthquakes present, according to the study’s authors.

    Old plates and earthquakes

    Tectonic plates are composed of Earth’s crust and the uppermost portion of the mantle. Below that is the asthenosphere: the warm, viscous conveyor belt of rock on which tectonic plates ride.

    A map of the North American Plate. Arrows show directions of its movement across Earth’s surface.
    Credit: Alataristarion via Wikimedia Commons.

    Earthquakes typically occur at the boundaries of tectonic plates, where one plate dips below another, thrusts another upward, or where plate edges scrape alongside each other.

    The tectonic plates of the world were mapped in 1996, USGS.

    Earthquakes rarely occur in the middle of plates, but they can happen when ancient faults or rifts far below the surface reactivate. These areas are relatively weak compared to the surrounding plate, and can easily slip and cause an earthquake.

    Today, the southeastern U.S. is more than 1,700 kilometers (1,056 miles) from the nearest edge of the North American Plate, which covers all of North America, Greenland and parts of the Atlantic and Arctic oceans. But the region was built over the past billion years by periods of accretion, when new material is added to a plate, and rifting, when plates split apart. The authors of the new study suspected ancient fault lines or pieces of old plates extending deep in the mantle following episodes of accretion and rifting could be responsible for earthquakes in the area.

    “This region has not been active for a long time,” Biryol said. “We were intrigued by what was going on and how we can link these activities to structures in deeper parts of the Earth.”

    A CAT scan of the Earth

    To find out what was happening deep below the surface, the researchers created 3D images of the mantle portion of the North American Plate. Just as doctors image internal organs by tracing the paths of x-rays through human bodies, seismologists image the interior of the Earth by tracing the paths of seismic waves created by earthquakes as they move through the ground. These waves travel faster through colder, stiffer, denser rocks and slower through warmer, more elastic rocks. Rocks cool and harden as they age, so the faster seismic waves travel, the older the rocks.
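
    A toy calculation (illustrative values, not the study's actual inversion) shows the basic signal tomography works with: a ray that crosses a slightly slower, warmer region arrives measurably late, and it is many such delays that get mapped into a 3D velocity model.

    ```python
    # Toy travel-time delay through a slow anomaly (illustrative only).
    path_km = 2200.0            # source-to-receiver distance along the ray
    v_ref = 8.0                 # reference mantle P-wave speed, km/s
    t_ref = path_km / v_ref     # arrival time if the mantle were uniform

    # Suppose 200 km of the path crosses rock that is 2 percent slower.
    anomaly_km, dv = 200.0, -0.02
    t_obs = (path_km - anomaly_km) / v_ref + anomaly_km / (v_ref * (1.0 + dv))

    print(f"Uniform-mantle time: {t_ref:.1f} s")
    print(f"With slow anomaly:   {t_obs:.1f} s (delay {t_obs - t_ref:.2f} s)")
    ```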

    In the new study, researchers used tremors caused by earthquakes more than 3,500 kilometers (2,200 miles) away to create a 3D map of the mantle underlying the U.S. east of the Mississippi River and south of the Ohio River.

    The study’s authors found plate thickness in the southeast U.S. to be fairly uneven – they saw thick areas of dense, older rock stretching downward and thin areas of less dense, younger rock.

    “This was an interesting finding because everybody thought that this is a stable region, and we would expect regular plate thickness,” Biryol said.

    At first, they thought the thick, old rocks could be remnants of ancient tectonic plates. But the shapes and locations of the thick and thin regions suggested a different explanation: through past rifting and accretion, areas of the North American Plate have become more dense and were pulled downward into the mantle through gravity. At certain times, the densest parts broke off from the plate and sank into the warm asthenosphere below. The asthenosphere, being lighter and more buoyant, surged in to fill the void created by the missing pieces of mantle, eventually cooling to become the thin, young rock in the images.

    Volcanoes were once active in the southeastern U.S. Mole Hill, pictured here, is a mound of volcanic rock in the Shenandoah Valley in Virginia that formed from an active volcano 48 million years ago (a relatively recent event, in geological time scales).
    Credit: Jstuby via Wikimedia Commons.

    The researchers concluded this process is likely what causes earthquakes in this otherwise stable region: when the pieces of the mantle break off, the plate above them becomes thinner and more prone to slip along ancient fault lines. Typically, the thicker the plate, the stronger it is, and the less likely to produce earthquakes.

    According to Biryol, pieces of the mantle have most likely been breaking off from underneath the plate since at least 65 million years ago. Because the researchers found fragments of hard rock at shallow depths, the process is likely still ongoing and expected to continue into the future, potentially leading to more earthquakes in the region, he said.


    *Science paper:
    Relationship between observed upper mantle structures and recent tectonic activity across the Southeastern United States

    See the full post here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The purpose of the American Geophysical Union is to promote discovery in Earth and space science for the benefit of humanity.

    To achieve this mission, AGU identified the following core values and behaviors.

    Core Principles

    As an organization, AGU holds a set of guiding core values:

    The scientific method
    The generation and dissemination of scientific knowledge
    Open exchange of ideas and information
    Diversity of backgrounds, scientific ideas and approaches
    Benefit of science for a sustainable future
    International and interdisciplinary cooperation
    Equality and inclusiveness
    An active role in educating and nurturing the next generation of scientists
    An engaged membership
    Unselfish cooperation in research
    Excellence and integrity in everything we do

    When we are at our best as an organization, we embody these values in our behavior as follows:

    We advance Earth and space science by catalyzing and supporting the efforts of individual scientists within and outside the membership.
    As a learned society, we serve the public good by fostering quality in the Earth and space science and by publishing the results of research.
    We welcome all in academic, government, industry and other venues who share our interests in understanding the Earth, planets and their space environment, or who seek to apply this knowledge to solving problems facing society.
    Our scientific mission transcends national boundaries.
    Individual scientists worldwide are equals in all AGU activities.
    Cooperative activities with partner societies of all sizes worldwide enhance the resources of all, increase the visibility of Earth and space science, and serve individual scientists, students, and the public.
    We are our members.
    Dedicated volunteers represent an essential ingredient of every program.
    AGU staff work flexibly and responsively in partnership with volunteers to achieve our goals and objectives.

     
  • richardmitnick 9:05 am on May 3, 2016
    Tags: Applied Research & Technology, Polaritons

    From DESY: “Crossing light with matter” 

    DESY

    2016/05/02

    The “plot for experts”: the recorded detector image shows that the incident X-ray radiation is amplified particularly at certain angles (blue and green areas). The energy difference between the two resonances that occur is a tiny 37.3 nano-electronvolts. No image credit.

    Precision spectroscopy of X-ray polaritons

    When light interacts with matter, it may be deflected or absorbed, resulting in the excitation of atoms and molecules; but the interaction can also produce composite states of light and matter which are neither one thing nor the other, and therefore have a name of their own – polaritons. These hybrid particles, named in allusion to the particles of light, photons, have now been prepared and accurately measured for the first time in the field of hard X-rays by researchers from DESY, the ESRF in Grenoble, the Helmholtz Institute Jena and the University of Jena. In the journal Nature Photonics, they describe* the surprising discoveries they made in the process.

    From a scientific point of view, polaritons are an extremely interesting type of quasiparticles. Scientists have recently succeeded in using polaritons to create a new type of source of visible laser light which does not depend on the stimulated emission that is necessary in conventional lasers. If this technology can be transferred to the field of X-rays, it could serve as the basis for a new type of narrow-band X-ray laser.

    Polaritons can be created particularly well using atoms whose nuclei have very sharply defined excitation states, so-called resonances. In the domain of X-rays, the Mössbauer isotope iron-57 (57Fe), whose atomic nucleus displays an extremely narrow energy resonance at an energy of 14.4 kilo-electronvolts (keV), is ideal for this purpose. For their experiment, the scientists manufactured periodic stacks made of alternating layers of 57Fe and non-resonant 56Fe, the most commonly occurring isotope of iron, each less than two nanometres thick. When X-rays from a synchrotron source are shone at such a periodic array of atoms at a certain angle, the layers act as an amplifier for the X-rays: resonance occurs at precisely the same energy as that displayed by the 57Fe nuclei. “This combination of two different resonant systems gives rise to a remarkable phenomenon,” explains Johann Haber, the principal author of the study and a doctoral student at DESY. “The resonances of the X-rays and the atomic nuclei seem to try and get out of each other’s way, because a hybrid of atoms and light is formed, which displays two new resonances having different energies that weren’t present beforehand.” This is a so-called collective effect which is caused by the mutual interaction of a large number of atomic nuclei with the X-rays.
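
    In the textbook picture of two coupled, degenerate oscillators (a sketch; the paper uses a more complete quantum-optical model), the photon resonance and the collective nuclear resonance at energy E0 mix through a coupling g into two hybrid modes whose splitting is set by the coupling strength:

    ```latex
    % Two degenerate resonances at E_0, mixed by a collective coupling g
    % (a textbook sketch, not the paper's full quantum-optical model):
    H = \begin{pmatrix} E_0 & \hbar g \\ \hbar g & E_0 \end{pmatrix}
    \qquad \Longrightarrow \qquad
    E_{\pm} = E_0 \pm \hbar g ,
    \qquad
    \Delta E = E_{+} - E_{-} = 2\hbar g \approx 37.3~\text{neV}
    ```

    Because many nuclei couple to the same X-ray mode, the effective coupling is collectively enhanced, growing roughly with the square root of the number of participating nuclei, which is what makes even this tiny splitting measurable.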

    The separation of the energy levels of these new resonances closely depends on the interaction between the nuclei and the X-rays. In their experiment, the scientists were for the first time able to determine the spectral structure of the resonances of such a system with extremely high precision. They were helped in this by a novel detection method developed by the team around Ingo Uschmann, a researcher from Jena. This method is able to separate the signal of the atomic nuclei from the background signal with a very high degree of sensitivity. Thanks to this apparatus, the scientists managed to measure the two new resonances, which are separated by only 37.3 nano-electronvolts and which can be attributed to the creation of polaritons. “We were able to give an excellent theoretical description of the results using a quantum-optical model specifically developed for this purpose,” says Johann Haber.

    “Being able to prepare and measure polaritons of this type in the X-ray range is an important step on the path to the high-precision creation of radiation fields by modern X-ray sources, especially by the new X-ray lasers,” explains Ralf Röhlsberger, the researcher from DESY who was in charge of this work group. “The simultaneous emission of many identical photons during the decay of nuclear polaritons could lead to extremely narrow-band, non-classical light sources in the X-ray range, and open the way for new applications in high-precision spectroscopy.” At the same time, the experiment is a further step towards establishing quantum optics in the X-ray domain.

    *Science paper:
    Collective strong coupling of X-rays and nuclei in a nuclear optical lattice

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international cooperations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

     
  • richardmitnick 8:44 am on May 3, 2016
    Tags: Applied Research & Technology

    From Harvard: “Mapping the circuit of our internal clock” 

    Harvard University

    May 2, 2016
    Leah Burrows

    Research sheds light on the neural structure that controls our sleep, eating habits, hormones and more.

    What’s that old saying about Mussolini? Say what you will but he made the trains run on time. Well, the suprachiasmatic nucleus — SCN for short — makes everything in the body run on time. The SCN is the control center for our internal genetic clock, the circadian rhythms which regulate everything from sleep to hunger, insulin sensitivity, hormone levels, body temperature, cell cycles and more.

    The SCN has been studied extensively but the underlying structure of its neural network has remained a mystery.

    Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), the University of California Santa Barbara, and Washington University in St. Louis have shown for the first time how neurons in the SCN are connected to each other, shedding light on this vital area of the brain. Understanding this structure — and how it responds to disruption — is important for tackling illnesses like diabetes and posttraumatic stress disorder. Scientists have also found that disruptions to these rhythms, such as shifts in work schedules or blue-light exposure at night, can negatively impact overall health.

    The research was recently published* in the Proceedings of the National Academy of Sciences (PNAS).

    “The SCN has been so challenging to understand because the cells within it are incredibly noisy,” said John Abel, first author of the paper and a graduate student at SEAS. “There are more than 20,000 neurons in the SCN, each of which not only generates its own autonomous circadian oscillations but also communicates with other neurons to maintain stable phase lengths and relationships. We were able to cut through that noise and figure out which cells share information with each other.”

    The SCN looks like a miniature brain, with two hemispheres, inside the hypothalamus. It receives light cues from the retina to help it keep track of time and reset when necessary. When functioning properly, the neurons inside both hemispheres oscillate in a synchronized pattern.

    In order to understand the structure of the network, Abel and the team had to disrupt that pattern. The researchers used a potent neurotoxin commonly found in pufferfish to desynchronize the neurons in each hemisphere, turning the steady, rhythmic pulse of oscillations into a cacophony of disconnected beats. The team then removed the toxin and observed the network as it reestablished communication, using information theory to figure out which cells had to communicate to resynchronize the whole network.
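
    A minimal sketch of the information-theoretic idea (the study's actual pipeline is more involved): estimate mutual information between pairs of cells' activity traces during resynchronization, and infer that pairs with high mutual information share information. All signals and parameters below are synthetic.

    ```python
    # Histogram-based mutual information between synthetic "cell" traces:
    # a phase-shifted follower of a 24 h rhythm scores high, noise scores low.
    import numpy as np

    rng = np.random.default_rng(0)

    def mutual_information(x, y, bins=8):
        """Histogram-based MI estimate between two 1-D signals, in bits."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return float((pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum())

    t = np.linspace(0, 96, 500)                       # hours
    core = np.sin(2 * np.pi * t / 24)                 # a "core" cell, 24 h rhythm
    follower = np.sin(2 * np.pi * (t - 2) / 24) + 0.3 * rng.standard_normal(t.size)
    stranger = rng.standard_normal(t.size)            # unrelated noisy cell

    print("MI core-follower:", round(mutual_information(core, follower), 2))
    print("MI core-stranger:", round(mutual_information(core, stranger), 2))
    ```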

    Before the delivery of the neurotoxin (left) the SCN oscillate in a synchronized pattern. After the delivery of the neurotoxin (right), neurons in the SCN oscillate randomly. (Image courtesy of the Doyle Lab)

    “It’s like trying to figure out if a group of people are friends without being able to look at their phone calls or their text messages,” Abel said. “In a large group of other people, you might not be able to tell who is in contact with each other, but if a certain group shows up together at a party, you can probably assume they’re friends because they show similar behavior.”

    By observing the SCN at single-cell resolution, Abel and the team identified a core group of very friendly neurons in the center of each hemisphere that share a lot of information during resynchronization. They also observed dense connections between the hubs of each hemisphere. The neurons outside these central hubs, in the area called the shell, behaved more like acquaintances than friends, sharing little information amongst themselves.

    Neurons in the SCN resynchronize after being exposed to a neurotoxin. (Image courtesy of the Doyle Lab)

    “We were surprised to find that the shell lacked a functionally connected cluster of neurons,” said Abel. “We’ve known that exposure to an artificially long day can split the SCN into core and shell phase clusters which oscillate out of sync with each other. We’ve assumed that the neurons in the shell communicated to synchronize that rhythm but our research suggests that phase clustering in the shell is actually mediated by the core neurons.”

    Previous research also assumed that the core SCN was dominant only due to its role in receiving light cues from the eyes. By using the neurotoxin to disrupt circadian rhythms, Abel and the team demonstrated that the core is the key to resynchronization even without light cues.

    “For the last 15 years our group has been studying the complex control mechanisms that are responsible for the generation of robust circadian rhythms in the brain,” said Frank Doyle, the John A. Paulson Dean and John A. & Elizabeth S. Armstrong Professor of Engineering & Applied Sciences, who co-authored the paper. “This work brings us one step closer to reverse engineering those paradigms by elucidating the topology of communication amongst neurons, thus demonstrating the importance of a systems perspective to link genes to cells to the SCN tissue.”

    The research was coauthored by Kirsten Meeker, Peter St. John, Benjamin Bales and Linda Petzold of UC Santa Barbara and Thomas Wang, Daniel Granados-Fuentes and Eric Herzog of Washington University. It was funded by the National Institutes of Health and the US Army Research Office.

    *Science paper:
    Functional network inference of the suprachiasmatic nucleus

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Harvard University campus

    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 8:31 am on May 3, 2016
    Tags: Applied Research & Technology

    From XFEL: “World’s most precise mirror arrives in Hamburg” 

    XFEL bloc

    European XFEL

    03 May 2016

    The first of several ultraflat mirrors is a milestone of a rigorous research and development effort.

    A 95-cm long mirror that is more precise than any other yet built was delivered to European XFEL, an X-ray laser research facility that is under construction in the Hamburg area of Germany. The mirror is superflat and does not deviate from a perfect surface by more than one nanometre, or a billionth of a metre. It is the first of several of its kind needed for the European XFEL. Each will be essential to the facility’s operation, enabling scientists from around the globe to reliably use the world’s brightest X-ray laser light for research into ultrafast chemical processes, complex molecular structures, and extreme states of matter. The precision of the European XFEL mirror is equivalent to a 40-km long road not having any bumps larger than the width of a hair. The mirror’s production is the culmination of a long research and development process involving several institutes and companies in Japan, France, Italy, and Germany.
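
    The road analogy checks out with simple scaling arithmetic (a sketch using the article's numbers and a typical hair width of 50 to 100 micrometres):

    ```python
    # Scale a 1 nm height error on a 95 cm mirror up to a 40 km road.
    mirror_len_m = 0.95        # 95 cm mirror
    flatness_m = 1e-9          # 1 nm peak deviation
    road_len_m = 40_000.0      # 40 km road

    bump_m = flatness_m * (road_len_m / mirror_len_m)
    print(f"Equivalent bump over 40 km: {bump_m * 1e6:.0f} micrometres")
    # ~42 micrometres: comfortably below the width of a hair.
    ```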

    The mirror body, with a 95 cm long and 5.2 cm wide reflective face, is made from a single crystal of silicon that was crafted by industrial partners in France and Italy. In order to polish a mirror of the required length to European XFEL’s nanometre specification, the optics company JTEC in Osaka, Japan, used a new polishing method involving a pressurized fluid bath capable of stripping atom-thick layers off the crystal. This development required the construction of a brand-new facility that would be able to meet the exceptional demands from the European XFEL, while also expanding the company’s ability to serve other, similar facilities, such as the LCLS in the U.S. and SwissFEL in Switzerland.

    SLAC/LCLS

    SwissFEL, Paul Scherrer Institute

    The polishing technique alone took nearly a year to develop to a point where the extreme quality could be reached.

    European XFEL scientist Maurizio Vannoni inspects the delivered superflat mirror, which does not deviate from a perfect surface by more than a billionth of a metre. European XFEL

    “When we first started working on these optics, we were looking for something that simply didn’t exist at anywhere near this precision”, says Harald Sinn, who leads the European XFEL X-Ray Optics group. “Now we have the first ever mirror at this extreme specification.”

    The mirrors have to be so precise because of the laser properties of the X-rays at the European XFEL. These properties are essential to clearly image matter at the atomic level. Previously, European XFEL simulations had shown that any distortions in the mirrors greater than one nanometre would cause the properties of the laser spot on the sample to be degraded.

    Mirrors of this series will be used to deflect the X-rays by up to a few tenths of a degree into the European XFEL’s six scientific instruments in its underground experiment hall in the town of Schenefeld. This is done because the instruments, which are parallel to each other, will eventually be able to operate in parallel, enabling scientists to have greater access to the facility and its unique X-ray light. Additionally, similar mirrors will focus the X-ray light within some of the facility’s instruments.

    However, the particular mirror that was delivered is needed for filtering the light generated by the facility to only the kind needed for experiments. Within the European XFEL’s X-ray laser light-generating structures, called undulators, some undesirable wavelengths of light are produced. A set of these superflat mirrors will be arranged after each undulator in the facility’s underground tunnels, and the position of each mirror allows for only the desired wavelength of laser light to continue towards the experiment hall. The undesirable wavelengths of light are more energetic and pass through the mirror instead of reflecting, ending up in an adjacent absorber made of boron carbide and tungsten.

    The mirror will now be measured at European XFEL and Helmholtz Zentrum Berlin for additional verification of its specifications. Three more mirrors of the same type are due to arrive at European XFEL in May.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    XFEL Campus

    The Hamburg area will soon boast a research facility of superlatives: The European XFEL will generate ultrashort X-ray flashes—27 000 times per second and with a brilliance that is a billion times higher than that of the best conventional X-ray radiation sources.

    The outstanding characteristics of the facility are unique worldwide. Starting in 2017, it will open up completely new research opportunities for scientists and industrial users.

     
  • richardmitnick 7:57 am on May 3, 2016
    Tags: Applied Research & Technology

    From PNNL: “A New Model for Simulating DNA’s ‘Atmosphere’ of Ions” 

    PNNL BLOC
    PNNL Lab

    The study compared two models of very large ions (macroions) that carry an electrical charge, testing the results against experimental studies. On the left is a model of a cylinder with uniform axial charge density; on the right is the more complex (and useful) discrete charge model.

    April 2016

    Refined insights into critical ionic interactions with nature’s building blocks

    Nucleic acids, large biomolecules essential to life, include the familiar double-stranded deoxyribonucleic acid (DNA), a very stable long molecule that stores genetic information.

    In nature, DNA exists within a solution rife with electrostatically charged atoms or molecules called ions. A recent study by researchers at Pacific Northwest National Laboratory (PNNL) proposed a new model of how B-DNA, the form of DNA that predominates in cells, is influenced by the water-and-ions “atmosphere” around it.

    Understanding the ionic atmosphere around nucleic acids, and being able to simulate its dynamics, is important. After all, this atmosphere stabilizes DNA’s structure; it impacts how DNA is folded and “packed” in cells, which triggers important biological functions; and it strongly influences how proteins and drugs bind to DNA.

    The research combines theoretical modeling and experiments in a study of ion distribution around DNA. It was led by PNNL physical scientist Maria Sushko, computational biologist Dennis Thomas, and applied mathematician Nathan Baker, in concert with colleagues from Cornell University and Virginia Tech.

    Earlier approaches could simulate the distribution of ions around biomolecules like DNA, but only roughly. The PNNL-led study goes beyond commonplace electrostatics to propose a more refined, yet still computationally efficient, model of what happens in these critical ionic atmospheres.

    “The main idea was to dissect the complex interplay of interactions, and to understand the main forces driving ions deep inside the DNA helix and the forces keeping them on its surface,” said Sushko, the paper’s first author. That interplay includes the correlation of ions within the solution, how they move, how they interact with one another, and how they interact with the DNA.

    The new model has two key advantages over older simulations. First, it allows researchers to turn ion-water and ion-ion interactions on and off at will: “We can calculate important interactions independently,” she said, a flexibility not present in previous simulations. Second, it is computationally efficient, allowing researchers to cheaply simulate large-scale molecular events over long time scales.
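    The paper’s actual free-energy functional is not reproduced in the article, but the “switchable interactions” idea can be illustrated with a toy energy model in which each correction term is enabled independently. All names and numbers below are hypothetical, chosen only to show the design pattern.

    ```python
    from dataclasses import dataclass

    @dataclass
    class InteractionToggles:
        """Hypothetical switches mirroring the model's separable interaction terms."""
        ion_ion_correlation: bool = True
        ion_water_solvation: bool = True

    def total_energy(mean_field, correlation, solvation, toggles):
        """Sum a baseline mean-field electrostatic term with optional corrections.
        Energies are in arbitrary units; this shows the pattern, not the paper's model."""
        energy = mean_field
        if toggles.ion_ion_correlation:
            energy += correlation
        if toggles.ion_water_solvation:
            energy += solvation
        return energy

    # Attribute the difference between two runs to a single interaction:
    full = total_energy(-12.0, -1.5, 2.3, InteractionToggles())
    no_corr = total_energy(-12.0, -1.5, 2.3,
                           InteractionToggles(ion_ion_correlation=False))
    print(f"correlation contribution ~ {full - no_corr:+.1f} (arbitrary units)")
    ```

    Running the same system with one term switched off and differencing the results is what lets an observed effect, such as groove occupancy, be attributed to a specific interaction.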

    Results: Importantly, both previous and new experiments by the Cornell colleagues measured the number of bound ions around DNA, and earlier simplified models were able to reproduce that number. But the new model “is richer than that,” said Sushko, because it gives more detail about how ions are distributed along the surface of DNA and within DNA’s critical grooves. “DNA interaction will strongly depend on where those ions sit,” she said. For one, the presence of ions in the grooves relates to how compact DNA will be: “The more ions within the grooves,” said Sushko, “the more compact the structure.”

    The researchers confirmed that ion-ion “correlation,” a measure of how strongly ions influence one another’s positions, allowed DNA to pack more tightly by effectively neutralizing DNA’s electrostatic charge. They also examined solvation, the interaction between ions and the surrounding water. The stronger the water-ion interaction, the larger the effective ion size, and the less likely the ion was to settle in DNA’s grooves. More strongly solvated ions therefore create a different environment for DNA folding.
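    The size argument is essentially geometric, and a toy check makes it concrete. The sketch below assumes a B-DNA minor-groove width of roughly 1.2 nanometers (a textbook figure) and illustrative ionic radii; none of these numbers come from the paper.

    ```python
    # Toy geometric check: does a hydrated ion fit into a DNA groove?
    # All lengths in nanometers; values are illustrative, not from the paper.

    MINOR_GROOVE_WIDTH_NM = 1.2  # approximate textbook width for B-form DNA

    def fits_in_groove(bare_radius_nm, hydration_shell_nm,
                       groove_width_nm=MINOR_GROOVE_WIDTH_NM):
        """Compare the hydrated ion's diameter with the groove width."""
        hydrated_diameter_nm = 2 * (bare_radius_nm + hydration_shell_nm)
        return hydrated_diameter_nm <= groove_width_nm

    print(fits_in_groove(0.15, 0.10))  # thin hydration shell  -> True, can enter
    print(fits_in_groove(0.10, 0.55))  # thick hydration shell -> False, stays out
    ```

    In the actual model the exclusion is energetic rather than a hard geometric cutoff, but the trend is the same: the thicker the hydration shell, the less likely the ion is to sit inside a groove.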

    Researchers also tracked how three types of salts behaved within the simulated ionic environment. Small, singly charged ions did not interact strongly with water; about 50 percent of these bound ions could penetrate into DNA grooves. Large, triply charged ions were not strongly hydrated, but their size prevented penetration into the grooves. (“They just decorate the surface,” said Sushko.)

    Only 15 to 20 percent of doubly charged ions, which were strongly hydrated and strongly correlated, settled in DNA grooves. That showed a “very delicate interplay” of ion-to-ion and ion-to-water interactions, according to Sushko.

    Why It Matters: These results highlight how the properties of electrolyte solutions shape the ionic atmosphere that governs DNA condensation. This “packing” of DNA, otherwise one of the longest molecules in nature, is essential to DNA’s role in gene regulation, and condensation is also key to protein and drug binding. The work therefore points to practical applications in medicine and biotechnology.

    This research also highlights the impact of the ionic atmosphere on the interaction between a biomolecule and a ligand: the molecule, ion, or protein that binds to a protein or to the DNA double helix for some biological purpose.

    But it is the “methodology itself,” not the specific DNA simulations, that matters most, said Sushko, in part because it provides a new computational way to see into complex molecular systems. “We get a better fundamental understanding of the important forces.”

    Methods: Researchers employed two coarse-grained models to simulate the DNA macroion, a large ion carrying a substantial charge. The goal was to capture, at two levels of detail, how ions spread out in a solvent and how they interact with the modeled DNA structure.

    One DNA model posited an infinitely long cylinder with uniform charge density along its axis. Sushko called it “a very crude model used a lot in the past. It explains quite a lot about ion interactions, but it is deficient in some ways.” The second, more complex “discrete charge” model posited three types of spheres in a helical array that mimics B-form DNA. Its three-dimensional character allowed ions to penetrate into DNA grooves.
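    The article does not reproduce the model’s geometry, but the general idea of a discrete-charge helix is straightforward to sketch: point charges placed at phosphate-like positions along two backbones wound around a common axis, using canonical B-DNA parameters. The numbers below are textbook approximations, and the one-charge-per-site simplification is an assumption here, not the paper’s three-sphere construction.

    ```python
    import math

    # Canonical B-DNA geometry (textbook approximations, assumed here):
    RISE_NM = 0.34             # axial rise per base pair
    TWIST_DEG = 36.0           # helical twist per base pair (10 bp per turn)
    RADIUS_NM = 0.94           # approximate radial position of the phosphates
    STRAND_OFFSET_DEG = 120.0  # angular offset between the two backbones; the
                               # asymmetry creates distinct major and minor grooves

    def phosphate_sites(n_base_pairs):
        """One negative point charge per phosphate on each backbone strand."""
        sites = []
        for i in range(n_base_pairs):
            z = i * RISE_NM
            phi = math.radians(i * TWIST_DEG)
            for offset in (0.0, math.radians(STRAND_OFFSET_DEG)):
                x = RADIUS_NM * math.cos(phi + offset)
                y = RADIUS_NM * math.sin(phi + offset)
                sites.append((x, y, z, -1.0))  # charge in units of e
        return sites

    sites = phosphate_sites(10)  # one full helical turn
    print(len(sites), "charges; first site:", sites[0])
    ```

    Because the charges sit on two angularly offset helices, this geometry has genuine major and minor grooves for ions to occupy, which a uniformly charged cylinder lacks entirely.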

    The DNA simulations were run through four computational models based on classical density functional theory to assess the energetics of different ion-DNA interactions. Results were also compared with data from what Sushko called “state-of-the-art experiments” using anomalous small-angle X-ray scattering, a technique for probing nanometer-scale structures that yields detailed information about how ions are distributed around a biomolecule.

    The uniformly charged cylinder model was not good at simulating the ionic atmosphere around DNA. “This model is a very common simplification,” said Sushko. “You get the same number of ions attached to DNA, but the distribution is completely wrong. In this model, ions will just sit somewhere on the surface.”
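    For contrast, the uniformly charged cylinder admits a closed-form mean-field solution in the linearized (Debye-Hückel) limit, which makes both its appeal and its failure easy to see. The sketch below uses SciPy’s modified Bessel functions with illustrative parameters in reduced units; it is a textbook baseline, not the paper’s classical density functional machinery.

    ```python
    import numpy as np
    from scipy.special import k0, k1

    # Debye-Hückel potential around an infinite, uniformly charged cylinder.
    # Illustrative parameters in reduced units (not taken from the paper):
    KAPPA = 1.0         # inverse Debye length, 1/nm (~100 mM 1:1 salt)
    RADIUS = 1.0        # cylinder radius, nm
    LINE_CHARGE = -5.9  # charges (e) per nm; roughly B-DNA's -2e per 0.34 nm rise
    BJERRUM = 0.71      # Bjerrum length of water at room temperature, nm

    def reduced_potential(r_nm):
        """Dimensionless potential e*psi/kT at radial distance r from the axis."""
        prefactor = 2 * BJERRUM * LINE_CHARGE / (KAPPA * RADIUS * k1(KAPPA * RADIUS))
        return prefactor * k0(KAPPA * r_nm)

    r = np.linspace(1.0, 5.0, 5)
    enhancement = np.exp(-1.0 * reduced_potential(r))  # monovalent counterion, z = +1
    for ri, ni in zip(r, enhancement):
        print(f"r = {ri:.1f} nm: counterion density / bulk ~ {ni:.1f}")
    ```

    Everything in this model depends only on the radial distance, so it can only smear ions uniformly around the surface, exactly the failure Sushko describes; the enormous predicted enhancement at contact also shows the linearization straining at DNA-like charge densities.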

    Their more complex discrete charge model, by contrast, provided a much more realistic portrait of ion distribution in the ionic atmosphere. Its simulations showed ions both clinging to the helical DNA surface and penetrating into the DNA’s grooves. “The small details of ion penetration are very important for the way DNA will package the chromosome,” she said.

    What’s Next? Researchers plan to study the role of the ionic atmosphere in mediating interactions between DNA molecules. They also plan to extend their DNA model to include DNA sequence-specific effects, which often influence ion binding, and DNA sequence-dependent structural variations.

    Science paper:
    The role of correlation and solvation in ion interactions with B-DNA

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.
