Tagged: Applied Research & Technology

  • richardmitnick 7:50 am on May 2, 2015
    Tags: Applied Research & Technology

    From Princeton: “Digging for Meaning in the Big Data of Human Biology” 

    Princeton University

    April 28, 2015
    No Writer Credit

    Since the Human Genome Project drafted the human body’s genetic blueprint more than a decade ago, researchers around the world have generated a deluge of information related to genes and the role they play in diseases like hypertension, diabetes, and various cancers.

    Although thousands of studies have made discoveries that promise a healthier future, crucial questions remain. An especially vexing challenge has been to identify the function of genes in specific cells, tissues, and organs. Because tissues in living people cannot be studied by direct experimentation, and many disease-relevant cell types cannot be isolated for analysis, the data have emerged in bits and pieces through studies that produced mountains of disparate signals.

    A multi-year effort by researchers from Princeton and other universities and medical schools has taken a big step toward extracting knowledge from these big data collections and opening the door to new understanding of human illnesses. Their paper, published online by the prestigious biology journal Nature Genetics, demonstrates how computer science and statistical methods can comb broad expanses of diverse data to identify how genetic circuits function and change in different tissues relevant to disease.

    Led by Olga Troyanskaya, professor in the Department of Computer Science and the Lewis-Sigler Institute of Integrative Genomics and deputy director for genomics at the Simons Center for Data Analysis in New York, the team used integrative computational analysis to dig out interconnections and relationships buried in the data pile. The study collected and integrated about 38,000 genome-wide experiments from an estimated 14,000 publications. Their findings produced molecular-level functional maps for 144 different human tissues and cell types, including many that are difficult or impossible to uncover experimentally.

    “A key challenge in human biology is that genetic circuits in human tissues and cell types are very difficult to study experimentally,” Troyanskaya said. “For example, the podocyte cells in the kidneys, which are the cells that perform the filtering that the kidneys are responsible for, cannot be isolated and studied experimentally. Yet we must understand how proteins interact in these cells if we want to understand and treat chronic kidney disease. Our approach mines big data collections to build a map of how genetic circuits function in the podocyte cells, as well as in many other disease-relevant tissues and cell types.”

    These networks allow biomedical researchers to understand the function and interactions of genes in specific cellular contexts and can illuminate the molecular basis of many complex human diseases. The researchers developed an algorithm, which they call a network-guided association study, or NetWAS, that combines these tissue-specific functional maps with standard genome-wide association studies (GWAS) in order to identify genes that are causal drivers of human disease. Because the technique is completely data-driven, NetWAS avoids biases toward well-studied genes and diseases — enabling discovery of completely new disease-associated genes, processes, and pathways.
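
    In outline, that re-ranking step is easy to sketch. The snippet below is only a schematic of the NetWAS idea as described here: nominally significant GWAS genes serve as noisy positive labels, each gene's row of tissue-network edge weights serves as its feature vector, and a classifier re-ranks all genes. The gene count, p-value cutoff, and choice of scikit-learn's LinearSVC are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.svm import LinearSVC

        def netwas_rank(adjacency, gwas_p, p_cutoff=0.01):
            # Nominally significant GWAS genes become (noisy) positive labels.
            labels = (gwas_p < p_cutoff).astype(int)
            # A gene's feature vector is its row of tissue-network edge weights.
            clf = LinearSVC(C=1.0).fit(adjacency, labels)
            # Distance from the decision boundary re-ranks every gene, pulling
            # up unlabeled genes that are well connected to the positive set.
            return np.argsort(-clf.decision_function(adjacency))

        # Toy usage: a random 100-gene network with five planted "GWAS hits".
        rng = np.random.default_rng(0)
        adj = rng.random((100, 100))
        adj = (adj + adj.T) / 2                  # symmetric edge weights
        pvals = rng.uniform(0.02, 1.0, 100)
        pvals[:5] = 1e-4
        print(netwas_rank(adj, pvals)[:10])      # top candidate genes first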

    To put NetWAS and the tissue-specific networks in the hands of biomedical researchers around the world, the team created an interactive server called GIANT (for Genome-scale Integrated Analysis of Networks in Tissues). GIANT allows users to explore these networks, compare how genetic circuits change across tissues, and analyze data from genetic studies to find genes that cause disease.

    Aaron K. Wong, a data scientist at the Simons Center for Data Analysis and formerly a graduate student in the computer science department at Princeton, played the lead role in creating GIANT. “Our goal was to develop a resource that was accessible to biomedical researchers,” he said. “For example, with GIANT, researchers studying Parkinson’s disease can search the substantia nigra network, which represents the brain region affected by Parkinson’s, to identify new genes and pathways involved in the disease.” Wong is one of three co-first authors of the paper.

    The paper’s other two co-first authors are Arjun Krishnan, a postdoctoral fellow at the Lewis-Sigler Institute; and Casey Greene, an assistant professor of genetics at Dartmouth College, who was a postdoctoral fellow at Lewis-Sigler from 2009 to 2012. The team also included Ran Zhang, a graduate student in Princeton’s Department of Molecular Biology, and Kara Dolinski, assistant director of the Lewis-Sigler Institute.

    Looking to the future, Troyanskaya sees practical therapeutic uses for the group’s findings about the interrelatedness of genetic actions. “Biomedical researchers can use these networks and the pathways that they uncover to understand drug action and side effects, and to repurpose drugs,” she said. “They can also be useful for understanding how various therapies work and how to develop new ones.”

    Other contributors to the study were Emanuela Ricciotti, Garret A. FitzGerald, and Tilo Grosser of the Department of Pharmacology and the Institute for Translational Medicine and Therapeutics at the Perelman School of Medicine, University of Pennsylvania; Rene A. Zelaya, of Dartmouth; Daniel S. Himmelstein, of the University of California, San Francisco; Boris M. Hartmann, Elena Zaslavsky, and Stuart C. Sealfon, of the Department of Neurology at the Icahn School of Medicine at Mount Sinai, in New York; and Daniel I. Chasman, of Brigham and Women’s Hospital and Harvard Medical School in Boston.

    The Simons Center for Data Analysis was formed in 2013 by the Simons Foundation, a private organization dedicated to advancing research in mathematics and the basic sciences.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

  • richardmitnick 1:24 pm on May 1, 2015
    Tags: Applied Research & Technology

    From U Washington: “Seafloor sensors record possible eruption of underwater volcano” 

    University of Washington

    April 30, 2015
    Hannah Hickey

    If a volcano erupts at the bottom of the sea, does anybody see it? If that volcano is Axial Seamount, about 300 miles offshore and 1 mile deep, the answer is now: yes.

    Axial Seamount and associated vent fields

    Thanks to a set of high-tech instruments installed last summer by the University of Washington to bring the deep sea online, what appears to be an eruption of Axial Volcano on April 23 was observed in real time by scientists on shore.

    “It was an astonishing experience to see the changes taking place 300 miles away with no one anywhere nearby, and the data flowed back to land at the speed of light through the fiber-optic cable connected to Pacific City — and from there, to here on campus by the Internet, in milliseconds,” said John Delaney, a UW professor of oceanography who led the installation of the instruments as part of a larger effort sponsored by the National Science Foundation.

    This custom-built precise pressure sensor detects the seafloor’s rise and fall as magma, or molten rock, moves in and out of the underlying magma chamber. Three are installed on the caldera of the underwater volcano. Credit: NSF-OOI/UW/CSSF

    Delaney organized a workshop on campus in mid-April at which marine scientists discussed how this high-tech observatory would support their science. Then, from just before midnight on April 23 until about noon the next day, the seismic activity went off the charts.

    The gradually increasing rumblings of the mountain were documented over recent weeks by William Wilcock, a UW marine geophysicist who studies such systems.

    During last week’s event, the earthquakes increased from hundreds per day to thousands per day, and the center of the volcanic crater dropped by about 6 feet (2 meters) over the course of 12 hours.

    “The only way that could have happened was to have the magma move from beneath the caldera to some other location,” Delaney said, “which the earthquakes indicate is right along the edge of the caldera on the east side.”

    The seismic activity was recorded by eight seismometers that measure shaking up to 200 times per second around the caldera and at the base of the 3,000-foot seamount. The height of the caldera was tracked by the bottom pressure tilt instrument, which measures the pressure of the water overhead and then removes the effect of tides and waves to calculate its position.
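
    The pressure-to-depth conversion is simple hydrostatics: once tides and waves are modeled out, a residual pressure change implies a height change of Δh = Δp / (ρg). A back-of-the-envelope sketch, assuming textbook values for seawater density and gravity rather than the instrument's actual calibration:

        RHO_SEAWATER = 1025.0  # kg/m^3, assumed typical deep-seawater density
        G = 9.81               # m/s^2

        def depth_change_m(residual_pressure_pa):
            """Seafloor height change implied by a residual pressure change
            (tides and waves already removed); positive = more water overhead."""
            return residual_pressure_pa / (RHO_SEAWATER * G)

        # A 2 m drop of the caldera floor puts ~2 m more water above the sensor:
        dp = RHO_SEAWATER * G * 2.0
        print(f"2 m subsidence -> {dp / 1000:.1f} kPa pressure increase")
        print(f"inverse check:   {depth_change_m(dp):.2f} m")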

    The depth instrument was developed by Bill Chadwick, an oceanographer at Oregon State University and the National Oceanic and Atmospheric Administration who has also been tracking the activity at Axial Volcano and predicted that the volcano would erupt in 2015.

    The most recent eruptions were in 1998 and 2011.

    After the 2011 eruption, the dark flow at right is completely covered by a layer of glass that forms when lava, at more than 2,000 deg F, meets the near-freezing seawater. Credit: NSF-OOI/UW/CSSF

    The volcano is located about 300 miles west of Astoria, Oregon, on the Juan de Fuca Ridge, part of the globe-girdling mid-ocean ridge system — a continuous, 70,000 km (43,500 miles) long submarine volcanic mountain range stretching around the world like the seams on a baseball, and where about 70 percent of the planet’s volcanic activity occurs.

    The Juan de Fuca Ridge, off the Pacific Northwest coast, shown on a world map

    The highly energetic Axial Seamount, Delaney said, is viewed by many scientists as being representative of the myriad processes operating continuously along the powerful subsea volcanic chain that is present in every ocean.

    “This exciting sequence of events documented by the OOI-Cabled Array at Axial Seamount gives us an entirely new view of how our planet works,” said Richard Murray, division director for ocean sciences at the National Science Foundation. “Although the OOI-Cabled Array is not yet fully operational, even with these preliminary observations we can see how the power of innovative instrumentation has the potential to teach us new things about volcanism, earthquakes and other vitally important scientific phenomena.”

    The full set of instruments in the deep-sea observatory is scheduled to come online this year. A first maintenance cruise leaves from the UW in early July, and will let researchers and students further explore the aftermath of the volcanic activity.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 12:15 pm on April 29, 2015
    Tags: Applied Research & Technology

    From AAAS: “Heartland danger zones emerge on new U.S. earthquake hazard map” 

    AAAS

    28 April 2015
    Eric Hand

    New map highlights earthquake risk zones. Blue boxes indicate areas with induced, or human-caused, quakes. Credit: USGS

    The gentle landscape of southern Kansas doesn’t exactly shout “earthquake country.” Until recently, the notoriously flat state had just two of the seismic stations used for recording and locating earthquakes. Now, 21 are in place. They have been sorely needed. Since 2013, 192 earthquakes bigger than magnitude 2 have hit Harper and Sumner counties, on the border with Oklahoma, up from just two in the previous 35 years. “It feels like we’re on the front lines of this thing,” says Rex Buchanan, the state geologist for the Kansas Geological Survey in Lawrence.

    Across the U.S. heartland, an oil and gas boom has driven a surge of small to moderate earthquakes. Scientists say that deep underground injection of wastewater from oil and gas operations is triggering the tremors by pushing critically stressed faults past the snapping point. On 23 April, the U.S. Geological Survey (USGS) released a report that, for the first time, accounts for these human-caused, or induced, earthquakes in a map of seismic hazards across the country. The new map highlights 17 areas in eight states with frequent induced earthquakes (see boxed areas on map). The probability of dangerous levels of ground shaking in some of these areas, such as the one that bleeds from central Oklahoma into southern Kansas, rivals that of California, the traditional earthquake king. “It was kind of a surprise,” says Mark Petersen, chief of the USGS National Seismic Hazard Mapping Project in Golden, Colorado.

    So far, most induced earthquakes have done no more than rattle windows. But a few have been big enough to damage buildings, and now USGS says that it can’t rule out the possibility of a magnitude-7 temblor, which would cause widespread damage.

    USGS researchers had to develop new methods to make the map. Typically, in predicting future earthquake behavior, they assume that past is prologue. In places like California, where quakes are set off by well-understood forces that cause tectonic plates to grind past each other, seismologists can invoke centuries of earthquake statistics. But for the new, induced earthquake regions, the researchers modeled the future hazard based on tremors only in the past year. They also predicted the hazard just 1 year into the future, rather than offering the usual 50-year prediction.
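
    The flavor of that short-window forecast can be shown with a toy calculation: take the rate of small quakes observed in the past year, scale it to larger magnitudes with a Gutenberg-Richter relation, and convert the rate into a one-year Poisson exceedance probability. The counts and b-value below are invented for illustration; the actual USGS model is far more elaborate.

        import math

        n_obs = 190                          # hypothetical M>=2 quakes seen in one year
        m_obs, m_target, b = 2.0, 5.0, 1.0   # assumed Gutenberg-Richter b-value

        # Gutenberg-Richter: event rate falls by a factor of 10^b per magnitude unit.
        rate_target = n_obs * 10 ** (-b * (m_target - m_obs))

        # Poisson probability of at least one M>=5 event in the next year.
        p_exceed = 1.0 - math.exp(-rate_target)
        print(f"expected M>={m_target} rate: {rate_target:.3f}/yr, "
              f"P(at least one in a year) = {p_exceed:.1%}")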

    That short time window is a challenge for engineers who design bridges and buildings meant to last decades, as it says nothing about the hazards a structure will face over its lifetime. Petersen isn’t sure how engineers will use the new information, but he says the agency couldn’t be confident in longer-term predictions when so many factors—the price of oil, the actions of regulators—could influence earthquake rates.

    Some of these feedback loops may already be having an effect. The price of oil has dropped drastically in the past year, and many operations are slowing down or shuttering. Some states are also beginning to crack down. On 19 March, Kansas’s regulator, the Kansas Corporation Commission, issued an order that would reduce saltwater injection in Harper and Sumner counties by up to 60% for some wells—its first response to specific wells that seem to be triggering quakes. Buchanan says that April has been the quietest month for his state since August 2014, with just six earthquakes. “It’s been a little bit of a roller coaster ride over the last 18 months,” he says.

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    Stem Education Coalition

     
  • richardmitnick 8:03 pm on April 28, 2015
    Tags: Applied Research & Technology

    From NASA: “NASA Successfully Tests Shape-Changing Wing for Next Generation Aviation” 

    NASA

    April 28, 2015

    NASA researchers, working in concert with the Air Force Research Laboratory (AFRL) and FlexSys Inc., of Ann Arbor, Michigan, successfully completed initial flight tests of a new morphing wing technology that has the potential to save millions of dollars annually in fuel costs, reduce airframe weight and decrease aircraft noise during takeoffs and landings.

    The test team at NASA’s Armstrong Flight Research Center in Edwards, California, flew 22 research flights during the past six months with experimental Adaptive Compliant Trailing Edge (ACTE) flight control surfaces that offer significant improvements over conventional flaps used on existing aircraft.

    “Armstrong’s work with ACTE is a great example of how NASA works with our government and industry partners to develop innovative technologies that make big leaps in efficiency and environmental performance,” said Jaiwon Shin, associate administrator for NASA’s Aeronautics Research Mission Directorate at the agency’s headquarters in Washington. “This is consistent with the agency’s goal to support the nation’s leadership in the aviation sector.”

    NASA successfully completed flight tests of a morphing wing technology. Flap angles were adjusted from -2 degrees up to 30 degrees during the six months of testing. Credits: NASA

    AFRL began work with FlexSys in 1998 through the Small Business Innovation Research (SBIR) program. AFRL and FlexSys developed and wind-tunnel tested several wing leading- and trailing-edge designs for various aircraft configurations through 2006. In 2009, AFRL and NASA’s Environmentally Responsible Aviation (ERA) project agreed to equip a Gulfstream III jet with ACTE flaps designed and built by FlexSys, incorporating its proprietary technology.

    ACTE technology, which can be retrofitted to existing airplane wings or integrated into entirely new airframes, enables engineers to reduce wing structural weight and to aerodynamically tailor the wings to promote improved fuel economy and more efficient operations while also reducing environmental and noise impacts.

    “The completion of this flight test campaign at Armstrong is a big step for NASA’s Environmentally Responsible Aviation Project,” said ERA project manager Fay Collier. “This is the first of eight large-scale integrated technology demonstrations ERA is finishing up this year that are designed to reduce the impact of aviation on the environment.”

    Flight testing was key to proving the concept’s airworthiness. The test aircraft was flown with its experimental control surfaces at flap angles ranging from -2 degrees up to 30 degrees. Although the flexible ACTE flaps were designed to morph throughout the entire range of motion, each test was conducted at a single fixed setting in order to collect incremental data with a minimum of risk.

    “We are thrilled to have accomplished all of our flight test goals without encountering any significant technical issues,” said AFRL Program Manager Pete Flick, from Wright-Patterson Air Force Base in Ohio. “These flights cap 17 years of technology maturation, beginning with AFRL’s initial Phase 1 SBIR contract with FlexSys, and the technology now is ready to dramatically improve aircraft efficiency for the Air Force and the commercial aviation industry.”

    All the primary and secondary objectives for the test were successfully completed on schedule and within budget. The results of these flight tests will be included in design trade studies performed at NASA’s Langley Research Center in Hampton, Virginia, for designing future large transport aircraft.

    For more information on NASA’s research in next generation aircraft, visit:
    http://www.nasa.gov/subject/7565/future-aircraft/

    Article received by email.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories [Hubble, Chandra, Spitzer] and associated programs. NASA shares data with various national and international organizations, such as the [JAXA] Greenhouse Gases Observing Satellite.

     
  • richardmitnick 2:46 pm on April 28, 2015
    Tags: Applied Research & Technology

    From Rice: “Chromosome-folding theory shows promise” 

    Rice University

    April 28, 2015
    Mike Williams

    Human chromosomes are much bigger and more complex than proteins, but like proteins, they appear to fold and unfold in an orderly process as they carry out their functions in cells.

    Rice University biophysicist Peter Wolynes and postdoctoral fellow Bin Zhang have embarked upon a long project to define that order. They hope to develop a theory that predicts the folding mechanisms and resulting structures of chromosomes in the same general way Wolynes helped revolutionize the view of protein folding through the concept of energy landscapes.

    The first fruit of their quest is a new paper in the Proceedings of the National Academy of Sciences that details a coarse-grained method to “skirt some of the difficulties” that a nucleotide-level analysis of chromosomes would entail.

    Essentially, the researchers are drawing upon frequently observed crosslinking contacts among domains – distinct sequences that form along folding strands of DNA – to apply statistical tools. With these tools, they can build computational models and infer the presence of energy landscapes that predict the dynamics of chromosomes.
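
    One way to picture that statistical step is a simple Boltzmann inversion: contacts observed more often than the baseline for their genomic separation are assigned attractive effective energies, E_ij = -kT ln(p_ij / p_baseline). The sketch below illustrates only that logic; the paper's actual machinery is a maximum-entropy model, and the toy contact map is invented.

        import numpy as np

        def effective_energies(contact_freq, kT=1.0):
            """E_ij = -kT * ln(p_ij / p_baseline): pairs in contact more often
            than average for their separation get negative (attractive) energies."""
            n = contact_freq.shape[0]
            energies = np.zeros_like(contact_freq, dtype=float)
            for sep in range(1, n):
                diag = np.diagonal(contact_freq, sep)
                baseline = diag.mean()       # mean contact at this separation
                vals = -kT * np.log(np.maximum(diag, 1e-12) / baseline)
                idx = np.arange(n - sep)
                energies[idx, idx + sep] = vals
                energies[idx + sep, idx] = vals
            return energies

        # Toy 50-domain contact map decaying with genomic distance, plus one
        # enriched long-range contact standing in for a crosslinking signal.
        n = 50
        i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        p = 1.0 / (1.0 + np.abs(i - j).astype(float))
        p[3, 10] = p[10, 3] = 5 * p[3, 10]
        print(f"enriched pair: {effective_energies(p)[3, 10]:.2f} kT (attractive)")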

    How macromolecules of DNA fold into chromosomes is thought to have a crucial role in biological processes like gene regulation, DNA replication and cell differentiation. The researchers argue that unraveling the dynamics of how they fold and their structural details would add greatly to the understanding of cell biology.

    “It’s inevitable that there’s a state of the chromosome that involves having structure,” Wolynes said. “Since the main theme of our work is gene regulation, it’s something we would naturally be interested in pursuing.”

    But it’s no small task. First, though a chromosome is made of a single strand of DNA, that strand is huge, with millions of subunits. That’s much longer than the average protein and probably a lot slower to organize, the researchers said.

    Second, a large “team of molecular players” is involved in helping chromosomes get organized, and only a few of these relevant proteins are known.

    Third, chromosome organization appears to vary from one cell to the next and may depend on the cell’s type and the phase in its lifecycle.

    All those factors led Wolynes and Zhang to conclude that treating chromosomes exactly as they do proteins — that is, figuring out how and when the individual units along the DNA strand attract and repel each other — would be impractical.

    “But the three-dimensional organization of chromosomes is of critical importance and is worthy of study by Rice’s Center for Theoretical Biological Physics,” Wolynes said. He holds out hope that the theory developed in this study will lead to a more detailed view of chromosome conformations and will result in a better understanding of the relationships of the structure, dynamics and function of the genome.

    He said there is already evidence for the idea that actual gene regulatory processes are influenced by the chromosomes’ structures. He noted that work by Rice colleague Erez Lieberman Aiden to develop high-resolution, three-dimensional maps of folded genomes will be an important step toward specifying their structures.

    One result of the new study was the observation that, at least during interphase, the stage of the cell cycle the Rice team primarily studied, chromosome domains take on the characteristics of liquid crystals. In such a state, the domains remain fluid but become ordered, allowing for locally funneled landscapes that lead to the “ideal” chromosome structures that resemble the speculative versions seen in textbooks.

    Wolynes and Rice colleague José Onuchic, a biophysicist, began developing their protein-folding theory nearly three decades ago. In short, it reveals that proteins, which start as linear chains of amino acids, are programmed by genes to quickly fold into their three-dimensional native states. In doing so, they obey the principle of minimal frustration, in which interactions between individual acids guide the protein to its final, stable form.

    Wolynes used the principle to conceptualize folding as a funnel. The top of the funnel represents all of the possible ways a protein can fold. As individual stages of the protein come together, the number of possibilities decreases. The funnel narrows and eventually guides the protein to its functional native state.

    He hopes the route to understanding chromosome folding will take much less time than the decades it took for his team’s protein-folding work to pay off.

    “We’re not the first in this area,” he said. “A lot of people have said the structure of the chromosome is an important problem. I see it as being as big a field as protein folding was – and when you look at it from that point of view, you realize the state of our ignorance is profound. We’re like where protein folding was, on the experimental side, in 1955.

    “The question for this work is whether we can leapfrog over the dark ages of protein folding that led to our energy-landscape theory. I think we can.”

    The Center for Theoretical Biological Physics, funded by the National Science Foundation, and the D.R. Bullard-Welch Chair at Rice supported the research. The researchers utilized the National Science Foundation-supported DAVinCI supercomputer and the BlueBioU supercomputer, both administered by Rice’s Ken Kennedy Institute for Information Technology.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    In his 1912 inaugural address, Rice University president Edgar Odell Lovett set forth an ambitious vision for a great research university in Houston, Texas; one dedicated to excellence across the range of human endeavor. With this bold beginning in mind, and with Rice’s centennial approaching, it is time to ask again what we aspire to in a dynamic and shrinking world in which education and the production of knowledge will play an even greater role. What shall our vision be for Rice as we prepare for its second century, and how ought we to advance over the next decade?

    This was the fundamental question posed in the Call to Conversation, a document released to the Rice community in summer 2005. The Call to Conversation asked us to reexamine many aspects of our enterprise, from our fundamental mission and aspirations to the manner in which we define and achieve excellence. It identified the pressures of a constantly changing and increasingly competitive landscape; it asked us to assess honestly Rice’s comparative strengths and weaknesses; and it called on us to define strategic priorities for the future, an effort that will be a focus of the next phase of this process.

     
  • richardmitnick 1:18 pm on April 28, 2015
    Tags: Applied Research & Technology

    From PPPL at Princeton: “An improvement to the global software standard for analyzing fusion plasmas (Nuclear Fusion)” 

    Princeton University

    PPPL

    April 28, 2015
    Raphael Rosen, Princeton Plasma Physics Laboratory

    The gold standard for analyzing the behavior of fusion plasmas may have just gotten better. Mario Podestà, a staff physicist at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL), has updated the worldwide computer program known as TRANSP to better simulate the interaction between energetic particles and instabilities – disturbances in plasma that can halt fusion reactions. The program’s updates, reported in the journal Nuclear Fusion, could lead to improved capability for predicting the effects of some types of instabilities in future facilities such as ITER, the international experiment under construction in France to demonstrate the feasibility of fusion power.

    ITER Tokamak

    Podestà and co-authors saw a need for better modeling techniques when they noticed that while TRANSP could accurately simulate an entire plasma discharge, the code wasn’t able to properly represent the interaction between energetic particles and instabilities. The reason was that TRANSP, which PPPL developed and has regularly updated, treated all fast-moving particles within the plasma the same way. Those instabilities, however, can affect different parts of the plasma in different ways through so-called “resonant processes.”

    The authors first figured out how to condense information from other codes that do model the interaction accurately – albeit over short time periods – so that TRANSP could incorporate that information into its simulations. Podestà then teamed up with TRANSP developer Marina Gorelenkova at PPPL to update a TRANSP module called NUBEAM to enable it to make sense of this condensed data. “Once validated, the updated module will provide a better and more accurate way to compute the transport of energetic particles,” said Podestà. “Having a more accurate description of the particle interactions with instabilities can improve the fidelity of the program’s simulations.”
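
    A rough way to picture what the condensed data buys: rather than re-simulating the wave-particle resonances at every step, the transport code can apply precomputed, phase-space-dependent "kick" probabilities to the fast ions it tracks. The Monte Carlo sketch below only illustrates that idea; the resonant energy band, probabilities, and kick sizes are invented, not the actual NUBEAM update.

        import numpy as np

        rng = np.random.default_rng(42)
        energy_keV = rng.uniform(20.0, 80.0, 10_000)   # fast-ion ensemble

        def kick_step(energy, dt, res_lo=40.0, res_hi=60.0,
                      kick_prob=0.05, kick_sigma_keV=2.0):
            """One time step: only ions inside the (assumed) resonant energy
            band receive random energy kicks; off-resonance ions are untouched."""
            resonant = (energy > res_lo) & (energy < res_hi)
            hit = resonant & (rng.random(energy.size) < kick_prob * dt)
            energy = energy.copy()
            energy[hit] += rng.normal(0.0, kick_sigma_keV, hit.sum())
            return energy

        for _ in range(100):                 # evolve the ensemble for 100 steps
            energy_keV = kick_step(energy_keV, dt=1.0)
        print(f"mean energy after kicks: {energy_keV.mean():.1f} keV")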

    Schematic of NSTX tokamak at PPPL with a cross-section showing perturbations of the plasma profiles caused by instabilities. Without instabilities, energetic particles would follow closed trajectories and stay confined inside the plasma (blue orbit). With instabilities, trajectories can be modified and some particles may eventually be pushed out of the plasma boundary and lost (red orbit). Credit: Mario Podestà

    Fast-moving particles, which result from neutral beam injection into tokamak plasmas, cause the instabilities that the updated code models. These particles begin their lives with a neutral charge but turn into negatively charged electrons and positively charged ions – or atomic nuclei – inside the plasma. This scheme is used to heat the plasma and to drive part of the electric current that completes the magnetic field confining the plasma.

    PPPL Tokamak

    The improved simulation tool may have applications for ITER, which will use fusion end-products called alpha particles to sustain high plasma temperatures. But just like the neutral beam particles in current-day tokamaks, alpha particles could cause instabilities that degrade the yield of fusion reactions. “In present research devices, only very few, if any, alpha particles are generated,” said Podestà. “So we have to study and understand the effects of energetic ions from neutral beam injectors as a proxy for what will happen in future fusion reactors.”

    PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.

  • richardmitnick 11:45 am on April 28, 2015
    Tags: Applied Research & Technology

    From IBM via WCG: “How can your computer help find a cure?” 

    World Community Grid

    IBM

    Smarter Planet

    Undated
    No Writer Credit

    By donating unused computing power, anyone can be part of the next big breakthrough

    “If we had as much computing power as we could possibly use, how could we make an impact on the biggest humanitarian issues of our time?”

    IBM’s World Community Grid is inviting researchers everywhere to ask themselves that question. The answers so far have been both inspiring and diverse, from discovering new drug candidates for Ebola, childhood cancer and HIV, to protecting watersheds and developing new materials to capture solar energy.

    In just over a decade, World Community Grid has provided scientific researchers with more than a million years of computer run time—all of it free, and all of it donated by volunteers from their personal devices when they’re not being used for other tasks. And that’s a big deal, because computer simulations are increasingly an essential tool for scientific research.

    Sophia Tu

    When it awarded the 2013 Nobel Prize in Chemistry, the Nobel committee noted, “Today the computer is just as important a tool for chemists as the test tube. Simulations are so realistic that they predict the outcome of traditional experiments.” Sophia Tu, manager for IBM Corporate Citizenship, says, “That ability for computers to predict the real world so accurately is what makes World Community Grid so valuable for humanitarian research.”

    Volunteers to the rescue

    World Community Grid relies on what’s called [public] distributed computing or “volunteer computing” to run large-scale simulations. That means big projects are broken down into small computing tasks that are sent out to volunteers’ devices: PCs, laptops, tablets, even Android smartphones and Kindle Fires. These machines do the work when they’re not otherwise being used by their owners and return the results to a central computer, in this case a number of servers hosted by IBM. There the data is verified, cleaned up and passed on to the researchers.
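
    In outline, the server guards against faulty or malicious machines by replication: each work unit goes to several independent volunteers, and a result is accepted only when a quorum of the returned answers agree. The toy validator below sketches that pattern (BOINC-style grids work roughly this way); it is not World Community Grid's actual server logic.

        from collections import Counter

        def validate(results, quorum=2):
            """results: answers returned by different volunteers for one work
            unit. Returns the accepted answer, or None if there is no quorum."""
            answer, n = Counter(results).most_common(1)[0]
            return answer if n >= quorum else None

        # Three volunteers return results for the same work unit; one is faulty.
        returned = ["binding_energy=-7.31", "binding_energy=-7.31",
                    "binding_energy=-9.99"]
        print(validate(returned))   # quorum of 2 -> the majority answer wins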

    “World Community Grid is inspiring researchers to think big and tackle research at a scale that never seemed possible before, with incredible results. Over the past 10 years this method has powered a huge range of discoveries: new drug candidates that show promise for the treatment of childhood cancer, HIV, malaria, dengue fever and other diseases; new organic compounds that can convert sunlight to electricity with unprecedented efficiency; more nutritious staple crops; and increased understanding of many environmental and biological phenomena, including nanoscale water flow, large-scale nutrient runoff in watersheds, ecosystem metagenomes and cancer biomarkers,” says Tu. Not bad for the spare processing power of people’s ordinary hardware.

    First steps: an experiment for experiments

    The concept of World Community Grid was born in 2003, when IBM Corporate Citizenship and IBM Research saw a chance to aid the search for a cure for smallpox. By using grid-based simulations, researchers were able to explore about 35 million potential molecules and their interaction with the smallpox virus at a rate of about 100,000 molecules an hour.

    “With computers, you are able to simulate certain experiments that would traditionally be done in a lab, but do it much faster and on a much bigger scale, without wasting your time or equipment,” Tu says. “With smallpox, for example, the scientists came up with three-dimensional models of the proteins that make up that virus, revealing the virus’ vulnerabilities, then used massive computing power to search for molecules to exploit those vulnerabilities. It’s like a key and lock puzzle: a molecule that binds very tightly to an essential virus protein can disable that protein and stop the disease.”

    At the end of the smallpox project the results were striking: the research team had identified 44 strong candidates out of those 35 million compounds for further testing and development.

    Taking on a world of problems

    The success of the smallpox experiment led IBM to formally launch World Community Grid in late 2004 – enabling anyone, anywhere in the world to help advance research on global health and sustainability issues.

    “We started with research on infectious diseases, including neglected diseases that don’t get a lot of attention or funding, but affect millions of people around the world,” Tu says. The scope of research soon expanded to include other global health issues as well as environmental studies. To date, World Community Grid has supported 24 research teams at institutions around the world, including Harvard University and Tsinghua University.

    And the benefits are shared as well: the raw data from every project is made publicly available to support other researchers working in the field.

    Volunteers at the heart

    All of this is supported by volunteers, nearly 700,000 of them all around the world, who have installed the World Community Grid app on computing devices that run Mac OS X, Windows, Linux or Android (there is no app yet for iOS devices). The app is designed so as not to affect the volunteer’s day-to-day use of their device in any way.

    Tu explains, “World Community Grid is designed to run only when you’re not using the full computing capacity of your device. It has the lowest priority for all of the device’s resources: processor, battery and data. As soon as you start using your device, World Community Grid’s scientific calculations are paused; they’ll pick back up when you’re done. Additionally, our Android app only runs when your device is plugged in and at least 90 percent charged, and it will only send or receive data when your device is connected to WiFi.”

    What’s more, the app has a high level of security. “World Community Grid is very, very safe,” says Tu. “We work closely with IBM Security, and the software has been tested exhaustively from all angles. Also, it won’t affect other data on a volunteer’s device. It doesn’t touch any data except what’s associated with the project.”

    In addition to individual volunteers, more than 460 organizations—including corporations, government agencies and educational institutions—have partnered with World Community Grid. Many of them, like IBM, install the World Community Grid app on all of their machines to give their staffs the opportunity to help power essential research.

    Immense potential

    With the number of computers and mobile devices in the world constantly rising, the potential for World Community Grid is enormous. In the Android world alone, there are nearly 2 billion devices, with another billion expected to be activated this year. “World Community Grid is currently tapping into only a tiny fraction of the total computing power in the world that goes unused every day,” says Tu. The biggest challenge is also the biggest opportunity: getting potential volunteers to understand that they could be supporting cutting-edge research at no cost or loss of convenience.

    Think bigger!

    In reaching potential volunteers, the main hurdle is awareness. Tu says that the opportunity to support research is only a few clicks away, and the team has been experimenting with different outreach strategies. “Currently, most of our recruiting is done by word of mouth, and we are very active on social media.”

    For researchers who want to benefit from World Community Grid, the questions are a bit more rigorous. What is the potential social impact of the research? What are its scientific merits? Is the team willing to make their computational results freely available to the public after the project so that others can benefit? Are there any technical obstacles to using grid-based computing for their simulations? Tu mentions a common psychological barrier as well: “When we get an application from a researcher, we almost always have to tell them to think bigger when it comes to the amount of computing power they’re requesting, sometimes by several orders of magnitude.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.

    WCG projects run on BOINC software from UC Berkeley.

    BOINC is a leader in the fields of distributed computing, grid computing and citizen cyberscience. BOINC stands for Berkeley Open Infrastructure for Network Computing.

    CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!

    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

    Please visit the project pages-
    Outsmart Ebola Together

    Mapping Cancer Markers

    Uncovering Genome Mysteries

    Say No to Schistosoma

    GO Fight Against Malaria

    Drug Search for Leishmaniasis

    Computing for Clean Water

    The Clean Energy Project

    Discovering Dengue Drugs – Together

    Help Cure Muscular Dystrophy

    Help Fight Childhood Cancer

    Help Conquer Cancer

    Human Proteome Folding

    FightAIDS@Home

    Computing for Sustainable Water

    World Community Grid is a social initiative of IBM Corporation.

     
  • richardmitnick 7:16 am on April 28, 2015
    Tags: Applied Research & Technology

    From U Washington: “Tidal tugs on Teflon faults drive slow-slipping earthquakes” 

    University of Washington

    April 27, 2015
    Hannah Hickey

    Slow earthquakes happen between the hazardous locked zone and the viscous portion that slips silently. They are found on subduction zones, like Cascadia’s, where a heavy ocean plate sinks below a lighter continental plate. Credit: UW

    Unknown to most people, the Pacific Northwest experiences a magnitude-6.6 earthquake about once a year. The reason nobody notices is that the movement happens slowly and deep underground, in a part of the fault whose behavior, known as slow-slip, was only recently discovered.

    A University of Washington seismologist who studies slow-slip quakes has looked at how they respond to tidal forces from celestial bodies and used the result to make a first direct calculation of friction deep on the fault. Though these events occur much deeper and on a different type of fault than the recent catastrophe in Nepal, the findings could improve general understanding of when and how faults break.

    The new study, published online April 27 in Nature Geoscience, shows that the gravitational pull of the sun and the moon affects the Cascadia fault a few days after it has started slipping. The timing of movement suggests that the coefficient of friction at this depth on the fault is only about 0.1, roughly that of two pieces of lubricated metal.

    Structure of the Cascadia subduction zone

    Area of the Cascadia subduction zone

    “I was able to tease out the effect of friction and found that it is not the friction of normal rocks in the lab — it’s much lower,” said author Heidi Houston, a UW professor of Earth and space sciences. “It’s closer to Teflon than to sandpaper.”

    The surprising results of the new study could help to better model the physics of these systems, and they could even tell us something about the frictional forces in the locked portion of the fault where hazardous earthquakes occur.

    The research looked at six recent slow-slip events along the Cascadia subduction zone, one of the best-studied places for these enigmatic slow quakes. The slow quakes are accompanied by tremors, weak seismic vibrations previously thought to be random noise. The tremors begin every 12 to 14 months below Puget Sound, Washington, and then travel north and south at about 5 miles (8 kilometers) per day for several weeks, affecting each section of the fault for about five days.

    The paper looks at how the gravitational pull of the sun and moon, which slightly deforms the Earth and oceans, affects forces along, across and inside the fault, and what that means for the slow-slip seismic activity more than 20 miles (35 kilometers) underground.

    Results show that on the first day of tremors, the tidal forces don’t matter much. But starting at about a day and a half in — when, Houston thinks, minerals deposited from the surrounding fluid that had held the fault together may have broken — the additional pull of the tides does have an effect.

    “Three days after the slipping has begun, the fault is very sensitive to the tides, and it almost slips only when the tides are encouraging it,” Houston said.

    “It implies that something is changing on the fault plane over those five days.”

    By averaging across many sections of the fault, and over all six events, she found that the amount of the tremor increases exponentially with increasing tidal force.
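
    That exponential dependence is straightforward to express: model the tremor rate as R = R0 * exp(s / s0), where s is the tidal stress and s0 sets the sensitivity, and fit it as a straight line in log space. The synthetic observations below are invented to illustrate the fitting step; they are not the study's data.

        import numpy as np

        tidal_stress_kpa = np.linspace(-2.0, 2.0, 9)   # binned tidal stress
        true_r0, true_s0 = 10.0, 0.8                   # "unknown" parameters
        rng = np.random.default_rng(3)
        rate = true_r0 * np.exp(tidal_stress_kpa / true_s0) \
               * rng.lognormal(0.0, 0.1, 9)            # noisy tremor rates

        # An exponential is a straight line in log space: ln R = ln R0 + s/s0.
        slope, intercept = np.polyfit(tidal_stress_kpa, np.log(rate), 1)
        print(f"fitted s0 = {1 / slope:.2f} kPa, R0 = {np.exp(intercept):.1f}")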

    Regular fast earthquakes are also very slightly affected by the tides, but they are overwhelmed by other forces and the effect is almost too small to detect.

    The study includes slow-slip events of about magnitude 6.6 generated on the Cascadia fault from 2007 to 2012. Along the bottom axis is the time, in days, up to six weeks. The vertical axis is the distance in kilometers along the fault. Calculations looked at slipping for locations between the two dashed green lines, which span roughly Olympia, Washington, to southern Vancouver Island, British Columbia. Credit: H. Houston/UW

    There is no need for worry, Houston says — even when celestial bodies line up to generate the biggest king tides, the effect would only rarely be enough to actually trigger a slow-slip quake, much less a regular earthquake. But it does tell us something about the physics of a crucial process that can’t be easily studied in the lab.

    “We want to understand the physics of what is happening, to understand and predict how the fault is storing stress, and how it’s going to relieve stress,” Houston said. “Friction is a big piece of that. And it turns out that this part of the fault is much slipperier than previously thought.”

    Slow-slip earthquakes relieve stress right where they slipped, but the movement actually places more stress on neighboring parts of the fault, including the so-called locked zone, where a rupture can cause the most damaging type of earthquakes.

    In Cascadia’s slow-slip events the fault will move about an inch (3 centimeters) over several days, with different parts of the fault moving at different times. When the shallower “locked zone” ruptures, by contrast, a large section of the fault can lurch over 60 feet (18 meters) in minutes. When this occurs, as it does about every 500 years in a magnitude-9 event on the Cascadia subduction zone, it generates strong, damaging seismic waves in the Earth’s crust.

    Still unknown is how slow-slip events are related to the more damaging quakes. A shallower slow-slip event was detected in the weeks before the deadly 2011 Tohoku earthquake and tsunami, on a fault like Cascadia’s where an ocean plate plunges below a continental plate.

    “Understanding slow slip and tremor could give us some way to monitor what is happening in the shallower, locked part of the fault,” Houston said. “Geophysicists started with the picture of just a flat plane of sandpaper, but that picture is evolving.”

    The study was funded by the National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 9:20 pm on April 27, 2015
    Tags: Applied Research & Technology

    From WIRED: “Turns Out Satellites Work Great for Mapping Earthquakes” 

    Wired

    Satellite radar image of the magnitude 6.0 South Napa earthquake. Credit: European Space Agency

    The Nepal earthquake on Saturday devastated the region and killed over 2,500 people, with casualties mounting across four countries. The first 24 hours of a disaster are the most important, and first responders scramble to get as much information about the energy and geological effects of earthquakes as they can. Seismometers can help illustrate the location and magnitude of earthquakes around the world, but for more precise detail, you need to look at three-dimensional models of the ground’s physical displacement.

    The easiest way to characterize that moving and shaking is with GPS and satellite data, together called geodetic data. That information is already used by earthquake researchers and geologists around the world to study the earth’s tectonic plate movements—long-term trends that establish themselves over years.

    The tectonic plates of the world were mapped in the second half of the 20th century.

    But now, researchers at the University of Iowa and the U.S. Geological Survey (USGS) have shown a faster way to use geodetic data to assess fault lines, turning over reports in as little as a day to help guide rapid responses to catastrophic quakes.

    A radar interferogram of the August 2014 South Napa earthquake. A single cycle of color represents about a half inch of surface displacement. Credit: Jet Propulsion Laboratory

    Normally, earthquake disaster aid and emergency response requires detailed information about surface movements: If responders know how much ground is displaced, they’ll know better what kind of infrastructure damage to expect, or what areas pose the greatest risk to citizens. Yet emergency response agencies don’t use geodetic data immediately, choosing instead to wait several days or even weeks before finally processing the data, says University of Iowa geologist William Barnhart. By then, the damage has been done and crews are already on the ground, with relief efforts well underway.

    The new results are evidence that first responders can get satellite data fast enough to inform how they should respond. Barnhart and his team used geodetic data to measure small deformations in the surface caused by a 6.0-magnitude quake that hit Napa Valley in August 2014 (the biggest the Bay Area had seen in 25 years). By analyzing those measurements, the geologists determined how much the ground moved with relation to the fault plane, which helps describe the exact location, orientation, and dimensions of the entire fault.

    A 3D slip map of the Napa quake generated from GPS surface displacements. Credit: Jet Propulsion Laboratory

    Then they created the Technicolor map above, showing just how much the ground shifted. In this so-called interferogram of the Napa earthquake epicenter, the cycles of color represent vertical ground displacement, where every full cycle indicates 6 centimeters (i.e., between successive green bands the ground moved vertically by 6 cm).
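
    Reading off displacement is then fringe arithmetic: the total is the number of full color cycles times the per-cycle value, which depends on the radar wavelength (the figure used below is the 6 cm quoted in this paragraph). A minimal sketch:

        CM_PER_CYCLE = 6.0   # per-cycle value quoted in the text for this map

        def displacement_cm(n_full_cycles):
            """Ground displacement implied by counting color fringes outward
            from an area assumed to be stable."""
            return n_full_cycles * CM_PER_CYCLE

        # e.g. a point separated from stable ground by four full color cycles:
        print(f"4 fringes -> {displacement_cm(4):.0f} cm of displacement")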

    According to Barnhart, this is the first demonstration of geodetic data being acquired and analyzed the same day of an earthquake. John Langbein, a geologist at the USGS, finds the results very encouraging and hopes to see geodetic data used regularly as a tool to make earthquake responses faster and more efficient.

    Barnhart is quick to point out that this method is most useful for moderate earthquakes (magnitudes between 5.5 and 7.0). Although the Nepal earthquake had a magnitude of 7.8, more than 35 aftershocks have continued to rock the region, including one as large as magnitude 6.7 on Sunday. The earthquake itself flattened broad swaths of the capital city of Kathmandu and triggered avalanches across the Himalayas (including on Mount Everest), killing and stranding many climbers. The aftershocks are stymieing relief efforts, paralyzing residents with fear, and setting off new avalanches in nearby mountains.

    It’s also worth remembering that the 2010 earthquake that devastated Haiti, killing about 316,000 people, had a magnitude of 7.0. Most areas of the world, especially in developing nations, aren’t equipped to withstand even small tremors. Those places are also likely to have fewer seismometers, making satellite information all the more helpful.
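
    For a sense of scale: magnitude is logarithmic, and radiated energy grows by a factor of about 32 per whole magnitude unit. A quick sketch comparing the quakes mentioned in this article, using the standard energy-scaling rule (the rule is textbook seismology, not something stated in the article):

    ```python
    # Radiated energy scales roughly as 10**(1.5 * magnitude).
    def energy_ratio(m1: float, m2: float) -> float:
        """How many times more energy a magnitude-m1 quake releases than m2."""
        return 10 ** (1.5 * (m1 - m2))

    print(energy_ratio(7.8, 7.0))  # Nepal vs. Haiti: ~16x
    print(energy_ratio(7.8, 6.7))  # mainshock vs. largest aftershock: ~45x
    print(energy_ratio(7.8, 6.0))  # Nepal vs. South Napa: ~500x
    ```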

    As the situation in Nepal unfolds, the aftermath may speed up plans to make geodetic data available within hours of an earthquake. Satellite systems could be integral in allowing first responders to move swiftly in the face of unpredictable, unpreventable events.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:09 am on April 27, 2015 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From NYT: “Ancient Collision Made Nepal Earthquake Inevitable” 

    The New York Times

    APRIL 25, 2015
    KENNETH CHANG

    Aftershocks Continue Across a Devastated Region Source: U.S.G.S.

    Photograph by Grant Dixon/Hedgehog House, via Getty Images

    More than 25 million years ago, India, once a separate island on a quickly sliding piece of the Earth’s crust, crashed into Asia. The two land masses are still colliding, pushed together at a speed of 1.5 to 2 inches a year. The forces have pushed up the highest mountains in the world, in the Himalayas, and have set off devastating earthquakes.

    Experts had warned of the danger to the people of Katmandu for decades. The death toll in Nepal on Saturday was practically inevitable given the tectonics, the local geology that amplified the shaking, and the lax construction of buildings that could not withstand it.

    GeoHazards International, a nonprofit organization in Menlo Park, Calif., that tries to help poorer, more vulnerable regions like Nepal prepare for disasters, had noted that major earthquakes struck that region about every 75 years.

    In 1934 — 81 years ago — more than 10,000 people died in a magnitude 8.1 earthquake in eastern Nepal, about six miles south of Mount Everest. A smaller quake in 1988 with a magnitude of 6.8 killed more than 1,000 people.

    Brian Tucker, president and founder of GeoHazards, said that in the 1990s his organization predicted that if the 1934 quake were to recur, 40,000 people would die, because migration to the city had filled it with tall, flimsily built buildings prone to collapse.

    In an update just this month, GeoHazards wrote, “With an annual population growth rate of 6.5 percent and one of the highest urban densities in the world, the 1.5 million people living in the Katmandu Valley were clearly facing a serious and growing earthquake risk.”

    The organization helped set up a local nonprofit to continue preparations, including the reinforcement of schools and hospitals.

    Saturday’s earthquake occurred to the northwest of Katmandu at a relatively shallow depth, about nine miles, which caused greater shaking at the surface, but at magnitude 7.8, it released less energy than the 1934 quake.

    Roger Bilham, a professor of geological sciences at the University of Colorado who has studied the history of earthquakes in that region, said that the shaking lasted one to two minutes, and the fault slipped about 10 feet along the rupture zone, which stretched 75 miles, passing under Katmandu.

    The earthquake “translated the whole city southward by 10 feet,” Dr. Bilham said.
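
    Those figures permit a rough consistency check. Seismic moment is rigidity times rupture area times slip, and moment magnitude follows from the standard Hanks-Kanamori formula. In the sketch below, the slip and rupture length are Dr. Bilham’s numbers; the down-dip rupture width and crustal rigidity are assumed typical values, not figures from the article.

    ```python
    import math

    slip = 10 * 0.3048     # ~10 ft of slip -> meters (from the article)
    length = 75 * 1609.34  # ~75 mi rupture length -> meters (from the article)
    width = 60e3           # assumed down-dip rupture width, meters
    mu = 3.0e10            # assumed crustal rigidity, Pa

    m0 = mu * length * width * slip        # seismic moment, N*m
    mw = (2 / 3) * math.log10(m0) - 6.07   # Hanks-Kanamori moment magnitude

    print(f"Mw ~ {mw:.1f}")  # ~7.8, matching the reported magnitude
    ```

    As a further sanity check, 81 years of convergence at 1.5 to 2 inches per year works out to roughly 10 to 13.5 feet, on the order of the 10 feet of slip released on Saturday and consistent with the roughly 75-year recurrence interval GeoHazards cited.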

    Nepal’s Landmarks, Before and After the Earthquake

    Trailokya Mohan Narayan Temple, Katmandu
    Volunteers helped to remove the debris of a three-story temple.

    Alok Tuladhar via Google Views
    Niranjan Shrestha/Associated Press

    Vatsala Shikhara Temple, Bhaktapur
    After the earthquake, people occupied the square in front of a
    collapsed temple in Bhaktapur, eight miles east of Katmandu.

    Anna Nadgrodkiewicz/sandstoneandamber.com
    Omar Havana/Getty Images

    Dharahara Tower, Katmandu
    A nine-story structure built in 1832 on orders from the queen. It was made of bricks
    more than a foot thick, and had recently been reopened to the public. Sightseers could
    climb a narrow spiral staircase to a viewing platform about 200 feet above the city.

    Bal Krishna Thapa Chhetri
    Narendra Shrestha/European Pressphoto Agency

    Maju Deval, Katmandu
    This temple, built in 1690, is a Unesco World Heritage Site.

    Anna Nadgrodkiewicz/sandstoneandamber.com
    Narendra Shrestha/European Pressphoto Agency

    Aftershocks as large as magnitude 6.6 have occurred mostly to the northeast of Katmandu.

    It is possible that the Saturday quake is a preface to an even larger one, but Dr. Bilham said that was unlikely.

    Katmandu and the surrounding valley sit on an ancient dried-up lake bed, which contributed to the devastation. “Very, very soft soil, and the soft soil amplifies seismic motion,” Dr. Tucker said.

    Steep slopes in the area are also prone to avalanches like the one that the quake triggered on Mount Everest on Saturday.

    Katmandu is not the only place where a deadly earthquake has been expected.

    Dr. Tucker said Tehran; Haiti; Lima, Peru; and Padang, Indonesia, were similarly vulnerable. In those places, nearby tectonic faults are under strain, and building standards and disaster preparations are seen as inadequate.

    But not everywhere has been complacent. Over the past 76 years, many earthquakes have occurred along a fault in northern Turkey, starting in the eastern part of the country and progressing west, toward Istanbul. An earthquake in 1999 killed more than 17,000 people, mostly in the city of Izmit, east of Istanbul. The expectation is that the epicenter of the next big earthquake will be in or around Istanbul.

    “Istanbul is the place that has been most aggressive in enforcing building codes,” Dr. Tucker said. “I think Istanbul has been doing a good job.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     