Updates from richardmitnick

  • richardmitnick 9:20 pm on April 27, 2015

    From WIRED: “Turns Out Satellites Work Great for Mapping Earthquakes” 

    Satellite radar image of the magnitude-6.0 South Napa earthquake. Credit: European Space Agency

    The Nepal earthquake on Saturday devastated the region and killed over 2,500 people, with casualties mounting across four countries. The first 24 hours of a disaster are the most important, and first responders scramble to get as much information about the energy and geological effects of an earthquake as they can. Seismometers can help establish the location and magnitude of earthquakes around the world, but for more precise detail, you need three-dimensional models of the ground’s physical displacement.

    The easiest way to characterize that moving and shaking is with GPS and satellite data, together called geodetic data. That information is already used by earthquake researchers and geologists around the world to study the earth’s tectonic plate movements—long-term trends that establish themselves over years.

    The tectonic plates of the world were mapped in the second half of the 20th century.

    But now, researchers at the University of Iowa and the U.S. Geological Survey (USGS) have shown a faster way to use geodetic data to assess fault lines, turning over reports in as little as a day to help guide rapid responses to catastrophic quakes.

    A radar interferogram of the August 2014 South Napa earthquake. A single cycle of color represents about a half inch of surface displacement. Credit: Jet Propulsion Laboratory

    Normally, earthquake disaster aid and emergency response requires detailed information about surface movements: If responders know how much ground is displaced, they’ll know better what kind of infrastructure damage to expect, or what areas pose the greatest risk to citizens. Yet emergency response agencies don’t use geodetic data immediately, choosing instead to wait several days or even weeks before finally processing the data, says University of Iowa geologist William Barnhart. By then, the damage has been done and crews are already on the ground, with relief efforts well underway.

    The new results are evidence that first responders can get satellite data fast enough to inform how they should respond. Barnhart and his team used geodetic data to measure small deformations in the surface caused by the magnitude-6.0 quake that hit Napa Valley in August 2014 (the biggest the Bay Area had seen in 25 years). By analyzing those measurements, the geologists determined how much the ground moved in relation to the fault plane, which helps describe the exact location, orientation, and dimensions of the entire fault.

    A 3D slip map of the Napa quake generated from GPS surface displacements. Credit: Jet Propulsion Laboratory

    Then they created the Technicolor map above, showing just how much the ground shifted. In this so-called interferogram of the Napa earthquake epicenter, the cycles of color represent vertical ground displacement, with every full cycle indicating 6 centimeters (e.g., between one green band and the next lies 6 cm of vertical displacement).
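
    As a back-of-envelope illustration of how fringe counting works (a sketch based only on the 6-centimeters-per-cycle figure quoted above, not code from the researchers):

```python
# Convert interferogram fringe counts into ground displacement.
# Assumes ~6 cm of displacement per full color cycle, per the text above.

CM_PER_FRINGE = 6.0  # one full color cycle ~ 6 cm

def displacement_cm(fringe_count: float) -> float:
    """Displacement implied by a number of full color cycles."""
    return fringe_count * CM_PER_FRINGE

# A point crossed by 4.5 fringes moved about 27 cm.
print(displacement_cm(4.5))  # 27.0
```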

    According to Barnhart, this is the first demonstration of geodetic data being acquired and analyzed on the same day as an earthquake. John Langbein, a geologist at the USGS, finds the results very encouraging, and hopes to see geodetic data used regularly as a tool to make earthquake responses faster and more efficient.

    Barnhart is quick to point out that this method is most useful for moderate earthquakes (between magnitudes 5.5 and 7.0). Although the Nepal earthquake had a magnitude of 7.8, over 35 aftershocks have continued to rock the region, including one as large as 6.7 on Sunday. The earthquake itself flattened broad swaths of the capital city of Kathmandu, and caused avalanches across the Himalayas (including on Mount Everest), killing and stranding many climbers. The aftershocks are stymieing relief efforts, paralyzing residents with fear, and triggering new avalanches in nearby mountains.

    It’s also worth remembering that the 2010 earthquake that devastated Haiti—and killed about 316,000 people—had a magnitude of 7.0. Most areas of the world, especially developing nations, aren’t equipped to withstand even small tremors in the earth. It’s those places that are also likely to have fewer seismometers, making the satellite information even more helpful.

    As the situation in Nepal unfolds, the aftermath may speed up plans to make geodetic data available just hours after an earthquake occurs. Satellite systems could be integral in allowing first responders to move swiftly in the face of unpredictable, unpreventable events.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 2:24 pm on April 27, 2015

    From NRAO: “Strange Supernova is “Missing Link” in Gamma-Ray Burst Connection” 

    National Radio Astronomy Observatory

    27 April 2015
    Dave Finley, Public Information Officer
    (575) 835-7302
    dfinley@nrao.edu

    In an ordinary core-collapse supernova with no “central engine,” ejected material expands outward nearly spherically, left. At right, a strong central engine propels jets of material at nearly the speed of light and generates a gamma-ray burst (GRB). The center panel shows an intermediate supernova like SN 2012ap, with a weak central engine, weak jets, and no GRB.
    CREDIT: Bill Saxton, NRAO/AUI/NSF

    Astronomers using the National Science Foundation’s Very Large Array (VLA) have found a long-sought “missing link” between supernova explosions that generate gamma-ray bursts (GRBs) and those that don’t.


    The scientists found that a stellar explosion seen in 2012 has many characteristics expected of one that generates a powerful burst of gamma rays, yet no such burst occurred.

    “This is a striking result that provides a key insight about the mechanism underlying these explosions,” said Sayan Chakraborti, of the Harvard-Smithsonian Center for Astrophysics (CfA). “This object fills in a gap between GRBs and other supernovae of this type, showing us that a wide range of activity is possible in such blasts,” he added.

    The object, called Supernova 2012ap (SN 2012ap), is what astronomers term a core-collapse supernova. This type of blast occurs when the nuclear fusion reactions at the core of a very massive star no longer can provide the energy needed to hold up the core against the weight of the outer parts of the star. The core then collapses catastrophically into a superdense neutron star or a black hole. The rest of the star’s material is blasted into space in a supernova explosion.

    The most common type of such a supernova blasts the star’s material outward in a nearly-spherical bubble that expands rapidly, but at speeds far less than that of light. These explosions produce no burst of gamma rays.

    In a small percentage of cases, the infalling material is drawn into a short-lived swirling disk surrounding the new neutron star or black hole. This accretion disk generates jets of material that move outward from the disk’s poles at speeds approaching that of light. This combination of a swirling disk and its jets is called an “engine,” and this type of explosion produces gamma-ray bursts.

    The new research shows, however, that not all “engine-driven” supernova explosions produce gamma-ray bursts.

    “This supernova had jets moving at nearly the speed of light, and those jets were quickly slowed down, just like the jets we see in gamma-ray bursts,” said Alicia Soderberg, also of CfA.

    An earlier supernova seen in 2009 also had fast jets, but its jets expanded freely, without experiencing the slowdown characteristic of those that generate gamma-ray bursts. The free expansion of the 2009 object, the scientists said, is more like what is seen in supernova explosions with no engine, and probably indicates that its jet contained a large percentage of heavy particles, as opposed to the lighter particles in gamma-ray-burst jets. The heavy particles more easily make their way through the material surrounding the star.

    “What we see is that there is a wide diversity in the engines in this type of supernova explosion,” Chakraborti said. “Those with strong engines and lighter particles produce gamma-ray bursts, and those with weaker engines and heavier particles don’t,” he added.

    “This object shows that the nature of the engine plays a central role in determining the characteristics of this type of supernova explosion,” Soderberg said.

    Chakraborti and Soderberg worked with an international team of scientists from five continents. In addition to the VLA, they also used data from the Giant Metrewave Radio Telescope (GMRT) in India and the InterPlanetary Network (IPN) of spacecraft equipped with GRB detectors. The team, led by Chakraborti, is reporting its work in a paper accepted for publication in The Astrophysical Journal. Companion papers, led by co-authors Raffaella Margutti and Dan Milisavljevic, report on X-ray and optical follow-up observations of SN 2012ap using a suite of space- and ground-based facilities.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The NRAO operates a complementary, state-of-the-art suite of radio telescope facilities for use by the scientific community, regardless of institutional or national affiliation: the Very Large Array (VLA), the Robert C. Byrd Green Bank Telescope (GBT), and the Very Long Baseline Array (VLBA)*.


    The NRAO is building two new major research facilities in partnership with the international community that will soon open new scientific frontiers: the Atacama Large Millimeter/submillimeter Array (ALMA), and the Expanded Very Large Array (EVLA). Access to ALMA observing time by the North American astronomical community will be through the North American ALMA Science Center (NAASC).
    *The Very Long Baseline Array (VLBA) comprises ten radio telescopes spanning 5,351 miles. It’s the world’s largest, sharpest, dedicated telescope array. With an eye this sharp, you could be in Los Angeles and clearly read a street sign in New York City!

    Astronomers use the continent-sized VLBA to zoom in on objects that shine brightly in radio waves, long-wavelength light that’s well below infrared on the spectrum. They observe blazars, quasars, black holes, and stars in every stage of the stellar life cycle. They plot pulsars, exoplanets, and masers, and track asteroids and planets.

     
  • richardmitnick 1:55 pm on April 27, 2015

    From phys.org: “Astrophysicists draw most comprehensive map of the universe” 


    April 27, 2015
    No Writer Credit

    A slice through the 3D map of the nearby universe. Our Milky Way galaxy is in the centre, marked by a cross. The map spans nearly two billion light years from side to side. Regions with many galaxies are shown in white or red, whereas regions with fewer galaxies are dark blue.

    Astrophysicists have created a 3D map of the universe that spans nearly two billion light years and is the most complete picture of our cosmic neighbourhood to date.

    The spherical map of galaxy superclusters will lead to a greater understanding of how matter is distributed in the universe and provide key insights into dark matter, one of physics’ greatest mysteries.

    Professor Mike Hudson, Jonathan Carrick and Stephen Turnbull, of the Department of Physics and Astronomy at the University of Waterloo, and Guilhem Lavaux, of the Institut d’Astrophysique de Paris of France’s Centre national de la recherche scientifique, created the map. Professor Hudson is also an affiliate member of the Perimeter Institute for Theoretical Physics.

    “The galaxy distribution isn’t uniform and has no pattern. It has peaks and valleys much like a mountain range. This is what we expect if the large-scale structure originates from quantum fluctuations in the early universe,” said Hudson, who is also associate dean of science (computing).

    The map appears online in the peer-reviewed journal Monthly Notices of the Royal Astronomical Society, one of the world’s leading primary research journals in astronomy and astrophysics.

    The lighter blue and white areas on the map represent greater concentrations of galaxies. The red area is the supercluster called the Shapley Concentration, the largest collection of galaxies in the nearby universe. Unexplored areas appear in medium blue.

    A map of the superclusters and voids nearest to Earth

    Shapley Supercluster

    Knowing the location and motion of matter in the universe will help astrophysicists predict the universe’s expansion and identify where and how much dark matter exists.

    Scientists have observed that galaxies move differently because the universe’s expansion is not even. These differences are called peculiar velocities. Our own Milky Way galaxy and its neighbour Andromeda are moving with a speed of 2 million kilometres per hour.
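
    To make the idea concrete, here is a minimal sketch of how a peculiar velocity is separated from the overall expansion, using the standard textbook definition and an assumed round value for the Hubble constant (illustrative only, not the team’s code):

```python
# Peculiar velocity = observed recession velocity minus the Hubble-flow
# velocity expected at that distance. H0 here is an assumed round value.

H0 = 70.0  # Hubble constant, km/s per Mpc (illustrative)

def peculiar_velocity(v_observed_km_s: float, distance_mpc: float) -> float:
    return v_observed_km_s - H0 * distance_mpc

# A galaxy 100 Mpc away receding at 7,300 km/s deviates from the smooth
# expansion by about +300 km/s.
print(peculiar_velocity(7300.0, 100.0))  # 300.0

# For scale, the 2 million km/h quoted above is roughly 556 km/s.
print(2_000_000 / 3600)  # ≈ 555.6
```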

    The Andromeda Galaxy. Credit: Adam Evans

    Previous models haven’t fully accounted for this observed motion. Hudson and his team are interested in discovering what structures are responsible for the peculiar velocities.

    These deviations in the motion of galaxies are a valuable tool to determine the distribution of matter and dark matter on the largest scales.

    Dark matter accounts for a large majority of the mass content in the universe. It is a hypothesized form of matter particle that does not reflect or emit light and as a result it can’t be seen or measured directly. The existence and properties of dark matter can only be inferred indirectly through its gravitational effects on visible matter and light.

    “A better understanding of dark matter is central to understanding the formation of galaxies and the structures they live in, such as galaxy clusters, superclusters and voids,” said Hudson.

    The next step will involve getting more detailed samples of peculiar velocities to enhance the map, in collaboration with researchers in Australia.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 9:09 am on April 27, 2015

    From NYT: “Ancient Collision Made Nepal Earthquake Inevitable” 

    The New York Times

    APRIL 25, 2015
    KENNETH CHANG

    Aftershocks Continue Across a Devastated Region. Source: U.S.G.S.

    Photograph by Grant Dixon/Hedgehog House, via Getty Images

    More than 25 million years ago, India, once a separate island on a quickly sliding piece of the Earth’s crust, crashed into Asia. The two land masses are still colliding, pushed together at a speed of 1.5 to 2 inches a year. The forces have pushed up the highest mountains in the world, in the Himalayas, and have set off devastating earthquakes.
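
    The arithmetic behind that collision is easy to check. A quick sketch, using only the figures in the paragraph above, shows how much convergence 1.5 to 2 inches a year adds up to over 25 million years:

```python
# Total convergence implied by 1.5-2 inches/year over 25 million years,
# using only the figures quoted in the paragraph above.

INCHES_PER_MILE = 63360

for rate_inches_per_year in (1.5, 2.0):
    miles = rate_inches_per_year * 25e6 / INCHES_PER_MILE
    print(f"{rate_inches_per_year} in/yr over 25 Myr ≈ {miles:.0f} miles")
# ≈ 592-789 miles of crust consumed — mountain-building on a continental scale.
```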

    Experts had warned of the danger to the people of Katmandu for decades. The death toll in Nepal on Saturday was practically inevitable given the tectonics, the local geology that made the shaking worse and the lax construction of buildings that could not withstand the shaking.

    GeoHazards International, a nonprofit organization in Menlo Park, Calif., that tries to help poorer, more vulnerable regions like Nepal prepare for disasters, had noted that major earthquakes struck that region about every 75 years.

    In 1934 — 81 years ago — more than 10,000 people died in a magnitude 8.1 earthquake in eastern Nepal, about six miles south of Mount Everest. A smaller quake in 1988 with a magnitude of 6.8 killed more than 1,000 people.

    Brian Tucker, president and founder of GeoHazards, said that in the 1990s, his organization predicted that if the 1934 quake were to happen again, 40,000 people would die because of migration to the city where tall, flimsily built buildings would collapse.

    In an update just this month, GeoHazards wrote, “With an annual population growth rate of 6.5 percent and one of the highest urban densities in the world, the 1.5 million people living in the Katmandu Valley were clearly facing a serious and growing earthquake risk.”

    The organization helped set up a local nonprofit to continue preparations, including the reinforcement of schools and hospitals.

    Saturday’s earthquake occurred to the northwest of Katmandu at a relatively shallow depth, about nine miles, which caused greater shaking at the surface, but at magnitude 7.8, it released less energy than the 1934 quake.
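
    Why does a 0.3-magnitude difference matter? The magnitude scale is logarithmic, and radiated energy grows roughly as 10^(1.5·M) — a standard seismological relation, not one stated in the article. A quick sketch:

```python
# Radiated seismic energy scales roughly as 10^(1.5 * magnitude)
# (standard Gutenberg-Richter energy relation; not from the article).

def energy_ratio(m1: float, m2: float) -> float:
    """How many times more energy a magnitude-m1 quake radiates than m2."""
    return 10 ** (1.5 * (m1 - m2))

print(round(energy_ratio(8.1, 7.8), 1))  # ~2.8x: the 1934 quake vs Saturday's
print(round(energy_ratio(7.8, 6.6), 1))  # ~63x: mainshock vs largest aftershock
```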

    Roger Bilham, a professor of geological sciences at the University of Colorado who has studied the history of earthquakes in that region, said that the shaking lasted one to two minutes, and the fault slipped about 10 feet along the rupture zone, which stretched 75 miles, passing under Katmandu.

    The earthquake “translated the whole city southward by 10 feet,” Dr. Bilham said.

    Nepal’s Landmarks, Before and After the Earthquake

    Trailokya Mohan Narayan Temple, Katmandu
    Volunteers helped to remove the debris of a three-story temple.

    Photos: Alok Tuladhar, via Google Views (before); Niranjan Shrestha/Associated Press (after)

    Vatsala Shikhara Temple, Bhaktapur
    After the earthquake, people occupied the square in front of a collapsed temple in Bhaktapur, eight miles east of Katmandu.

    Photos: Anna Nadgrodkiewicz/sandstoneandamber.com (before); Omar Havana/Getty Images (after)

    Dharahara Tower, Katmandu
    A nine-story structure built in 1832 on orders from the queen. It was made of bricks more than a foot thick, and had recently been reopened to the public. Sightseers could climb a narrow spiral staircase to a viewing platform about 200 feet above the city.

    Photos: Bal Krishna Thapa Chhetri (before); Narendra Shrestha/European Pressphoto Agency (after)

    Maju Deval, Katmandu
    This temple, built in 1690, is a Unesco World Heritage Site.

    Photos: Anna Nadgrodkiewicz/sandstoneandamber.com (before); Narendra Shrestha/European Pressphoto Agency (after)

    Aftershocks as large as magnitude 6.6 have occurred mostly to the northeast of Katmandu.

    It is possible that Saturday’s quake is a preface to an even larger one, but Dr. Bilham said that was unlikely.

    Katmandu and the surrounding valley sit on an ancient dried-up lake bed, which contributed to the devastation. “Very, very soft soil, and the soft soil amplifies seismic motion,” Dr. Tucker said.

    Steep slopes in the area are also prone to avalanches like the one that the quake triggered on Mount Everest on Saturday.

    Katmandu is not the only place where a deadly earthquake has been expected.

    Dr. Tucker said Tehran; Haiti; Lima, Peru; and Padang, Indonesia, were similarly vulnerable. In those places, nearby tectonic faults are under strain, and building standards and disaster preparations are seen as inadequate.

    But not everywhere has been complacent. Over the past 76 years, many earthquakes have occurred along a fault in northern Turkey, starting in the eastern part of the country and progressing west, toward Istanbul. An earthquake in 1999 killed more than 17,000 people, mostly in the city of Izmit, east of Istanbul. The expectation is that the epicenter of the next big earthquake will be in or around Istanbul.

    “Istanbul is the place that has been most aggressive in enforcing building codes,” Dr. Tucker said. “I think Istanbul has been doing a good job.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 8:28 am on April 27, 2015

    From NOVA: “Fracking’s Hidden Hazards” 


    22 Apr 2015
    Terri Cook

    Late on a Saturday evening in November 2011, Sandra Ladra was reclining in a chair in her living room in Prague, Oklahoma, watching television with her family. Suddenly, the house started to shake, and rocks began to fall off her stone-faced fireplace, landing on her legs and causing significant injuries that required immediate medical treatment.

    The first tremor that shook Ladra’s home was a magnitude-5.0 earthquake, an unusual event in what used to be a relatively calm state, seismically speaking. Two more struck the area over the next two days. More noteworthy, though, are her claims that the events were manmade. In a petition filed in the Lincoln County District Court, she alleges that the earthquake was the direct result of the actions of two energy companies, New Dominion and Spress Oil Company, that had injected wastewater fluids deep underground in the area.

    House damage in central Oklahoma from a magnitude-5.7 earthquake on November 6, 2011.

    Ladra’s claim is not as preposterous as it may seem. Scientists have recognized since the 1960s that humans can cause earthquakes by injecting fluids at high pressure into the ground. This was first established near Denver, Colorado, at the federal chemical weapons manufacturing facility known as the Rocky Mountain Arsenal. Faced with the thorny issue of how to get rid of the arsenal’s chemical waste, the U.S. Army drilled a 12,044-foot-deep disposal well and began routinely injecting wastewater into it in March 1962.

    Less than seven weeks later, earthquakes were reported in the area, a region that had last felt an earthquake in 1882. Although the Army initially denied any link, when geologist David Evans demonstrated a strong correlation between the Arsenal’s average injection rate and the frequency of earthquakes, the Army agreed to halt its injections.

    Since then direct measurements, hydrologic modeling, and other studies have shown that earthquakes like those at the Rocky Mountain Arsenal occur when injection increases the fluid pressure in the pores and fractures of rocks or soil. By reducing the frictional force that resists fault slip, the increased pore pressure can lubricate preexisting faults. This increase alters the ambient stress level, potentially triggering earthquakes on favorably oriented faults.
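
    The mechanics described here are usually written as the Coulomb failure criterion: a fault slips when shear stress exceeds the friction coefficient times the effective normal stress (normal stress minus pore pressure). A minimal sketch with illustrative numbers, not values from the studies cited:

```python
# Coulomb failure: slip occurs when shear stress > friction * (normal
# stress - pore pressure). Raising pore pressure "unclamps" the fault.
# All values below are illustrative.

def fault_slips(shear_mpa: float, normal_mpa: float,
                pore_pressure_mpa: float, friction: float = 0.6) -> bool:
    effective_normal_mpa = normal_mpa - pore_pressure_mpa
    return shear_mpa > friction * effective_normal_mpa

# Same fault and tectonic load; only the injected fluid pressure differs.
print(fault_slips(30.0, 60.0, pore_pressure_mpa=5.0))   # False: stays locked
print(fault_slips(30.0, 60.0, pore_pressure_mpa=15.0))  # True: triggered slip
```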

    Although injection-induced earthquakes have become commonplace across broad swaths of the central and eastern U.S. over the last few years, building codes—and the national seismic hazard maps used to update them—don’t currently take this increased hazard into account. Meanwhile, nagging questions—such as how to definitively diagnose an induced earthquake, whether manmade quakes will continue to increase in size, and how to judge whether mitigation measures are effective—have regulators, industry, and the public on shaky ground.

    Surge in Seismicity

    The quake that shook Ladra’s home is one example of the dramatic increase in seismicity that began across the central and eastern U.S. in 2001. Once considered geologically stable, the midcontinent has grown increasingly feisty, recording an 11-fold increase in the number of quakes between 2008 and 2011 compared with the previous 31 years, according to a study published in Geology in 2013.

    The increase has been especially dramatic in Oklahoma, which in 2014 recorded 585 earthquakes of magnitude 3.0 or greater—more than in the previous 35 years combined. “The increase in seismicity is huge relative to the past,” says Randy Keller, who retired in December after serving for seven years as the director of the Oklahoma Geological Survey (OGS).

    Yesterday, Oklahoma finally acknowledged that the uptick in earthquakes is likely due to wastewater disposal. “The Oklahoma Geological Survey has determined that the majority of recent earthquakes in central and north-central Oklahoma are very likely triggered by the injection of produced water in disposal wells,” the state reported on a new website. While the admission is an about-face for the government, which had previously questioned any link between the two, it doesn’t coincide with any new regulations intended to stop the earthquakes or improve building codes to cope with the tremors. For now, residents of Oklahoma may be just as vulnerable as they have been.

    This surge in seismicity has been accompanied by a spike in the number of injection wells and the corresponding amount of wastewater disposed via those wells. According to the Railroad Commission of Texas, underground wastewater injection in Texas increased from 46 million barrels in 2005 to nearly 3.5 billion barrels in 2011. Much of that fluid has been injected in the Dallas area, where prior to 2008, only one possible earthquake large enough to be noticed by people had occurred in recorded history. Since 2008, the U.S. Geological Survey (USGS) has documented over 120 quakes in the area.

    The increase in injection wells is due in large part to the rapid expansion of the shale-gas industry, which has unlocked vast new supplies of natural gas and oil that would otherwise be trapped in impermeable shale formations. The oil and gas is released by a process known as fracking, which injects a mix of water, chemicals, and sand at high enough pressure to fracture the surrounding rock, forming cracks through which the hydrocarbons, mixed with large volumes of fluid, can flow. The resulting mixture is pumped to the surface, where the hydrocarbons are separated out, leaving behind billions of gallons of wastewater, much of which is injected back underground.

    Many scientists, including Keller, believe there is a correlation between the two increases. “It’s hard to look at where the earthquakes are, and where the injection wells are, and not conclude there’s got to be some connection,” he says. Rex Buchanan, interim director of the Kansas Geological Survey (KGS), agrees there’s a correlation for most of the recent tremors in his state. “Certainly we’re seeing a huge spike in earthquakes in an area where we’ve also got big disposal wells,” he says. But there have been other earthquakes whose cause “we’re just not sure about,” Buchanan says.

    Diagnosing an Earthquake

    Buchanan’s uncertainty stems in part from the fact that determining whether a specific earthquake was natural or induced by human activity is highly controversial. Yet this is the fundamental scientific question at the core of Ladra’s lawsuit and dozens of similar cases that have been filed across the heartland over the last few years. Beyond assessing legal liability, this determination is also important for assessing potential seismic hazard as well as for developing effective methods of mitigation.

    One reason it’s difficult to assess whether a given earthquake was human-induced is that both types of earthquakes look similar on seismograms; they can’t be distinguished by casual observation. A second is that manmade earthquakes are unusual events; only about 0.1 percent of injection wells in the U.S. have been linked to induced earthquakes large enough to be felt, according to Arthur McGarr, a geologist at the USGS Earthquake Science Center. Finally, scientists have comparatively few unambiguous examples of induced earthquakes. That makes it difficult to create a yardstick against which potential “suspects” can be compared. Like a team of doctors attempting to diagnose a rare disease, scientists must examine all the “symptoms” of an earthquake to make the best possible pronouncement.

    To accomplish this, two University of Texas seismologists developed a checklist of seven “yes” and “no” questions that focus on four key characteristics: the area’s background seismicity, the proximity of an earthquake to an active injection well, the timing of the seismicity relative to the onset of injection, and the injection practices. Ultimately, “if an injection activity and an earthquake sequence correlate in space and time, with no known previous earthquake activity in the area, the earthquakes were likely induced,” wrote McGarr and co-authors in Science earlier this year.
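
    A rough sketch of how such a screening checklist can be applied in code (the questions below paraphrase the four characteristics named above; they are not the actual seven-question University of Texas checklist):

```python
# Illustrative screening logic for induced seismicity, loosely modeled on
# the four characteristics above. Not the actual UT Austin checklist.

QUESTIONS = (
    "Was the area free of earthquakes before injection began?",
    "Did the earthquakes begin after injection started?",
    "Are the epicenters close to an active injection well?",
    "Do injection volumes/pressures track the seismicity in time?",
)

def screen(answers: tuple) -> str:
    yes_count = sum(answers)
    if yes_count == len(answers):
        return "likely induced"
    if yes_count == 0:
        return "likely natural"
    return "inconclusive"  # the uncomfortable middle ground

print(screen((True, True, True, True)))    # likely induced
print(screen((False, True, True, False)))  # inconclusive -- like Prague
```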

    Oilfield waste arrives by tanker truck at a wastewater disposal facility near Platteville, Colorado.

    These criteria, however, remain open to interpretation, as the Prague example illustrates. Ladra’s petition cites three scientific studies that have linked the increase in seismicity in central Oklahoma to wastewater injection operations. A Cornell University-led study, which specifically examined the earthquake in which Ladra claims she was injured, concluded that the event began within about 200 meters of active injection wells—closely correlating in space—and was therefore induced.

    In a March 2013 written statement, the OGS had concluded that this earthquake was the result of natural causes, as were two subsequent tremors that shook Prague over the next few days. The second earthquake, a magnitude-5.7 event that struck less than 24 hours later, was the largest earthquake ever recorded in Oklahoma.

    The controversy hinged on several of the “symptoms,” including the timing of the seismicity. Prior to the Prague sequence, scientists believed that a lag time of weeks to months between the initiation of injection and the onset of seismicity was typical. But in Prague, fluid injection had been occurring for nearly 20 years. The OGS therefore concluded that there was no clear temporal correlation. By contrast, the Cornell researchers argued that the diagnostic time scale of induced seismicity needs to be reconsidered.

    Another key issue that has been raised by the OGS is that of background seismicity. Oklahoma has experienced relatively large earthquakes in the past, including a magnitude-5.0 event that occurred in 1952 and more than 10 earthquakes of magnitude 4.0 or greater since then, so the Prague sequence was hardly the first bout of shaking in the region.

    The uncertainty associated with both these characteristics places the Prague earthquakes in an uncomfortable middle ground between earthquakes that are “clearly not induced” and “clearly induced” on the University of Texas checklist, making a definitive diagnosis unlikely. Meanwhile, the increasing frequency of earthquakes across the midcontinent and the significant size of the Prague earthquakes are causing scientists to rethink the region’s potential seismic hazard.

    Is the Public at Risk?

    Earthquake hazard is a function of multiple factors, including event magnitude and depth, recurrence interval, and the material through which the seismic waves propagate. These data are incorporated into calculations the USGS uses to generate the National Seismic Hazard Maps.

    Updated every six years, these maps indicate the potential for severe ground shaking across the country over a 50-year period and are used to set design standards for earthquake-resistant construction. The maps influence decisions about building codes, insurance rates, and disaster management strategies, with a combined estimated economic impact totaling hundreds of billions of dollars per year.

    When the latest version of the maps was released in July, the USGS intentionally excluded the hazard from manmade earthquakes. Part of the reason was the timing, according to Nicolas Luco, a research structural engineer at the USGS. The maps are released on a schedule that dovetails with building code revisions, so they couldn’t delay the charts even though the induced seismicity update wasn’t ready, he says.

    Such changes, however, may take years to implement. Luco notes that the building code revisions based upon the previous version of the USGS hazard maps, released in 2008, just became law in California in 2014, a six-year lag in one of the most seismically-threatened states in the country.

    Instead, the USGS is currently developing a separate procedure, which they call a hazard model, to account for the hazard associated with induced seismicity. The new model may raise the earthquake hazard level substantially in some parts of the U.S. where it has previously been quite low, according to McGarr. But there are still open questions about how to account for induced seismicity in maps of earthquake shaking and in building codes, Luco says.

    McGarr believes that the new hazard calculations will result in more rigorous building codes for earthquake-resistant construction and that adhering to these changes will affect the construction as well as the oil, gas, and wastewater injection industries. “Unlike natural earthquakes, induced earthquakes are caused by man, not nature, and so the oil and gas industry may be required to provide at least some of the funds needed to accommodate the revised building codes,” he says.

    But Luco says it may not make sense to incorporate the induced seismicity hazard, which can change from year to year, into building codes that are updated every six years. Over-engineering is also a concern due to the transient nature of induced seismicity. “Engineering to a standard of earthquake hazard that could go away, that drives up cost,” says Justin Rubinstein, a seismologist with the USGS Earthquake Science Center. A further complication, according to Luco, is that building code changes only govern new construction, so they don’t upgrade vulnerable existing structures, for which retrofit is generally not mandatory.

    The occurrence of induced earthquakes clearly compounds the risk to the public. “The risk is higher. The question is, how much higher?” Luco asks. Building codes are designed to limit the risk of casualties associated with building collapse—“and that usually means bigger earthquakes,” he says. So the critical question, according to Luco, is, “Can we get a really large induced earthquake that could cause building collapses?”

    Others are wondering the same thing. “Is it all leading up to a bigger one?” asks Keller, former director of the OGS. “I don’t think it’s clear that it is, but it’s not clear that it isn’t, either,” he says. Recalling a magnitude-4.8 tremor that shook southern Kansas in November, KGS’ Buchanan agrees. “I don’t think there’s any reason to believe that these things are going to magically stop at that magnitude,” he says.

    Coping with Quakes

    After assessing how much the risk to the public has increased, our society must decide upon the best way to cope with human-induced earthquakes. A common regulatory approach, one which Oklahoma has adopted, has been to implement “traffic light” control systems. Normal injection can proceed under a green light, but if induced earthquakes begin to occur, the light changes to yellow, at which point the operator must reduce the volume, rate of injection, or both to avoid triggering larger events. If larger earthquakes strike, the light turns red, and further injection is prohibited. Such systems have recently been implemented in Oklahoma, Colorado, and Texas.
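
    In code, the core of such a protocol is a simple threshold rule. Here is a sketch with made-up magnitude thresholds; real systems set these by regulation and may also key off event counts or rates:

```python
# Minimal "traffic light" protocol sketch. Thresholds are assumed for
# illustration; actual regulatory values vary by state and site.

YELLOW_MAGNITUDE = 2.5  # assumed: felt induced events trigger curtailment
RED_MAGNITUDE = 4.0     # assumed: larger events halt injection

def traffic_light(max_recent_magnitude: float) -> str:
    if max_recent_magnitude >= RED_MAGNITUDE:
        return "RED: further injection prohibited"
    if max_recent_magnitude >= YELLOW_MAGNITUDE:
        return "YELLOW: reduce injection volume and/or rate"
    return "GREEN: normal injection may proceed"

for magnitude in (1.8, 3.1, 4.5):
    print(magnitude, "->", traffic_light(magnitude))
```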

    But how will we know if these systems are effective? The largest Rocky Mountain Arsenal-related earthquakes, three events between magnitudes 5.0 and 5.5, all occurred more than a year after injection had ceased, so it’s unclear for how long the systems should be evaluated. Their long-term effectiveness is also uncertain because the ability to control the seismic hazard decreases over time as the pore pressure effects move away from the well, according to Shemin Ge, a hydrogeologist at the University of Colorado, Boulder.

    Traffic light systems also rely on robust seismic monitoring networks that can detect the initial, very small injection-induced earthquakes, according to Ge. To identify hazards while there is still sufficient time to take corrective action, it’s ideal to identify events of magnitude 2.0 or less, wrote McGarr and his co-authors in Science. However, the current detection threshold across much of the contiguous U.S. is magnitude 3.0, he says.

    Kansas is about to implement a mitigation approach that focuses on reducing injection in multiple wells across areas believed to be underlain by faults, rather than focusing on individual wells, according to Buchanan. He already acknowledges that it will be difficult to assess the success of this new approach because in the past, the KGS has observed reductions in earthquake activity when no action has been taken. “How do you tease apart what works and what doesn’t when you get all this variability in the system?” he asks.

    This climate of uncertainty leaves regulators, industry, and the public on shaky ground. As Ladra’s case progresses, the judicial system will decide if two energy companies are to blame for the quake that damaged her home. But it’s our society that must ultimately decide how, and even if, we should cope with manmade quakes, and what level of risk we’re willing to accept.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 7:59 am on April 27, 2015

    From COSMOS: “Breakthrough for quantum computers” 


    27 Apr 2015
    Cathal O’Connell

    Andrea Morello at work. The computer he and his team are trying to build would use silicon chips not dissimilar to those in a conventional computer. Credit: Marcus Eno

    Electrical engineers at the University of New South Wales trying to develop a silicon quantum computer have cleared one of the last hurdles to building a simple device. The researchers have reported this missing piece in the journal Science Advances.

    “Once you have demonstrated all the parts, then it’s like a Lego box – you can start building up a large architecture by piecing its components together,” project leader Andrea Morello says.

    In their quest to build a silicon quantum computer, Morello and his colleagues have so far been perfecting its basic element, the “quantum bit”. This is a single phosphorus atom entombed in a silicon crystal. Using a carefully tuned magnetic field, the researchers can manipulate the atom’s quantum “spin”, flipping it up or down.

    That phosphorus atom is equivalent to a transistor in an ordinary computer. A transistor is on or off, which is how it represents the 1s and 0s of the binary code the computer uses to process instructions. A quantum bit is more complex. It can be spin-up, spin-down or in a “superposition” of both: 1 and 0 at the same time. Theoretically, this should enable a quantum computer to weigh multiple solutions to a complex problem at once, and solve it at phenomenal speed.
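
    The difference is easy to state in vector form: a qubit is a normalized two-component complex vector, and measurement probabilities are the squared amplitudes. A toy sketch using the standard textbook formalism (not the UNSW team’s software):

```python
# A qubit as a 2-component complex vector: alpha|0> + beta|1>, with
# |alpha|^2 + |beta|^2 = 1. Measurement probabilities are the squared
# amplitudes. Standard quantum mechanics, for illustration only.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # spin-down, the classical "0"
ket1 = np.array([0, 1], dtype=complex)   # spin-up, the classical "1"
plus = (ket0 + ket1) / np.sqrt(2)        # equal superposition of both

def measurement_probs(state: np.ndarray) -> np.ndarray:
    """Probability of reading out 0 or 1."""
    return np.abs(state) ** 2

print(measurement_probs(ket1))  # [0. 1.]   -> always reads 1
print(measurement_probs(plus))  # [0.5 0.5] -> reads 0 or 1 with equal odds
```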

    A quantum computer is “not just a ‘faster’ computer,” Morello says. “They are the equivalent of a jet plane to a bicycle.”

    Last year the UNSW team showed they can write, read and store the spin of a single quantum bit with better than 99.99% accuracy using a magnetic field. But to carry out complex calculations, a quantum computer needs thousands, or even millions, of quantum bits that can all be individually controlled. And for that, the high-frequency oscillating magnetic fields Morello has been using to master the control of a single quantum bit are not suitable.

    For a start, the magnetic field generators Morello and his team used are around $100,000 a pop. If they had to use one for each quantum bit in a large array, the cost would be astronomical. There is also a practical problem. Magnetic fields spread, making it impossible to control one quantum bit in an array without inadvertently affecting all its neighbours.

    In their latest work, carried out by experimental physicist Arne Laucht, Morello and his team found a way to control each quantum bit using a simple electrical pulse. Instead of each phosphorus atom having a dedicated magnetic field generator to control it, their new design floods the whole device with a single magnetic field.

    This field is broadcast at a frequency the phosphorus atoms are not tuned in to, and so they don’t feel its magnetic tug. But when a precise electrical pulse is applied to the quantum bit, the electron orbiting the phosphorus atom feels a strong force, stretching its orbit. This distortion to the electron’s orbit works like twisting a tuning knob on a radio – the phosphorus atom is tuned in to the frequency of the magnetic field being broadcast around it, which then causes the quantum bit to flip.

    By timing their electrical pulses, the team can tune the phosphorus atom in and out of the oscillating magnetic field, and so flip the phosphorus atom’s spin into any position they want – up, down or an intermediate superposition – without affecting its neighbours.
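
    The selectivity comes from detuning. In the standard Rabi picture, the maximum probability of flipping a driven spin falls off as Ω²/(Ω²+Δ²), where Ω is the drive strength and Δ the detuning from resonance; the electrical pulse in effect shifts Δ to zero for one atom only. A sketch with illustrative numbers, not the device’s actual parameters:

```python
# Rabi flip probability: a spin driven at detuning Delta from resonance
# by a field of strength Omega flips with maximum probability
# Omega^2 / (Omega^2 + Delta^2). Numbers are illustrative only.

def max_flip_probability(omega_mhz: float, detuning_mhz: float) -> float:
    return omega_mhz**2 / (omega_mhz**2 + detuning_mhz**2)

print(max_flip_probability(1.0, 0.0))   # 1.0: pulsed atom is tuned in, flips fully
print(max_flip_probability(1.0, 50.0))  # ~0.0004: neighbours stay effectively untouched
```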

    This idea of combining electric and magnetic fields to control individual quantum bits in an array, called “A-gate” control, has been around since 1998. Bruce Kane, an American quantum physicist who was then working at UNSW, proposed it in a paper in Nature that Morello calls “visionary”. Now, 17 years later, technology has caught up with Kane’s ideas as we can now routinely make structures at the scale needed to build his design.

    Kane – now at the University of Maryland and not directly involved in Morello’s research – says he’s been impressed by the “outstanding” work on the design done at UNSW in recent years. The devices work even better than he anticipated. Back in 1998, Kane worried that imperfections in the materials would prevent the device from working as it should. But, he says, the recent work at UNSW, such as the demonstration of an A-gate, proves material imperfections “will not be a show-stopper for silicon quantum computing”.

    Kane cautions that we are still a long way from large-scale quantum computing in silicon, as the challenges that remain, such as moving quantum information around and controlling interactions between large numbers of spins, are daunting. “I continue to believe that large-scale silicon quantum computing will become a reality, but there is still a long, steep road ahead of us,” he says.

    The group is already at work on these challenges. Morello is confident they will have all the elements in place to build a small-scale test-system within 10 years.

    And as for a large-scale quantum computer capable of making useful calculations? Here, Morello is more coy: “To quote Niels Bohr, ‘It’s hard to make predictions, especially about the future’.”


    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 7:17 am on April 27, 2015

    From AAAS: “Breast cancer drug may help men with prostate cancer” 


    24 April 2015
    Jocelyn Kaiser

    Prostate cancer cells. Credit: SPL/Science Source

    A new type of cancer drug originally aimed at women with rare, inherited forms of breast and ovarian cancer may also help a broader swath of patients, according to a small clinical study. The drug halted tumor growth in a third of men with a typically deadly form of advanced prostate cancer. Nearly all of those who responded had related mutations in their tumors, indicating the drug was targeting a common cell process, researchers reported here this week at the annual meeting of the American Association for Cancer Research (AACR).

    The drug blocks an enzyme called poly (adenosine diphosphate [ADP]-ribose) polymerase (PARP), which helps cells repair a certain type of DNA damage. Oncologists are mostly testing PARP inhibitors in ovarian and breast cancer patients born with mutations in BRCA1 or BRCA2, two of the most infamous cancer-related genes. These mutations raise a woman’s risk for breast and ovarian cancer, as well as a man’s risk of prostate cancer, because they disable proteins that repair DNA damage that can result in additional cancer-spurring mutations. But flaws in either gene also make tumor cells vulnerable to PARP inhibitors, because the drugs further impair tumor cells’ DNA repair machinery. This combination renders tumor cells unable to fix DNA damage and they die, an idea known as synthetic lethality.

    In December, the first PARP inhibitor, AstraZeneca’s olaparib, received approval in the United States and Europe for ovarian cancer patients who had inherited a BRCA1 or BRCA2 mutation.

    But some cancer patients who lack such mutations have also seen their tumors shrink in trials. A team led by Johann de Bono of the Institute of Cancer Research and the Royal Marsden NHS Foundation Trust, both in London, suspected that these patients had inherited errors in other DNA repair genes or had acquired mutations in BRCA or the other genes in a tumor as it formed or grew. Three years ago, a large sequencing project found that such DNA repair gene defects are common in advanced prostate tumors.

    To test their hypothesis, de Bono’s group and collaborators, whose funding was independent from AstraZeneca, gave the drug to 50 men with metastatic castration-resistant prostate cancer, which means their tumors had stopped responding to drugs that block the hormones that drive prostate cancer growth. Of the 49 men who stayed in the trial, 33%, or 16 patients, responded to the drug, according to one of three measures—a drop in levels of tumor cells in the patient’s blood, a decline in blood levels of the biomarker prostate-specific antigen, or imaging scans that found their tumors shrank. When the researchers sequenced the patients’ tumor DNA, they found their hunch was correct: Fourteen of the 16 who responded had mutations in one or more of a dozen DNA repair genes in their tumors, and only two nonresponders had these mutations, reported Joaquin Mateo, a clinical fellow in de Bono’s lab, at the AACR meeting. (While three responders had inherited BRCA2 mutations, four had apparently new mutations in this gene.) Most of these patients responded to the drug for at least 6 months (four for more than 1 year), while those without such mutations usually got worse within 3 months.

    Although genetic tests of tumors are already used to determine whether certain drugs will work for several types of cancer, this is the first time researchers have found such a test for prostate cancer, de Bono’s group says. Olaparib could offer a new option for these men: The trial shows “this is a good swat at that disease,” said prostate cancer researcher William Nelson of Johns Hopkins University in Baltimore, Maryland, at an AACR press conference, adding that the prospect of genetic testing to identify prostate cancer patients who could benefit from olaparib “looks very promising.”

    The results also suggest that women with ovarian and breast cancer who lack an inherited BRCA mutation might still respond to PARP inhibitors, if they have DNA repair mutations in their tumors, de Bono’s group says. Ursula Matulonis of the Dana-Farber Cancer Institute in Boston, who presented results at AACR from a trial of olaparib combined with another drug for breast and ovarian cancer patients, said at the press conference that her team plans to explore that possibility by DNA testing biopsies from the patients.

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    Stem Education Coalition

     
  • richardmitnick 11:24 am on April 26, 2015

    From livescience: “Melanoma Tumor ‘Dissolves’ After 1 Dose of New Drug Combo” 


    April 24, 2015
    Laura Geggel

    A CT scan of the woman’s tumor (asterisk) before treatment (C) and after treatment (D). Credit: The New England Journal of Medicine, Copyright 2015

    A large melanoma tumor on a woman’s chest disappeared so quickly that it left a gaping hole in its place after she received a new treatment containing two melanoma drugs, a new case report finds.

    Doctors are still monitoring the 49-year-old woman, but she was free of melanoma — a type of skin cancer that can be deadly — at her last checkup, said the report’s lead author, Dr. Paul Chapman, an attending physician and head of the melanoma section at the Memorial Sloan Kettering Cancer Center in New York.

    The woman took the same two drugs as more than 100 people with melanoma who took part in a recent study. For most of the study participants who took these drugs, the combination worked better than one drug alone. But the doctors were surprised by how well the drug combination worked to treat this particular woman’s cancer — they had not anticipated that a melanoma tumor could disappear so quickly that it would leave a cavity in the body — and thus wrote the report describing her case.

    “What was unusual was the magnitude [of recovery], and how quickly it happened,” Chapman told Live Science. However, doctors are wary of the drug combination because it does not work for everyone, and can have side effects, such as severe diarrhea.

    Both the study of the drug combination and the woman’s case report were published Monday (April 20) in the New England Journal of Medicine. The drug combination is part of a relatively recent approach to treating melanoma with medications that boost a person’s own immune system, called immunotherapy.

    One of the drugs in the combination was ipilimumab (sold under the brand name Yervoy), which works by removing an inhibitory mechanism that can stop certain immune cells from killing cancer cells.

    In the study, researchers combined ipilimumab with another drug, called nivolumab (brand name Opdivo), which can prevent immune cells called T cells from dying, Chapman said.

    The U.S. Food and Drug Administration has approved ipilimumab and nivolumab separately as melanoma drugs but has not approved their combined use. The researchers’ study was aimed at testing how the two drugs worked when used in tandem.

    In the study, doctors gave treatments to 142 people with metastatic melanoma (melanoma that has spread to other parts of the body) — some participants received the combination, and others received ipilimumab plus a placebo. Neither the participants nor their doctors knew who had received which treatment until the trial had ended.

    A woman with melanoma developed a large tumor on her abdomen (A), but after one combination treatment of two immunotherapy drugs, it disappeared (B) within three weeks. Credit: The New England Journal of Medicine, Copyright 2015.

    The new drug combination had better results than the ipilimumab-plus-placebo treatment, the researchers found.

    In one analysis, the researchers focused on 109 patients who did not have a mutation in a gene called the BRAF gene. (BRAF mutations are linked to a number of cancers, including melanoma, and there are other melanoma drugs that target BRAF mutations.) Among the 72 people in this group who took the combination, 61 percent saw their cancer shrink, compared with just 11 percent of the 37 people in the group who took only ipilimumab.
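
    For readers keeping score, the percentages map back onto approximate patient counts as follows (arithmetic on the figures above; the rounding is ours):

```python
# Rough patient counts behind the reported percentages (figures from the
# paragraph above; rounding introduced here for illustration).

combination_group, ipilimumab_only_group = 72, 37

print(round(0.61 * combination_group))      # ~44 of 72 saw their cancer shrink
print(round(0.11 * ipilimumab_only_group))  # ~4 of 37 on ipilimumab alone did
```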

    What’s more, melanoma was undetectable in 22 percent of the combination group at the end of the study, which was funded by Bristol-Myers Squibb, the maker of both drugs. None of the people taking ipilimumab plus a placebo saw their melanoma disappear by the time the study had ended.

    Twenty-two percent may not sound high, but in the world of melanoma treatment, it is significant, said Dr. Sylvia Lee, an assistant professor of medicine at the University of Washington, Seattle Cancer Care Alliance and Fred Hutchinson Cancer Research Center. Lee was not involved in the new study, but she is working with patients who are receiving the drug combination in Seattle.

    A complete response to treatment is “the Holy Grail,” she said. “That’s what everyone wants, where all of the cancer disappears. We’re talking about patients with stage IV melanoma. Usually, in cancers, when someone has stage IV disease, for the majority of people, it’s no longer curable.”

    It’s unclear whether melanoma will reoccur in any of the patients in the new study. Doctors are following them to see whether the people who are taking the combination drugs live longer than expected, Chapman said.

    Side effects

    However, the ipilimumab with nivolumab combination comes with serious side effects, such as colitis (swelling of the colon), diarrhea and problems with the endocrine glands (which produce hormones).

    About 54 percent of the patients in the study who were taking the combination reported serious side effects, compared with 24 percent of the people taking only ipilimumab, the researchers found.

    The treatments are given three weeks apart, but some people can tolerate only one or two treatments out of the suggested four before they stop taking the medicine, Lee said. In the new study, about 60 percent of the participants taking the combination finished all four treatments, compared with 70 percent of the ipilimumab-only group.

    The side effects can be brutal, Lee said. “This is diarrhea that is 25 to 40 times a day,” she said.

    Future trials may help researchers refine the number of treatments needed and figure out how effective just one or two treatments can be. The current trial is over, but certain cancer centers are still offering the drug combination through an expanded access program, which is how the woman whose tumor disappeared got the medicine.

    Her case shows that immunotherapy can work quickly: Her tumor vanished within three weeks of receiving her first treatment, the researchers found.

    “I was astonished; I’d never seen anything like that,” Chapman said. “She said the tumor had just kind of dissolved.”

    However, the combination may pose a risk if it dissolves a tumor somewhere else in the body and leaves a hole behind.

    “I think that it is a huge concern,” Lee said. “It is something to consider if you do have a patient with a tumor [invading] a vital organ.”

    The medications are also pricey. Ipilimumab costs $120,000 for four treatments, and nivolumab is priced at $12,500 a month, the Wall Street Journal reported.

    Still, the drug combination may offer a new and promising treatment for people with melanoma if the FDA approves it, Chapman said.

    “It kind of confirms an assumption that we’ve all had for many decades: that the immune system can recognize cancers and can kill large tumors if properly activated,” Chapman said.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 11:02 am on April 26, 2015

    From Brown: “Tapeworm drug shows promise against MRSA” 

    Brown University

    April 23, 2015
    David Orenstein

    1
    A nasty, dangerous superbug, methicillin-resistant Staphylococcus aureus (MRSA), kills thousands of people in America every year. A common tapeworm drug, already approved for use in humans, could be a new tool against MRSA. Image: NIH/NIAID

    A new study provides evidence from lab experiments that a drug already used in people to fight tapeworms might also prove effective against strains of the superbug MRSA, which kills thousands of people a year in the United States.

    The paper, published in the journal PLoS ONE, showed that niclosamide, which is on the World Health Organization’s list of essential medicines, suppressed the growth of dozens of methicillin-resistant Staphylococcus aureus (MRSA) cultures in lab dishes and preserved the lives of nematode worms infected with the superbug. In these tests, both niclosamide and a closely related veterinary parasite drug, oxyclozanide, proved to be as effective (at lower concentrations) as the current last-resort clinical treatment, vancomycin.

    Both drugs belong to a family of medicines called salicylanilide anthelmintics, and both also trounced another gram-positive pathogen, Enterococcus faecium, in lab tests.

    “Since niclosamide is FDA approved and all of the salicylanilide anthelmintic drugs are already out of patent, they are attractive candidates for drug repurposing and warrant further clinical investigation for treating staphylococcal infections,” wrote lead author Rajmohan Rajamuthiah, a postdoctoral scholar at the Warren Alpert Medical School of Brown University and Rhode Island Hospital.

    Last year, the team reported that after screening more than 600 drugs against infected nematode worms, it had found that the salicylanilide anthelmintic drug closantel appeared to protect the worms. That finding led to the new research, in which the team tested niclosamide and oxyclozanide.

    Encouraging experiments

    In their experiments, even low concentrations of the drugs allowed more than 90 percent of MRSA-infected worms to survive, compared with less than 20 percent survival among controls. In the petri dishes, the drugs cleared gaping zones of growth inhibition in MRSA cultures spread over the plates, while a control substance did nothing.

    Between the two, oxyclozanide proved to be a more effective MRSA killer, while niclosamide effectively suppressed MRSA growth but did not completely eradicate the bacteria. Although niclosamide proved to be “bacteriostatic” instead of “bactericidal” like oxyclozanide, it may still pack plenty of punch to keep MRSA in check and give the body’s immune system the upper hand, Rajamuthiah said.

    The researchers tested the effects of the drugs on mammalian cells, including sheep red blood cells (which fared just fine) and cancerous human liver cells (which happen to be easier to work with than healthy liver cells). Niclosamide proved significantly toxic to the cancer cells, as other studies had shown before, but the drug is already approved for human use.

    The team also tested a hypothesis about how the drugs attack the bacteria. As they suspected, oxyclozanide appeared to work by disrupting the bacterial cell’s membranes, but there was no sign that niclosamide worked the same way.

    Further testing

    The researchers acknowledge that petri dishes and worms are not substitutes for people, and some issues need further investigation. For example, people have been shown to clear niclosamide out of their systems quickly, and the drug does a poor job of working its way out of the bloodstream and deep into tissues.

    “The low level of systemic circulation coupled with the rapid elimination profile of niclosamide suggests the necessity for further testing of the potential of niclosamide and oxyclozanide for treating systemic infections,” they wrote. “Further studies should include the evaluation of these compounds in systemic and localized infection models in rodents.”

    Rodent experiments are being planned.

    But there may also be an upside to the rapid clearance, Rajamuthiah said: it might limit the drug’s toxicity, and until it is tested, it’s not clear that quick clearance would undermine the drug’s performance against MRSA.

    “Remember that no one has ever tested niclosamide for treating bacterial infections,” he said.

    If niclosamide, which is already used in humans for one purpose, can also help them fight off a superbug, or if its apparently more effective and less toxic cousin oxyclozanide can gain approval for human use, doctors could obtain much-needed ammunition against MRSA.

    “The relatively mild toxicity of oxyclozanide is encouraging based on in vitro tests,” Rajamuthiah said. “Since it has never been tested in humans and since it belongs to the same structural family as niclosamide, our findings give strong impetus to using oxyclozanide for further investigations.”

    Particularly important, Rajamuthiah said, is that because oxyclozanide attacks the cell membrane rather than metabolic pathways, it may be more difficult for MRSA to develop resistance.

    In addition to Rajamuthiah, the paper’s other authors are senior and corresponding author Dr. Eleftherios Mylonakis, Beth Burgwyn Fuchs, Elamparithi Jayamani, Bumsup Kwon, and Wooseong Kim, all of Brown University and Rhode Island Hospital, and Annie L. Conery and Frederick M. Ausubel of Massachusetts General Hospital.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    Welcome to Brown

    Brown U Robinson Hall
    Located in historic Providence, Rhode Island, and founded in 1764, Brown University is the seventh-oldest college in the United States. Brown is an independent, coeducational Ivy League institution comprising undergraduate and graduate programs, plus the Alpert Medical School, the School of Public Health, the School of Engineering, and the School of Professional Studies.

    With its talented and motivated student body and accomplished faculty, Brown is a leading research university that maintains a particular commitment to exceptional undergraduate instruction.

    Brown’s vibrant, diverse community consists of 6,000 undergraduates, 2,000 graduate students, 400 medical school students, more than 5,000 summer, visiting and online students, and nearly 700 faculty members. Brown students come from all 50 states and more than 100 countries.

    Undergraduates pursue bachelor’s degrees in more than 70 concentrations, ranging from Egyptology to cognitive neuroscience. Anything’s possible at Brown—the university’s commitment to undergraduate freedom means students must take responsibility as architects of their courses of study.

     
  • richardmitnick 2:13 pm on April 25, 2015 Permalink | Reply
    Tags: , ,   

    From NOVA: “Invisible Universe Revealed” - The Story of the Hubble Space Telescope

    PBS NOVA

    NOVA

    Apr 25, 2015

    Twenty-five years ago, NASA launched one of the most ambitious experiments in the history of astronomy: the Hubble Space Telescope. In honor of Hubble’s landmark anniversary, this show tells the remarkable story of the telescope that forever changed our understanding of the cosmos and our place in it. But amazingly, when the telescope first sent images back to Earth, it seemed that the entire project was a massive failure; a one-millimeter engineering blunder had turned the billion-dollar telescope into an object of ridicule. It fell to five heroic astronauts, in a daring repair mission, to return Hubble to the cutting edge of science. Hear from the scientists and engineers on the front line, who tell the amazing Hubble story as never before. This single telescope has helped astronomers pinpoint the age of the universe, revealed the birthplaces of stars and planets, advanced our understanding of dark energy and cosmic expansion, and uncovered black holes lurking at the hearts of galaxies. For more than a generation, Hubble’s stunning images have brought the beauty of the heavens to millions, revealing a cosmos richer and more wondrous than we ever imagined. Enjoy the story of this magnificent machine and its astonishing discoveries.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest-rated science series on television and the most-watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     