Recent Updates

  • richardmitnick 4:24 pm on March 27, 2017
    Tags: Naval Research Laboratory / Discovery Channel Telescope, WAS 49

    From JPL-Caltech: “NuSTAR Probes Puzzling Galaxy Merger” 

    NASA JPL Banner

    JPL-Caltech

    March 27, 2017
    Elizabeth Landau
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-6425
    elizabeth.landau@jpl.nasa.gov

    1
    This optical image shows the Was 49 system, which consists of a large disk galaxy, Was 49a, merging with a much smaller “dwarf” galaxy Was 49b. Image credit: DCT/NRL [Discovery Channel Telescope, Happy Jack, AZ, USA/ Naval Research Laboratory.]

    A supermassive black hole inside a tiny galaxy is challenging scientists’ ideas about what happens when two galaxies become one.

    Was 49 is the name of a system consisting of a large disk galaxy, referred to as Was 49a, merging with a much smaller “dwarf” galaxy called Was 49b. The dwarf galaxy rotates within the larger galaxy’s disk, about 26,000 light-years from its center. Thanks to NASA’s Nuclear Spectroscopic Telescope Array (NuSTAR) mission, scientists have discovered that the dwarf galaxy is so luminous in high-energy X-rays, it must host a supermassive black hole much larger and more powerful than expected.
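
    As a rough piece of textbook context (my addition, not from the JPL release): the X-ray output of an accreting black hole is capped by the Eddington limit, which scales linearly with the black hole’s mass, so a very luminous nucleus implies a correspondingly massive black hole.

```latex
% Standard Eddington luminosity for hydrogen accretion (background, not from the article):
L_{\rm Edd} \approx 1.26\times10^{38}\,\left(\frac{M_{\rm BH}}{M_{\odot}}\right)\ \mathrm{erg\,s^{-1}}
% so an observed accretion luminosity L sets a rough lower bound on the mass:
M_{\rm BH} \gtrsim \frac{L}{1.26\times10^{38}\ \mathrm{erg\,s^{-1}}}\; M_{\odot}
```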


    NASA/NuSTAR

    “This is a completely unique system and runs contrary to what we understand of galaxy mergers,” said Nathan Secrest, lead author of the study and postdoctoral fellow at the U.S. Naval Research Laboratory in Washington.

    Data from NuSTAR and the Sloan Digital Sky Survey suggest that the mass of the dwarf galaxy’s black hole is huge compared with those of similarly sized galaxies, at more than 2 percent of the galaxy’s own mass.
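
    For scale (a back-of-the-envelope aside, not a figure from the study): the central black holes of typical large galaxies carry only a few tenths of a percent of their host bulge’s stellar mass, and dwarf galaxies are expected to fall even lower, which is why a ratio above 2 percent stands out.

```latex
% Rough comparison, assuming the canonical bulge relation M_BH ~ 10^{-3} M_bulge:
\left.\frac{M_{\rm BH}}{M_{\rm gal}}\right|_{\rm Was\,49b} \gtrsim 0.02
\qquad \text{vs.} \qquad
\left.\frac{M_{\rm BH}}{M_{\rm bulge}}\right|_{\rm typical} \sim 10^{-3}
```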


    SDSS Telescope at Apache Point Observatory, NM, USA

    “We didn’t think that dwarf galaxies hosted supermassive black holes this big,” Secrest said. “This black hole could be hundreds of times more massive than what we would expect for a galaxy of this size, depending on how the galaxy evolved in relation to other galaxies.”

    The dwarf galaxy’s black hole is the engine of an active galactic nucleus (AGN), a cosmic phenomenon in which extremely high-energy radiation bursts forth as a black hole devours gas and dust. This particular AGN appears to be covered by a donut-shaped structure made of gas and dust. NASA’s Chandra and Swift missions were used to further characterize the X-ray emission.


    NASA/Chandra Telescope


    NASA/SWIFT Telescope

    Normally, when two galaxies start to merge, the larger galaxy’s central black hole becomes active, voraciously gobbling gas and dust, and spewing out high-energy X-rays as matter gets converted into energy. That is because, as galaxies approach each other, their gravitational interactions create a torque that funnels gas into the larger galaxy’s central black hole. But in this case, the smaller galaxy hosts a more luminous AGN with a more active supermassive black hole, and the larger galaxy’s central black hole is relatively quiet.

    An optical image of the Was 49 system, compiled using observations from the Discovery Channel Telescope in Happy Jack, Arizona, uses the same color filters as the Sloan Digital Sky Survey.


    Discovery Channel Telescope at Lowell Observatory, Happy Jack AZ, USA

    Since Was 49 is so far away, these colors are optimized to separate highly-ionized gas emission, such as the pink-colored region around the feeding supermassive black hole, from normal starlight, shown in green. This allowed astronomers to more accurately determine the size of the dwarf galaxy that hosts the supermassive black hole.

    The pink-colored emission stands out in a new image because of the intense ionizing radiation emanating from the powerful AGN. Buried within this region of intense ionization is a faint collection of stars, believed to be part of the galaxy surrounding the enormous black hole. These striking features lie on the outskirts of the much larger spiral galaxy Was 49a, which appears greenish in the image due to the distance to the galaxy and the optical filters used.

    Scientists are still trying to figure out why the supermassive black hole of dwarf galaxy Was 49b is so big. It may have already been large before the merger began, or it may have grown during the very early phase of the merger.

    “This study is important because it may give new insight into how supermassive black holes form and grow in such systems,” Secrest said. “By examining systems like this, we may find clues as to how our own galaxy’s supermassive black hole formed.”

    In several hundred million years, the black holes of the large and small galaxies will merge into one enormous beast.

    NuSTAR is a Small Explorer mission led by Caltech and managed by JPL for NASA’s Science Mission Directorate in Washington. NuSTAR was developed in partnership with the Danish Technical University and the Italian Space Agency (ASI). The spacecraft was built by Orbital Sciences Corp., Dulles, Virginia. NuSTAR’s mission operations center is at UC Berkeley, and the official data archive is at NASA’s High Energy Astrophysics Science Archive Research Center. ASI provides the mission’s ground station and a mirror archive.

    For more information on NuSTAR, visit:

    http://www.nasa.gov/nustar

    http://www.nustar.caltech.edu

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge [1], on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 10:54 am on March 27, 2017
    Tags: Power factor

    From Rutgers: “How Graphene Could Cool Smartphone, Computer and Other Electronics Chips” 

    Rutgers University

    March 27, 2017
    Todd B. Bates

    1
    Graphene, a one-atom-thick layer of graphite, consists of carbon atoms arranged in a honeycomb lattice. Photo: OliveTree/Shutterstock

    Rutgers scientists lead research that uncovers a potential advance for the electronics industry.

    With graphene, Rutgers researchers have discovered a powerful way to cool tiny chips – key components of electronic devices with billions of transistors apiece.

    “You can fit graphene, a very thin, two-dimensional material that can be miniaturized, to cool a hot spot that creates heating problems in your chip,” said Eva Y. Andrei, Board of Governors professor of physics in the Department of Physics and Astronomy. “This solution doesn’t have moving parts and it’s quite efficient for cooling.”

    The shrinking of electronic components and the excessive heat generated by their increasing power has heightened the need for chip-cooling solutions, according to a Rutgers-led study published recently in Proceedings of the National Academy of Sciences. Using graphene combined with a boron nitride crystal substrate, the researchers demonstrated a more powerful and efficient cooling mechanism.

    “We’ve achieved a power factor that is about two times higher than in previous thermoelectric coolers,” said Andrei, who works in the School of Arts and Sciences.

    The power factor refers to the effectiveness of active cooling. That’s when an electrical current carries heat away, as shown in this study, while passive cooling is when heat diffuses naturally.
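
    For background (an aside, not from the Rutgers release): in the thermoelectrics literature the power factor is conventionally built from the Seebeck coefficient and the electrical conductivity, and it feeds into the familiar figure of merit; the “power factor” quoted in the study presumably follows this convention.

```latex
% Conventional thermoelectric definitions (textbook background):
% S = Seebeck coefficient, \sigma = electrical conductivity, \kappa = thermal conductivity
PF = S^{2}\sigma, \qquad ZT = \frac{S^{2}\sigma}{\kappa}\,T
```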

    Graphene has major upsides. It’s a one-atom-thick layer of graphite, which is the flaky stuff inside a pencil. The thinnest flakes, graphene, consist of carbon atoms arranged in a honeycomb lattice that looks like chicken wire, Andrei said. It conducts electricity better than copper, is 100 times stronger than steel and quickly diffuses heat.

    The graphene is placed on devices made of boron nitride, which is extremely flat and as smooth as a skating rink, she said. Silicon dioxide – the traditional base for chips – hinders performance because it scatters electrons that can carry heat away.

    In a tiny computer or smartphone chip, billions of transistors generate lots of heat, and that’s a big problem, Andrei said. High temperatures hamper the performance of transistors – electronic devices that control the flow of power and can amplify signals – so they need cooling.

    Current methods include little fans in computers, but the fans are becoming less efficient and break down, she said. Water is also used for cooling, but that bulky method is complicated and prone to leaks that can fry computers.

    “In a refrigerator, you have compression that does the cooling and you circulate a liquid,” Andrei said. “But this involves moving parts and one method of cooling without moving parts is called thermoelectric cooling.”

    Think of thermoelectric cooling in terms of the water in a bathtub. If the tub has hot water and you turn on the cold water, it takes a long time for the cold water below the faucet to diffuse in the tub. This is passive cooling because molecules slowly diffuse in bathwater and become diluted, Andrei said. But if you use your hands to push the water from the cold end to the hot, the cooling process – also known as convection or active cooling – will be much faster.

    The same process takes place in computer and smartphone chips, she said. You can connect a piece of wire, such as copper, to a hot chip and heat is carried away passively, just like in a bathtub.

    Now imagine a piece of metal with hot and cold ends. The metal’s atoms and electrons zip around the hot end and are sluggish at the cold end, Andrei said. Her research team, in effect, applied voltage to the metal, sending a current from the hot end to the cold end. Similar to the case of active cooling in the bathtub example, the current spurred the electrons to carry away the heat much more efficiently than via passive cooling. Graphene is actually superior in both its passive and active cooling capability. The combination of the two makes graphene an excellent cooler.
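
    To put the bathtub analogy in textbook terms (a sketch of standard physics, not equations taken from the paper): passive cooling is ordinary heat conduction, while active thermoelectric cooling pumps additional heat with the driven current via the Peltier effect.

```latex
% Passive cooling: Fourier conduction through a strip of area A and length L,
% thermal conductivity \kappa, across a temperature difference \Delta T
\dot{Q}_{\rm passive} = \kappa A\,\frac{\Delta T}{L}
% Active (Peltier-type) cooling: heat pumped by a driven current I, where the
% Peltier coefficient is \Pi = S T (S = Seebeck coefficient, T = absolute temperature)
\dot{Q}_{\rm active} = \Pi I = S\,T\,I
```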

    “The electronics industry is moving towards this kind of cooling,” Andrei said. “There’s a very big research push to incorporate these kinds of coolers. There is a good chance that the graphene cooler is going to win out. Other materials out there are much more expensive, they’re not as thin and they don’t have such a high power factor.”

    The study’s lead author is Junxi Duan, a Rutgers physics post-doctoral fellow. Other authors include Xiaoming Wang, a Rutgers mechanical engineering post-doctoral fellow; Xinyuan Lai, a Rutgers physics undergraduate student; Guohong Li, a Rutgers physics research associate; Kenji Watanabe and Takashi Taniguchi of the National Institute for Materials Science in Tsukuba, Japan; Mona Zebarjadi, a former Rutgers mechanical engineering professor who is now at the University of Virginia; and Andrei. Zebarjadi conducted a previous study on electronic cooling using thermoelectric devices.

    See the full article here.

    Follow Rutgers Research here.

    Please help promote STEM in your local schools.

    rutgers-campus

    STEM Icon

    Stem Education Coalition

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers smaller
    Get us back our original seal, one of the most beautiful seals of any university, stolen from our use by the University.

     
  • richardmitnick 10:24 am on March 27, 2017
    Tags: ESO X-shooter, IRAS F23128-5919, Stars Born in Winds from Supermassive Black Holes

    From ESO: “Stars Born in Winds from Supermassive Black Holes” 

    ESO 50 Large

    European Southern Observatory

    27 March 2017
    Roberto Maiolino
    Cavendish Laboratory, Kavli Institute for Cosmology
    University of Cambridge, UK
    Email: r.maiolino@mrao.cam.ac.uk

    Richard Hook
    ESO Public Information Officer
    Garching bei München, Germany
    Tel: +49 89 3200 6655
    Cell: +49 151 1537 3591
    Email: rhook@eso.org

    1
    Observations using ESO’s Very Large Telescope have revealed stars forming within powerful outflows of material blasted out from supermassive black holes at the cores of galaxies. These are the first confirmed observations of stars forming in this kind of extreme environment. The discovery has many consequences for understanding galaxy properties and evolution. The results are published in the journal Nature.

    A UK-led group of European astronomers used the MUSE and X-shooter instruments on the Very Large Telescope (VLT) at ESO’s Paranal Observatory in Chile to study an ongoing collision between two galaxies, known collectively as IRAS F23128-5919, that lie around 600 million light-years from Earth. The group observed the colossal winds of material — or outflows — that originate near the supermassive black hole at the heart of the pair’s southern galaxy, and have found the first clear evidence that stars are being born within them [1].

    2
    IRAS F23128-5919 https://inspirehep.net/record/1265769/plots


    ESO/MUSE on VLT


    ESO X-shooter on VLT

    Such galactic outflows are driven by the huge energy output from the active and turbulent centres of galaxies. Supermassive black holes lurk in the cores of most galaxies, and when they gobble up matter they also heat the surrounding gas and expel it from the host galaxy in powerful, dense winds [2].

    “Astronomers have thought for a while that conditions within these outflows could be right for star formation, but no one has seen it actually happening as it’s a very difficult observation,” comments team leader Roberto Maiolino from the University of Cambridge. “Our results are exciting because they show unambiguously that stars are being created inside these outflows.”

    The group set out to study stars in the outflow directly, as well as the gas that surrounds them. By using two of the world-leading VLT spectroscopic instruments, MUSE and X-shooter, they could carry out a very detailed study of the properties of the emitted light to determine its source.

    Radiation from young stars is known to cause nearby gas clouds to glow in a particular way. The extreme sensitivity of X-shooter allowed the team to rule out other possible causes of this illumination, including gas shocks or the active nucleus of the galaxy.

    The group then made an unmistakable direct detection of an infant stellar population in the outflow [3]. These stars are thought to be less than a few tens of millions of years old, and preliminary analysis suggests that they are hotter and brighter than stars formed in less extreme environments such as the galactic disc.

    As further evidence, the astronomers also determined the motion and velocity of these stars. The light from most of the region’s stars indicates that they are travelling at very large velocities away from the galaxy centre — as would make sense for objects caught in a stream of fast-moving material.

    Co-author Helen Russell (Institute of Astronomy, Cambridge, UK) expands: “The stars that form in the wind close to the galaxy centre might slow down and even start heading back inwards, but the stars that form further out in the flow experience less deceleration and can even fly off out of the galaxy altogether.”

    The discovery provides new and exciting information that could better our understanding of some astrophysics, including how certain galaxies obtain their shapes [4]; how intergalactic space becomes enriched with heavy elements [5]; and even from where unexplained cosmic infrared background radiation may arise [6].

    Maiolino is excited for the future: “If star formation is really occurring in most galactic outflows, as some theories predict, then this would provide a completely new scenario for our understanding of galaxy evolution.”
    Notes

    [1] Stars are forming in the outflows at a very rapid rate; the astronomers say that stars totalling around 30 times the mass of the Sun are being created every year. This accounts for over a quarter of the total star formation in the entire merging galaxy system.

    [2] The expulsion of gas through galactic outflows leads to a gas-poor environment within the galaxy, which could be why some galaxies cease forming new stars as they age. Although these outflows are most likely to be driven by massive central black holes, it is also possible that the winds are powered by supernovae in a starburst nucleus undergoing vigorous star formation.

    [3] This was achieved through the detection of signatures characteristic of young stellar populations and with a velocity pattern consistent with that expected from stars formed at high velocity in the outflow.

    [4] Spiral galaxies have an obvious disc structure, with a distended bulge of stars in the centre, surrounded by a diffuse cloud of stars called a halo. Elliptical galaxies are composed mostly of these spheroidal components. Outflow stars that are ejected from the main disc could give rise to these galactic features.

    [5] How the space between galaxies — the intergalactic medium — becomes enriched with heavy elements is still an open issue, but outflow stars could provide an answer. If they are jettisoned out of the galaxy and then explode as supernovae, the heavy elements they contain could be released into this medium.

    [6] Cosmic-infrared background radiation, similar to the more famous cosmic microwave background, is a faint glow in the infrared part of the spectrum that appears to come from all directions in space. Its origin in the near-infrared bands, however, has never been satisfactorily ascertained. A population of outflow stars shot out into intergalactic space may contribute to this light.
    More information

    This research was presented in a paper entitled “Star formation in a galactic outflow” by Maiolino et al., to appear in the journal Nature on 27 March 2017 [link is above with image detail].

    The team is composed of R. Maiolino (Cavendish Laboratory; Kavli Institute for Cosmology, University of Cambridge, UK), H.R. Russell (Institute of Astronomy, Cambridge, UK), A.C. Fabian (Institute of Astronomy, Cambridge, UK), S. Carniani (Cavendish Laboratory; Kavli Institute for Cosmology, University of Cambridge, UK), R. Gallagher (Cavendish Laboratory; Kavli Institute for Cosmology, University of Cambridge, UK), S. Cazzoli (Departamento de Astrofisica-Centro de Astrobiología, Madrid, Spain), S. Arribas (Departamento de Astrofisica-Centro de Astrobiología, Madrid, Spain), F. Belfiore (Cavendish Laboratory; Kavli Institute for Cosmology, University of Cambridge, UK), E. Bellocchi (Departamento de Astrofisica-Centro de Astrobiología, Madrid, Spain), L. Colina (Departamento de Astrofisica-Centro de Astrobiología, Madrid, Spain), G. Cresci (Osservatorio Astrofisico di Arcetri, Firenze, Italy), W. Ishibashi (Universität Zürich, Zürich, Switzerland), A. Marconi (Osservatorio Astrofisico di Arcetri, Firenze, Italy), F. Mannucci (Osservatorio Astrofisico di Arcetri, Firenze, Italy), E. Oliva (Osservatorio Astrofisico di Arcetri, Firenze, Italy), and E. Sturm (Max-Planck-Institut für Extraterrestrische Physik, Garching, Germany).

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition
    Visit ESO in Social Media-

    Facebook

    Twitter

    YouTube

    ESO Bloc Icon

    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

    ESO LaSilla
    ESO/Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres

    ESO VLT
    VLT at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level

    ESO Vista Telescope
    ESO/Vista Telescope at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level

    ESO NTT
    ESO/NTT at Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres

    ESO VLT Survey telescope
    VLT Survey Telescope at Cerro Paranal with an elevation of 2,635 metres (8,645 ft) above sea level

    ALMA Array
    ALMA on the Chajnantor plateau at 5,000 metres

    ESO E-ELT
    ESO/E-ELT to be built at Cerro Armazones at 3,060 m

    ESO APEX
    APEX Atacama Pathfinder 5,100 meters above sea level, at the Llano de Chajnantor Observatory in the Atacama desert

     
  • richardmitnick 7:02 am on March 27, 2017

    From UC Riverside: “Researchers Crack Structure of Key Protein in Zika Virus” 

    UC Riverside bloc

    UC Riverside

    March 27, 2017
    Iqbal Pittalwala

    1
    The image shows the crystal structure of ZIKV NS5 protein. The regions with different colors represent individual domains or motifs of ZIKV NS5. The black circle marks the location of the potential inhibitor-binding site. Image credit: Song lab, UC Riverside.

    Zika virus (ZIKV), which causes Zika virus disease, is spread to people primarily through the bite of an infected Aedes aegypti or Aedes albopictus mosquito. An infected pregnant woman can pass ZIKV to her fetus during pregnancy or around the time of birth. Sex is yet another way for infected persons to transmit ZIKV to others.

    The genomic replication of the virus is made possible by its “NS5” protein. This function of ZIKV NS5 is unique to the virus, making it an ideal target for anti-viral drug development. Currently, there is no vaccine or medicine to fight ZIKV infection.

    In a research paper just published in Nature Communications, University of California, Riverside scientists report that they have determined the crystal structure of the entire ZIKV NS5 protein and demonstrated that NS5 is functional when purified in vitro. Knowing the structure of ZIKV NS5 helps the researchers understand how ZIKV replicates itself.

    Furthermore, the researchers’ structural analysis of ZIKV NS5 reveals a potential binding site in the protein for an inhibitor, thereby providing a strong basis for developing potential inhibitors against ZIKV NS5 to suppress ZIKV infection. The identification of the inhibitor-binding site of NS5 can now enable scientists to design potential potent drugs to fight ZIKV.

    “We started this work realizing that the full structure of ZIKV NS5 was missing,” said Jikui Song, an assistant professor of biochemistry, who co-led the research with Rong Hai, an assistant professor of plant pathology and microbiology. “The main challenge for us occurred during the protein’s purification process when ZIKV NS5 got degraded – chopped up – by bacterial enzymes.”

    Song, Hai and their colleagues overcame this challenge by developing an efficient protocol for protein purification, which in essence minimizes the purification time for NS5.

    “Our work provides a framework for future studies of ZIKV NS5 and opportunities for drug development against ZIKV based on its structural similarity to the NS5 protein of other flaviviruses, such as the dengue virus,” Hai said. “No doubt, ZIKV therapeutics can benefit from the wealth of knowledge that has already been generated in the dengue virus field.”

    Next, the researchers plan to test whether a chemical compound that effectively inhibits the NS5 protein of the dengue virus has similar antiviral potential against ZIKV NS5.

    Song and Hai were joined in the research by graduate students Boxiao Wang (first author), Xiao-Feng Tan, Stephanie Thurmond, Zhi-Min Zhang, and Asher Lin.

    The research was supported by grants to Song from the March of Dimes Foundation, the Sidney Kimmel Foundation for Cancer Research and the National Institutes of Health.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC Riverside Campus

    The University of California, Riverside is one of 10 universities within the prestigious University of California system, and the only UC located in Inland Southern California.

    Widely recognized as one of the most ethnically diverse research universities in the nation, UCR’s current enrollment is more than 21,000 students, with a goal of 25,000 students by 2020. The campus is in the midst of a tremendous growth spurt with new and remodeled facilities coming on-line on a regular basis.

    We are located approximately 50 miles east of downtown Los Angeles. UCR is also within easy driving distance of dozens of major cultural and recreational sites, as well as desert, mountain and coastal destinations.

     
  • richardmitnick 6:25 am on March 27, 2017
    Tags: Cancer Biology Reproducibility Project Sees Mixed Results

    From NOVA: “Cancer Biology Reproducibility Project Sees Mixed Results” Read it and Weep 

    PBS NOVA

    NOVA

    18 Jan 2017 [Don’t know how I missed this, or maybe they never put it up in social media before?]
    Courtney Humphries

    How trustworthy are the findings from scientific studies?

    A growing chorus of researchers says there’s a “reproducibility crisis” in science, with too many discoveries published that may be flukes or exaggerations. Now, an ambitious project to test the reproducibility of top studies in cancer research by independent laboratories has published its first five studies in the open-access journal eLife.

    “These are the first public replication studies conducted in biomedical science, and that in itself is a huge achievement,” says Elizabeth Iorns, CEO of Science Exchange and one of the project’s leaders.

    1
    Cancer biology is just one of many fields being scrutinized for the reproducibility of its studies.

    The Reproducibility Project: Cancer Biology is a collaboration between the non-profit Center for Open Science and the for-profit Science Exchange, which runs a network of laboratories for outsourcing biomedical research. It began in 2013 with the goal of repeating experiments from top-cited cancer papers; all of the work has been planned, executed, and published in the open, in consultation with the studies’ original authors. These papers are the first of many underway and slated to be published in the coming months.

    The outcome so far has been mixed, the project leaders say. While some results are similar, none of the studies looks exactly like the original, says Tim Errington, the project’s manager. “They’re all different in some way. They’re all different in different ways.” In some studies, the experimental system didn’t behave the same. In others, the result was slightly different, or it did not hold up under the statistical scrutiny project leaders used to analyze results. All in all, project leaders report, one study failed to reproduce the original finding, two supported key aspects of the original papers, and two were inconclusive because of technical issues.

    Errington says the goal is not to single out any individual study as replicable or not. “Our intent with this project is to perform these direct replications so that we can understand collectively how reproducible our research is,” he says.

    Indeed, there are no agreed-upon criteria for judging whether a replication is successful. At the project’s end, he says, the team will analyze the replication studies collectively by several different standards—including simply asking scientists what they think. “We’re not going to force an agreement—we’re trying to create a discussion,” he says.

    The project has been controversial; some cancer biologists say it’s designed to make them look bad at a time when federal research funding is under threat. Others have praised it for tackling a system that rewards shoddy research. If the first papers are any indication, those arguments won’t be easily settled. So far, the studies provide a window into the challenges of redoing complex laboratory studies. They also underscore that, if cancer biologists want to improve the reproducibility of their research, they will have to agree on a definition of success.

    An Epidemic?

    A recent survey in Nature of more than 1,500 researchers found that 70% have tried and failed to reproduce others’ experiments, and that half have failed to reproduce their own. But you wouldn’t know it by reading published studies. Academic scientists are under pressure to publish new findings, not replicate old research. There’s little funding earmarked toward repeating studies, and journals favor publishing novel discoveries. Science relies on a gradual accumulation of studies that test hypotheses in new ways. If one lab makes a discovery using cell lines, for instance, the same lab or another lab might investigate the phenomenon in mice. In this way, one study extends and builds on what came before.

    For many researchers, that approach—called conceptual replication, which gives supporting evidence for a previous study’s conclusion using another model—is enough. But a growing number of scientists have been advocating for repeating influential studies. Such direct replications, Errington says, “will allow us to understand how reliable each piece of evidence we have is.” Replications could improve the efficiency of future research by winnowing out false hypotheses early and help scientists recreate others’ work in order to build on it.

    In the field of cancer research, some of the pressure to improve reproducibility has come from the pharmaceutical industry, where investing in a spurious hypothesis or therapy can threaten profits. In a 2012 commentary in Nature, cancer scientists Glenn Begley and Lee Ellis wrote that they had tried to reproduce 53 high-profile cancer studies while working at the pharmaceutical company Amgen, and succeeded with just six. A year earlier, scientists at Bayer HealthCare announced that they could replicate only 20–25% of 47 cancer studies. But confidentiality rules prevented both teams from sharing data from those attempts, making it difficult for the larger scientific community to assess their results.

    ‘No Easy Task’

    Enter the Reproducibility Project: Cancer Biology. It was launched with a $1.3 million grant from the Laura and John Arnold Foundation to redo key experiments from 50 landmark cancer papers from 2010 to 2012. The work is carried out in the laboratory network of Science Exchange, a Palo Alto-based startup, and the results tracked and made available through a data-sharing platform developed by the Center for Open Science. Statisticians help design the experiments to yield rigorous results. The protocols of each experiment have been peer-reviewed and published separately as a registered report beforehand, which advocates say prevents scientists from manipulating the experiment or changing their hypothesis midstream.

    The group has made painstaking efforts to redo experiments with the same methods and materials, reaching out to original laboratories for advice, data, and resources. The labs that originally wrote the studies have had to assemble information from years-old research. Studies have been delayed because of legal agreements for transferring materials from one lab to another. Faced with financial and time constraints, the team has scaled back its project; so far 29 studies have been registered, and Errington says the plan is to do as much as they can over the next year and issue a final paper.

    “This is no easy task, and what they’ve done is just wonderful,” says Begley, who is now chief scientific officer at Akriveia Therapeutics and was originally on the advisory board for the project but resigned because of time constraints. His overall impression of the studies is that they largely flunked replication, even though some data from individual experiments matched. He says that for a study to be valuable, the major conclusion should be reproduced, not just one or two components of the study. This would demonstrate that the findings are a good foundation for future work. “It’s adding evidence that there’s a challenge in the scientific community we have to address,” he says.

    Begley has argued that early-stage cancer research in academic labs should follow methods that clinical trials use, like randomizing subjects and blinding investigators as to which ones are getting a treatment or not, using large numbers of test subjects, and testing positive and negative controls. He says that when he read the original papers under consideration for replication, he assumed they would fail because they didn’t follow these methods, even though they are top papers in the field. “This is a systemic problem; it’s not one or two labs that are behaving badly,” he says.

    Details Matter

    For the researchers whose work is being scrutinized, the details of each study matter. Although the project leaders insist they are not designing the project to judge individual findings—that would require devoting more resources to each study—cancer researchers have expressed concern that the project might unfairly cast doubt on their discoveries. The responses of some of those scientists so far raise issues about how replication studies should be carried out and analyzed.

    One study, for instance, replicated a 2010 paper led by Erkki Ruoslahti, a cancer researcher at Sanford Burnham Prebys Medical Discovery Institute in San Diego, which identified a peptide that could stick to and penetrate tumors. Ruoslahti points to a list of subsequent studies by his lab and others that support the finding and suggest that the peptide could help deliver cancer drugs to tumors. But the replication study found that the peptide did not make tumors more permeable to drugs in mice. Ruoslahti says there could be a technical reason for the problem, but the replication team didn’t try to troubleshoot it. He’s now working to finish preclinical studies and secure funding to move the treatment into human trials through a company called Drugcendr. He worries that replication studies that fail without fully exploring why could derail efforts to develop treatments. “This has real implications to what will happen to patients,” he says.

    Atul Butte, a computational biologist at the University of California San Francisco, who led one of the original studies that was reproduced, praises the diligence of the team. “I think what they did is unbelievably disciplined,” he says. But like some other scientists, he’s puzzled by the way the team analyzed results, which can make a finding that subjectively seems correct appear as if it failed. His original study used a data-crunching model to sort through open-access genetic information and identify potential new uses for existing drugs. Their model predicted that the antiulcer medication cimetidine would have an effect against lung cancer, and his team validated the model by testing the drug against lung cancer tumors in mice. The replication found very similar effects. “It’s unbelievable how well it reproduces our study,” Butte says. But the replication team used a statistical technique to analyze the results that found them not statistically significant. Butte says it’s odd that the project went to such trouble to reproduce experiments exactly, only to alter the way the results are interpreted.

    Errington and Iorns acknowledge that such a statistical analysis is not common in biological research, but they say it’s part of the group’s effort to be rigorous. “The way we analyzed the result is correct statistically, and that may be different from what the standards are in the field, but they’re what people should aspire to,” Iorns says.
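
    Much of the disagreement above comes down to what test a replication must pass. As a purely illustrative sketch (made-up numbers and a generic analysis, not the project’s registered protocol), one common framing is to estimate the replication effect with a confidence interval and then ask both whether it is significant on its own and whether it is consistent with the original estimate:

```python
# Hypothetical replication check: simulated tumor-response data, Welch's t-test,
# and a rough 95% confidence interval for the treatment effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Made-up measurements (arbitrary units), for illustration only.
control = rng.normal(loc=1.00, scale=0.35, size=8)   # untreated mice
treated = rng.normal(loc=0.55, scale=0.35, size=8)   # drug-treated mice

# Effect estimate (difference in means) and significance via Welch's t-test.
effect = treated.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)

# Approximate 95% CI for the effect (pooled df kept simple; Welch df would be smaller).
se = np.sqrt(treated.var(ddof=1) / treated.size + control.var(ddof=1) / control.size)
df = treated.size + control.size - 2
half_width = stats.t.ppf(0.975, df) * se
ci = (effect - half_width, effect + half_width)

original_effect = -0.50   # hypothetical value reported by the "original" study

print(f"replication effect: {effect:.2f}, p = {p_value:.3f}, "
      f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
print("significant on its own:", p_value < 0.05)
print("original estimate inside replication CI:", ci[0] <= original_effect <= ci[1])
```

    Depending on which of those two questions is given more weight, the same data can look like a success or a failure, which is essentially the dispute between the project team and authors such as Butte and Mitsiades.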

    In some cases, results were complicated by inconsistent experimental systems. One study tested a type of experimental drug called a BET inhibitor against multiple myeloma in mice. The replication found that the drug improved the survival of diseased mice compared to controls, consistent with the original study. But the disease developed differently in the replication study, and statistical analysis of the tumor growth did not yield a significant finding. Constantine Mitsiades, the study’s lead author and a cancer researcher at the Dana-Farber Cancer Institute, says that despite the statistical analysis, the replication study’s data “are highly supportive of and consistent with our original study and with subsequent studies that also confirmed it.”

    A Fundamental Debate

    These papers will undoubtedly provoke debate about what the standards of replication should be. Mitsiades and other scientists say that complex biological systems like tumors are inherently variable, so it’s not surprising if replication studies don’t exactly match their originals. Inflexible study protocols and rigid statistics may not be appropriate for evaluating such systems—or needed.

    Some scientists doubt the need to perform copycat studies at all. “I think science is self-correcting,” Ruoslahti says. “Yes, there’s some loss of time and money, but that’s just part of the process.” He says that, on the positive side, this project might encourage scientists to be more careful, but he also worries that it might discourage them from publishing new discoveries.

    Though the researchers who led these studies are, not surprisingly, focused on the correctness of the findings, Errington says that the variability of experimental models and protocols is important to document. Advocates for replication say that current published research reflects an edited version of what happened in the lab. That’s why the Reproducibility Project has made a point to publish all of its raw data and include experiments that seemed to go awry, when most researchers would troubleshoot them and try again.

    “The reason to repeat experiments is to get a handle on the intrinsic variability that happens from experiment to experiment,” Begley says. With a better understanding of biology’s true messiness, replication advocates say, scientists might have a clearer sense of whether or not to put credence in a single study. And if more scientists published the full data from every experiment, those original results may look less flashy to begin with, leading fewer labs to chase over-hyped hypotheses and therapies that never pan out. An ultimate goal of the project is to identify factors that make it easier to produce replicable research, like publishing detailed protocols and validating that materials used in a study, such as antibodies, are working properly.


    Access mp4 video here.

    Beyond this project, the scientific community is already taking steps to address reproducibility. Many scientific journals are making stricter requirements for studies and publishing registered reports of studies before they’re carried out. The National Institutes of Health has launched training and funding initiatives to promote robust and reproducible research. F1000Research, an open-access online publisher, launched a Preclinical Reproducibility and Robustness Channel in 2016 for researchers to publish results from replication studies. Last week several scientists published a reproducibility manifesto in the journal Nature Human Behaviour that lays out a broad series of steps to improve the reliability of research findings, from the way studies are planned to the way scientists are trained and promoted.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 8:10 am on March 25, 2017
    Tags: ESA 50   

    From ESA: “ESOC Control Room 1978” 

    ESA Space For Europe Banner

    European Space Agency
    3.24.17

    1
    ESOC Control Room 1978

    In 2017, ESA’s European Space Operations Centre will celebrate its 50th anniversary, having been inaugurated on 8 September 1967 as part of the European Space Research Organisation (ESRO).

    Within months, the centre was controlling its first mission, ESRO-2B, a 74 kg satellite carrying seven instruments to study solar and cosmic radiation and their interaction with Earth and its magnetosphere.

    The 1970s were a busy time at the control centre, with the number of missions steadily increasing.

    The decade saw 10 new missions launched by ESRO, of which eight (HEOS-A2, TD-1A, ESRO-4, COS-B, GEOS-1, Meteosat-1, GEOS-2 and OTS-2) were operated from here.

    In 1975, ESRO joined with the European Launcher Development Organisation (ELDO) to become the European Space Agency.

    Today, teams are operating 18 spacecraft on 11 missions, including Mars Express, Gaia, ExoMars, Sentinel-1 and -2, and CryoSat. Twelve more missions are in preparation for launch in the next few years.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

     
  • richardmitnick 8:01 am on March 25, 2017
    Tags: Birkeland currents, Our atmosphere, U Calgary

    From ESA: “Supersonic Plasma Jets Discovered”

    ESA Space For Europe Banner

    European Space Agency

    23 March 2017
    No writer credit

    Information from ESA’s magnetic field Swarm mission has led to the discovery of supersonic plasma jets high up in our atmosphere that can push temperatures up to almost 10 000°C.

    Presenting these findings at this week’s Swarm Science Meeting in Canada, scientists from the University of Calgary explained how they used measurements from the trio of Swarm satellites to build on what was known about vast sheets of electric current in the upper atmosphere.

    The theory that there are huge electric currents, powered by solar wind and guided through the ionosphere by Earth’s magnetic field, was postulated more than a century ago by Norwegian scientist Kristian Birkeland.

    1
    Birkeland currents
    Released 23/03/2017 10:19 am
    Copyright University of Calgary/ESA
    Description

    ESA’s Swarm has been used to improve our understanding about vast sheets of electric current in the upper atmosphere. Birkeland currents carry up to 1 TW of electric power to the upper atmosphere – about 30 times the energy consumed in New York during a heatwave. They are also responsible for ‘aurora arcs’, the familiar, slow-moving green curtains of light that can extend from horizon to horizon.
    Recent observations by Swarm have revealed that they are associated with large electrical fields and occur where upwards and downwards Birkeland currents connect through the ionosphere. Scientists have also discovered that these strong electric fields drive supersonic plasma jets.

    2
    Upward and downward current sheets
    Released 23/03/2017 10:10 am
    Copyright University of Calgary/ESA
    Description
    Birkeland currents carry up to 1 TW of electric power to the upper atmosphere – about 30 times the energy consumed in New York during a heatwave. They are also responsible for ‘aurora arcs’, the familiar, slow-moving green curtains of light that can extend from horizon to horizon. Recent observations by Swarm have revealed that they are associated with large electrical fields and occur where upwards and downwards Birkeland currents connect through the ionosphere. Scientists have also discovered that these strong electric fields drive supersonic plasma jets.

    It wasn’t until the 1970s, after the advent of satellites, however, that these ‘Birkeland currents’ were confirmed by direct measurements in space. These currents carry up to 1 TW of electric power to the upper atmosphere – about 30 times the energy consumed in New York during a heatwave. They are also responsible for ‘aurora arcs’, the familiar, slow-moving green curtains of light that can extend from horizon to horizon. While much is known about these current systems, recent observations by Swarm have revealed that they are associated with large electrical fields.

    These fields, which are strongest in the winter, occur where upwards and downwards Birkeland currents connect through the ionosphere.

    Bill Archer from the University of Calgary explained, “Using data from the Swarm satellites’ electric field instruments, we discovered that these strong electric fields drive supersonic plasma jets.

    “The jets, which we call ‘Birkeland current boundary flows’, mark distinctly the boundary between current sheets moving in opposite direction and lead to extreme conditions in the upper atmosphere.

    “They can drive the ionosphere to temperatures approaching 10 000°C and change its chemical composition. They also cause the ionosphere to flow upwards to higher altitudes where additional energisation can lead to loss of atmospheric material to space.”

    4
    Magnetic field sources
    Released 31/10/2012 4:35 pm
    Copyright ESA/DTU Space
    The different sources that contribute to the magnetic field measured by Swarm. The coupling currents or field-aligned currents flow along magnetic field lines between the magnetosphere and ionosphere.

    David Knudsen, also from the University of Calgary, added, “These recent findings from Swarm add knowledge of electric potential, and therefore voltage, to our understanding of the Birkeland current circuit, perhaps the most widely recognised organising feature of the coupled magnetosphere–ionosphere system.”

    This discovery is just one of the new findings presented at the week-long science meeting dedicated to the Swarm mission. Among the other Birkeland-current results presented this week, for example, Swarm data were used to confirm that these currents are stronger in the northern hemisphere and vary with the season.

    Since they were launched in 2013, the identical Swarm satellites have been measuring and untangling the different magnetic signals that stem from Earth’s core, mantle, crust, oceans, ionosphere and magnetosphere.

    5
    Front of Swarm satellite
    Released 04/02/2014 5:07 pm
    Copyright ESA/ATG medialab
    Swarm is ESA’s first Earth observation constellation of satellites. The trio of identical satellites are designed to identify and measure precisely the different magnetic signals that make up Earth’s magnetic field. The electrical field instrument, positioned at the front of each satellite, measures plasma density, drift and acceleration in high resolution to characterise the electric field around Earth.

    As well as a package of instruments to do this, each satellite has an electric field instrument positioned at the front to measure plasma density, drift and velocity.

    Rune Floberghagen, ESA’s Swarm mission manager, said, “The electric field instrument is the first ionospheric imager in orbit so it’s very exciting to see such fantastic results that are thanks to this new instrument.

    “The dedication of scientists working with data from the mission never ceases to amaze me and we are seeing some brilliant results, such as this, discussed at this week’s meeting.

    “Swarm is really opening our eyes to the workings of the planet from deep down in Earth’s core to the highest part of our atmosphere.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

     
  • richardmitnick 7:42 am on March 25, 2017

    From Goddard: “OSIRIS-REx asteroid search tests instruments, science team” 

    NASA Goddard Banner
    NASA Goddard Space Flight Center

    March 24, 2017
    Erin Morton
    morton@orex.lpl.arizona.edu
    University of Arizona, Tucson

    Nancy Neal Jones
    nancy.n.jones@nasa.gov
    NASA’s Goddard Space Flight Center, Greenbelt, Md.

    1
    The path of the Main Belt asteroid 12 Victoria, as imaged by NASA’s OSIRIS-REx spacecraft on Feb. 11, 2017, during the mission’s Earth-Trojan Asteroid Search. This animation is made of a series of five images taken by the spacecraft’s MapCam camera that were then cropped and centered on Victoria. The images were taken about 51 minutes apart and each was exposed for 10 seconds. Credits: NASA/Goddard/University of Arizona


    OSIRIS-REx spacecraft

    During an almost two-week search, NASA’s OSIRIS-REx mission team activated the spacecraft’s MapCam imager and scanned part of the surrounding space for elusive Earth-Trojan asteroids — objects that scientists believe may exist in one of the stable regions that co-orbits the sun with Earth. Although no Earth-Trojans were discovered, the spacecraft’s camera operated flawlessly and demonstrated that it could image objects two magnitudes dimmer than originally expected.

    The spacecraft, currently on its outbound journey to the asteroid Bennu, flew through the center of Earth’s fourth Lagrangian area — a stable region 60 degrees in front of Earth in its orbit where scientists believe asteroids may be trapped, such as asteroid 2010 TK7 discovered by NASA’s Wide-field Infrared Survey Explorer (WISE) satellite in 2010. Though no new asteroids were discovered in the region that was scanned, the spacecraft’s cameras MapCam and PolyCam successfully acquired and imaged Jupiter and several of its moons, as well as Main Belt asteroids.

    “The Earth-Trojan Asteroid Search was a significant success for the OSIRIS-REx mission,” said OSIRIS-REx Principal Investigator Dante Lauretta of the University of Arizona, Tucson. “In this first practical exercise of the mission’s science operations, the mission team learned so much about this spacecraft’s capabilities and flight operations that we are now ahead of the game for when we get to Bennu.”

    The Earth Trojan survey was designed primarily as an exercise for the mission team to rehearse the hazard search the spacecraft will perform as it approaches its target asteroid Bennu. This search will allow the mission team to avoid any natural satellites that may exist around the asteroid as the spacecraft prepares to collect a sample to return to Earth in 2023 for scientific study.

    The spacecraft’s MapCam imager, in particular, performed much better than expected during the exercise. Based on the camera’s design specifications, the team anticipated detecting four Main Belt asteroids. In practice, however, the camera was able to detect moving asteroids two magnitudes fainter than expected and imaged a total of 17 Main Belt asteroids. This indicates that the mission will be able to detect possible hazards around Bennu earlier and from a much greater distance than originally planned, further reducing mission risk.
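
    For readers unfamiliar with the astronomical magnitude scale (an aside, not from the release): two extra magnitudes of depth means detecting sources roughly 6.3 times fainter.

```latex
% Magnitude-to-flux relation (standard definition):
\frac{F_2}{F_1} = 10^{-0.4\,(m_2 - m_1)}, \qquad
\Delta m = 2 \;\Rightarrow\; \frac{F_2}{F_1} = 10^{-0.8} \approx \frac{1}{6.3}
```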

    Scientists are still analyzing the implications of the search’s results for the potential population of Earth-Trojan asteroids and will publish conclusions after a thorough study of mission data.

    NASA’s Goddard Space Flight Center in Greenbelt, Maryland, provides overall mission management, systems engineering and the safety and mission assurance for OSIRIS-REx. Dante Lauretta of the University of Arizona, Tucson, is the principal investigator, and the University of Arizona also leads the science team and the mission’s observation planning and processing. Lockheed Martin Space Systems in Denver built the spacecraft and is providing flight operations. Goddard and KinetX Aerospace are responsible for navigating the OSIRIS-REx spacecraft. OSIRIS-REx is the third mission in NASA’s New Frontiers Program. NASA’s Marshall Space Flight Center in Huntsville, Alabama, manages the agency’s New Frontiers Program for its Science Mission Directorate in Washington.

    For more information on OSIRIS-REx, visit:

    http://www.nasa.gov/osirisrex and http://www.asteroidmission.org

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.

    NASA/Goddard Campus
    NASA

     
  • richardmitnick 3:08 pm on March 24, 2017 Permalink | Reply
    Tags: , , , Nov. 2016 Kaikoura earthquake   

    From JPL-Caltech: “Study of Complex 2016 Quake May Alter Hazard Models” 

    NASA JPL Banner

    JPL-Caltech

    March 23, 2017
    Alan Buis
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-0474
    alan.buis@jpl.nasa.gov

    Ian Hamling
    GNS Science, Avalon, New Zealand
    011-04-570-4568

    1
    Two ALOS-2 satellite images show ground displacements from the Nov. 2016 Kaikoura earthquake as colors proportional to the surface motion in two directions. The purple areas in the left image moved up and east 13 feet (4 meters); purple areas in the right image moved north up to 30 feet (9 meters). Credit: NASA/JPL-Caltech/JAXA

    Last November’s magnitude 7.8 Kaikoura earthquake in New Zealand was so complex and unusual that it is likely to change how scientists think about earthquake hazards in plate boundary zones around the world, a new international study finds.

    The study, led by GNS Science, Avalon, New Zealand, with NASA participation, is published this week in the journal Science. The team found that the Nov. 14, 2016, earthquake was the most complex earthquake in modern history. The quake ruptured at least 12 major crustal faults, and there was also evidence of slip along the southern end of the Hikurangi subduction zone plate boundary, which lies about 12 miles (20 kilometers) below the North Canterbury and Marlborough coastlines.

    Lead author and geodesy specialist Ian Hamling of GNS Science says the quake has underlined the importance of re-evaluating how rupture scenarios are defined for seismic hazard models in plate boundary zones worldwide.

    “This complex earthquake defies many conventional assumptions about the degree to which earthquake ruptures are controlled by individual faults, and provides additional motivation to re-think these issues in seismic hazard models,” Hamling says.

    The research team included 29 co-authors from 11 national and international institutes. To conduct the study, they combined multiple datasets, including satellite radar interferometry and GPS data that measure the amount of ground movement associated with the earthquake, along with field observations and coastal uplift data. The team found that parts of New Zealand’s South Island moved more than 16 feet (5 meters) closer to New Zealand’s North Island and were uplifted by as much as 26 feet (8 meters).

    The Kaikoura earthquake rupture began in North Canterbury and propagated northward for more than 106 miles (170 kilometers) along both well-known and previously unknown faults. It straddled two distinct active fault domains, rupturing faults in both the North Canterbury Fault zone and the Marlborough Fault system.

    The largest movement during the earthquake occurred on the Kekerengu fault, where pieces of Earth’s crust were displaced relative to each other by up to 82 feet (25 meters), at a depth of about 9 miles (15 kilometers). Maximum rupture at the surface was measured at 39 feet (12 meters) of horizontal displacement.

    Hamling says there is growing evidence internationally that conventional seismic hazard models are too simple and restrictive. “Even in the New Zealand modeling context, the Kaikoura event would not have been included because so many faults linked up unexpectedly,” he said. “The message from Kaikoura is that earthquake science should be more open to a wider range of possibilities when rupture propagation models are being developed.”

    The scientists analyzed interferometric synthetic aperture radar (InSAR) data from the Copernicus Sentinel-1A and -1B satellites, which are operated by the European Space Agency, along with InSAR data from the Japan Aerospace Exploration Agency’s ALOS-2 satellite. They compared pre- and post-earthquake images of Earth’s surface to measure land movement across large areas and infer movement on faults at depth. The Sentinel and ALOS-2 satellites orbit Earth in near-polar orbits at altitudes of 373 and 434 miles (600 and 700 kilometers), respectively, and image the same point on Earth at repeat intervals ranging from six to 30 days. The Sentinel and ALOS-2 satellites use different wavelengths, which means they pick up different aspects of surface deformation, adding to the precision and completeness of the investigation.
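
    For context on how the radar images translate into motion: an interferogram encodes ground displacement along the satellite’s line of sight as a phase difference between the pre- and post-event acquisitions, with one full fringe (2π of phase) corresponding to half the radar wavelength. The sketch below uses typical C-band and L-band wavelengths for Sentinel-1 and ALOS-2; these values are assumptions rather than figures quoted in the article, and real processing also requires orbital, topographic and atmospheric corrections that are omitted here.

```python
import math

# Typical radar wavelengths (assumed values, not quoted in the article).
WAVELENGTH_M = {
    "Sentinel-1 (C-band)": 0.0555,
    "ALOS-2 (L-band)": 0.229,
}

def los_displacement_m(delta_phase_rad: float, wavelength_m: float) -> float:
    """Line-of-sight displacement implied by an interferometric phase
    change; sign conventions vary between processors."""
    return (wavelength_m / (4.0 * math.pi)) * delta_phase_rad

# One full interferometric fringe per sensor:
for name, lam in WAVELENGTH_M.items():
    cm = los_displacement_m(2.0 * math.pi, lam) * 100.0
    print(f"{name}: one fringe = {cm:.1f} cm of line-of-sight motion")
```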

    In the spirit of international cooperation, both space agencies re-prioritized their satellites immediately after the quake to collect more images of New Zealand, helping with the research and supporting emergency response activities.

    Before the earthquake, coauthors Cunren Liang and Eric Fielding of NASA’s Jet Propulsion Laboratory, Pasadena, California, developed new InSAR data processing techniques to measure the ground deformation in the satellite flight direction using wide-swath images acquired by the ALOS-2 satellite. This is the first time this new approach has been successfully used in earthquake research.

    “We were surprised by the amazing complexity of the faults that ruptured in the Kaikoura earthquake when we processed the satellite radar images,” said Fielding. “Understanding how all these faults moved in one event will improve seismic hazard models.”

    The authors say the Kaikoura earthquake was one of the most recorded large earthquakes anywhere in the world, enabling scientists to undertake analysis in an unprecedented level of detail. This paper is the first in a series of studies to be published on the rich array of data collected from this earthquake.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 2:39 pm on March 24, 2017 Permalink | Reply
    Tags: , , , , , , , ,   

    From WIRED: “Astronomers Don’t Point This Telescope—The Telescope Points Them” 

    Wired logo

    WIRED

    03.23.17
    Sarah Scoles

    1
    U Texas Austin McDonald Observatory Hobby-Eberly Telescope

    The hills of West Texas rise in waves around the Hobby-Eberly Telescope, a powerful instrument encased in a dome that looks like the Epcot ball. Soon, it will become more powerful still: Scientists recently primed the telescope to find evidence of dark energy in the early universe, prying open its eye so it can see and process a wide swath of sky. On April 8, scientists will dedicate the new telescope, capping off the $40 million upgrade and beginning the real work.

    The dark energy experiment, called Hetdex, isn’t how astronomy has traditionally been done. In the classical model, a lone astronomer goes to a mountaintop and solemnly points a telescope at one predetermined object. But Hetdex won’t look for any objects in particular; it will just scan the sky and churn petabytes of the resulting data through a silicon visual cortex. That’s only possible because of today’s steroidal computers, which let scientists analyze, store, and send such massive quantities of data.

    “Dark energy is not only terribly important for astronomy, it’s the central problem for physics. It’s been the bone in our throat for a long time.”

    Steven Weinberg
    Nobel Laureate
    University of Texas at Austin

    The hope is that so-called blind surveys like this one will find stuff astronomers never even knew to look for. In this realm, computers take over curation of the sky, telling astronomers what is interesting and worthy of further study, rather than the other way around. These wide-eyed projects are becoming a standard part of astronomers’ arsenal, and the greatest part about them is that their best discoveries are still totally TBD.

    Big Sky Country

    To understand dark energy (that mysterious stuff that pulls the taffy of spacetime), the Hetdex team needed Hobby-Eberly to study one million galaxies 9-11 billion light-years away as they fly away from Earth. To get that many galaxies in a reasonable amount of time, they broadened the field of view of its 91 tessellated hexagonal mirrors by a factor of 100. They also created an instrument called Virus, with 35,000 optical fibers that send the light from the universe to a spectrograph, which splits it into its constituent wavelengths. All that data can determine both how far away a galaxy is and how fast it’s traveling away from Earth.
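
    To give a sense of how a spectrograph turns light into distance and speed: the redshift of a known emission line (Lyman-alpha for the galaxies Hetdex targets) measures how fast a galaxy is receding, and its distance follows from that. The observed wavelength below is hypothetical, and at the redshifts Hetdex probes a full cosmological calculation, not the low-redshift approximation noted in the comments, is needed to convert redshift into distance.

```python
C_KM_S = 299_792.458           # speed of light in km/s
LYMAN_ALPHA_REST_NM = 121.567  # rest-frame wavelength of the Lyman-alpha line

def redshift(observed_nm: float, rest_nm: float = LYMAN_ALPHA_REST_NM) -> float:
    """Redshift inferred from the shift of a known emission line."""
    return observed_nm / rest_nm - 1.0

# A Lyman-alpha line observed at 425 nm (hypothetical) implies z ~ 2.5,
# roughly the redshift range Hetdex surveys.
z = redshift(425.0)
print(f"z = {z:.2f}")

# For small z, velocity ~ c*z and distance ~ velocity/H0 (Hubble's law);
# at z ~ 2-3.5 a proper cosmological distance calculation must be used instead.
```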

    But when a telescope takes a ton of data down from the sky, scientists can also uncover the unexpected. Hetdex’s astronomers will find more than just the stretch marks of dark energy. They’ll discover things about supermassive black holes, star formation, dark matter, and the ages of stars in nearby galaxies.

    The classical method still has advantages; if you know exactly what you want to look at, you write up a nice proposal to Hubble and explain why a fixed gaze at the Whirlpool Galaxy would yield significant results. “But what you see is what you get,” says astronomer Douglas Hudgins. “This is an object, and the science of that object is what you’re stuck with.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     