Tagged: Applied Research & Technology

  • richardmitnick 1:59 pm on March 17, 2017
    Tags: Applied Research & Technology, Mapping the Topographic Fingerprints of Humanity Across Earth

    From Eos: “Mapping the Topographic Fingerprints of Humanity Across Earth” 

    16 March 2017
    Paolo Tarolli
    Giulia Sofia
    Erle Ellis

    Fig. 1. Three-dimensional view of Bingham Canyon Mine, Utah, a human-made topographic signature, based on a free, open-access high-resolution data set. Credit: Data from Utah AGRC

    Since geologic time began, Earth’s surface has been evolving through natural processes of tectonic uplift, volcanism, erosion, and the movement of sediment. Now a new force of global change is altering Earth’s surface and morphology in unprecedented ways: humanity.

    Human activities are leaving their fingerprints across Earth (Figure 1), driven by increasing populations, technological capacities, and societal demands [e.g., Ellis, 2015; Brown et al., 2017; Waters et al., 2016]. We have altered flood patterns, created barriers to runoff and erosion, funneled sedimentation into specific areas, flattened mountains, piled hills, dredged land from the sea, and even triggered seismic activity [Tarolli and Sofia, 2016]. These and other changes can pose broad threats to the sustainability of human societies and environments.

    If increasingly globalized societies are to make better land management decisions, the geosciences must globally evaluate how humans are reshaping Earth’s surface. A comprehensive mapping of human topographic signatures on a planet-wide scale is required if we are to understand, model, and forecast the geological hazards of the future.

    Understanding and addressing the causes and consequences of anthropogenic landform modifications are a worldwide challenge. But this challenge also poses an opportunity to better manage environmental resources and protect environmental values [DeFries et al., 2012].

    The Challenge of Three Dimensions

    “If life happens in three dimensions, why doesn’t science?” This question, posed more than a decade ago in Nature [Butler, 2006], resonates when assessing human reshaping of Earth’s landscapes.

    Landforms are shaped in three dimensions by natural processes and societal demands [e.g., Sidle and Ziegler, 2012; Guthrie, 2015]; societies in turn are shaped by the landscapes they alter. Understanding and modeling these interacting forces across Earth are no small challenge.

    For example, observing and modeling the direct effects of some of the most widespread forms of human topographic modification, such as soil tillage and terracing [Tarolli et al., 2014], are possible only with very fine spatial resolutions (i.e., ≤1 meter). Yet these features are common all over the world. High-resolution three-dimensional topographic data at global scales are needed to observe and appraise them.

    The Need for a Unified, Global Topographic Data Set

    High-resolution terrain data such as lidar [Tarolli, 2014], aerial photogrammetry [Eltner et al., 2016], and satellite observations [Famiglietti et al., 2015] are increasingly available to the scientific community. These data sets are also becoming available to land planners and the public, as governments, academic institutions, and others in the remote sensing community seize the opportunity for high-resolution topographic data sharing (Figure 2) [Wulder and Coops, 2014; Verburg et al., 2015].

    Fig. 2. High-resolution geodata reveal the topographic fingerprints of humanity: (a) terraces in the Philippines, (b) agricultural practices in Germany, and (c) roads in Antarctica. The bottom images are lidar images of the same landscapes. Credit: Data from University of the Philippines TCAGP/Freie und Hansestadt Hamburg/Noh and Howat [2015]. Top row: © Google, DigitalGlobe

    Thanks to these geodata, anthropogenic signatures are widely observable across the globe, under vegetation cover (Figure 2a), at very fine spatial scales (e.g., agricultural practices and plowing; Figure 2b) and at large spatial scales (e.g., major open pit mines; Figure 3), and far from contemporary human settlements (Figure 2c). Assessing the global topographic fingerprints of humanity with high-resolution terrain data is thus a tantalizing prospect.

    However, despite a growing number of local projects at fine scales, a global data set remains elusive. This lack of global data is largely the result of technical challenges to sharing very large data sets and issues of data ownership and permissions.

    But once a global database exists, advances in the technical capacity to handle and analyze large data sets could be utilized to map anthropogenic signatures in detail (e.g., using a close-range terrestrial laser scanner) and across larger areas (e.g., using satellite data). Together with geomorphic analyses, the potential is clear for an innovative, transformative, and global-scale assessment of the extent to which humans shape Earth’s landscapes.

    For example, a fine-scale analysis of terrain data can detect specific anthropogenic configurations in the organization of surface features (Figure 3b) [Sofia et al., 2014], revealing modifications that humans make across landscapes (Figure 3c). Such fine-scale geomorphic changes are generally invisible to coarser scales of observation and analysis, making it appear that natural landforms and natural hydrological and sedimentary processes are unaltered. Failure to observe such changes misrepresents the true extent and form of human modifications of terrain, with huge consequences when inaccurate data are used to assess risks from runoff, landslides, and other geologic hazards to society [Tarolli, 2014].

    Fig. 3. Potential detection of anthropogenic topographic signatures derived from satellite data. (a) This satellite image shows an open-pit mine in North Korea. (b) The image has been processed with an autocorrelation analysis, a measure of the organization of the topography (slope local length of autocorrelation, SLLAC [Sofia et al., 2014]). Variation in the natural landscape is noisy (e.g., top right corner), whereas anthropogenic structures are more organized and leave a clear topographic signature. (c) The degree of landscape organization can be empirically related to the amount of human-made alteration of the terrain, as demonstrated by Sofia et al. [2014]. Credit: Data from CNES© Distribution Airbus DS
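
    To get a feel for how such an organization metric works, here is a minimal Python sketch in the spirit of SLLAC: it derives a slope map from a digital elevation model (DEM) and measures how far the slope field stays autocorrelated. Organized, human-made surfaces such as terraces keep slope correlated over longer distances than noisy natural terrain. The toy terraced DEM and the 1/e threshold are invented for illustration; this is not the published implementation of Sofia et al. [2014].

    ```python
    # Sketch of a slope-autocorrelation measure in the spirit of SLLAC
    # (Sofia et al., 2014). Illustrative only -- not the authors' code.
    import numpy as np

    def slope(dem, cell=1.0):
        """Slope magnitude of a DEM (2D array) from finite differences."""
        gy, gx = np.gradient(dem, cell)
        return np.hypot(gx, gy)

    def autocorr_length(field, threshold=np.exp(-1)):
        """Radius (in cells) at which the radially averaged autocorrelation
        of `field` first drops below `threshold`."""
        f = field - field.mean()
        power = np.abs(np.fft.fft2(f)) ** 2               # Wiener-Khinchin:
        ac = np.fft.fftshift(np.fft.ifft2(power).real)    # FFT -> autocorrelation
        ac /= ac.max()
        cy, cx = np.array(ac.shape) // 2
        yy, xx = np.indices(ac.shape)
        r = np.hypot(yy - cy, xx - cx).astype(int)
        sums = np.bincount(r.ravel(), weights=ac.ravel())
        counts = np.bincount(r.ravel())
        radial = sums[:min(cy, cx)] / counts[:min(cy, cx)]
        below = np.where(radial < threshold)[0]
        return below[0] if below.size else len(radial)

    # Toy terraced hillside: flat steps every 16 cells plus a little noise.
    y = np.arange(128, dtype=float)[:, None]
    dem = (y // 16) + 0.01 * np.random.rand(128, 128)
    print(autocorr_length(slope(dem)))   # larger for organized surfaces
    ```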

    Topography for Society

    A global map of the topographic signatures of humanity would create an unparalleled opportunity to change both scientific and public perspectives on the human role in reshaping Earth’s land surface. A worldwide inventory of anthropogenic geomorphologies would enable geoscientists to assess the extent to which human societies have reshaped geomorphic processes globally and provide a tool for monitoring these changes over time.

    Such monitoring would facilitate unprecedented insights into the dynamics and sensitivity of landscapes and their responses to human forcings at global scale. In turn, these insights would help cities, resource managers, and the public better understand and mediate their social and environmental actions.

    As we move deeper into the Anthropocene, a comprehensive mapping of human topographic signatures will be increasingly necessary to understand, model, and forecast the geological hazards of the future. These hazards will likely be manifold.

    Fig. 4. (a) This road, in the HJ Andrews Experimental Forest in Oregon’s Cascade Range, was constructed in 1952. A landslide occurred in 1964, and its scar was still visible in 1994, when the image was acquired. The landslide starts from the road and flows toward the top right corner of the image. (b) An index called the relative path impact index (RPII) [Tarolli et al., 2013] is evaluated here using a lidar data set from 2008. The RPII analyzes the potential water surface flow accumulation based on the lidar digital terrain model, and the index is highest where the flows are increased because of the presence of anthropogenic features. High values beyond one standard deviation (σ) highlight potential road-induced erosion. Credit: Data from NSF LTER, USFS Research, OSU; background image © Google, USGS.
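
    The caption's index can be sketched compactly. The toy below follows the published idea behind the RPII: compute flow accumulation on a DTM containing a road-like cut, compute it again on a smoothed surrogate in which the feature has been removed, take the log of the ratio, and flag cells beyond one standard deviation. The surfaces, smoothing window, and threshold here are invented, and the index as defined by Tarolli et al. [2013] differs in detail.

    ```python
    # Toy version of the idea behind the relative path impact index (RPII,
    # Tarolli et al., 2013): compare flow accumulation on a DTM containing
    # a road-like cut with accumulation on a smoothed, feature-free
    # surrogate, and flag cells beyond one standard deviation.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def d8_accumulation(dem):
        """Tiny D8 flow accumulation: visit cells from high to low and pass
        each cell's accumulated area to its steepest downslope neighbor."""
        ny, nx = dem.shape
        acc = np.ones_like(dem)
        for idx in np.argsort(dem, axis=None)[::-1]:        # highest first
            y, x = divmod(int(idx), nx)
            best, target = 0.0, None
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yn, xn = y + dy, x + dx
                    if (dy or dx) and 0 <= yn < ny and 0 <= xn < nx:
                        drop = (dem[y, x] - dem[yn, xn]) / np.hypot(dy, dx)
                        if drop > best:
                            best, target = drop, (yn, xn)
            if target is not None:
                acc[target] += acc[y, x]
        return acc

    # Toy hillslope tilted mainly north-south, with a "road" cut across it.
    dem = np.add.outer(np.linspace(10, 0, 64), np.linspace(1, 0, 64))
    dem[30, :] -= 0.5                                       # road intercepts flow
    smooth = uniform_filter(dem, size=9)                    # surrogate without road
    rpii = np.log(d8_accumulation(dem) / d8_accumulation(smooth))
    flagged = rpii > rpii.mean() + rpii.std()               # beyond 1 sigma
    print(np.argwhere(flagged)[:5])                         # cells along the road
    ```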

    For example, landscapes across the world face altered flooding regimes in densely populated floodplains, erosion rates associated with road networks, altered runoff and erosion due to agricultural practices, and sediment release and seismic activity from mining [Tarolli and Sofia, 2016]. Modifications in land use (e.g., urbanization and changes in agricultural practices) alter water infiltration and runoff production, increasing flooding risks in floodplains. Increases in road density cause land degradation and erosion (Figure 4), especially when roads are poorly planned and constructed without well-designed drainage systems, leading to destabilized hillslopes and landslides. Erosion from agricultural fields can exceed rates of soil production, causing soil degradation and reducing crop yields, water quality, and food production. Mining areas, even years after reclamation, can induce seismicity, landslides, soil erosion, and terrain collapse, damaging environments and surface structures.

    Without accurate data on anthropogenic topography, communities will find it difficult to develop and implement strategies and practices aimed at reducing or mitigating the social and environmental impacts of anthropogenic geomorphic change.

    Earth Science Community’s Perspective Needed

    Technological advances in Earth observation have made possible what might have been inconceivable just a few years ago. A global map and inventory of human topographic signatures in three dimensions at high spatial resolution can now become a reality.

    Collecting and broadening access to high spatial resolution (meter to submeter scale), Earth science–oriented topography data acquired with lidar and other technologies would promote scientific discovery while fostering international interactions and knowledge exchange across the Earth science community. At the same time, enlarging the search for humanity’s topographical fingerprints to the full spectrum of environmental and cultural settings across Earth’s surface will require a more generalized methodology for discovering and assessing these signatures.

    These two parallel needs are where scientific efforts should focus. It is time for the Earth science community to come together and bring the topographic fingerprints of humanity to the eyes and minds of the current and future stewards, shapers, curators, and managers of Earth’s land surface.
    Acknowledgments

    Data sets for Figure 1 are from Utah Automated Geographic Reference Center (AGRC), Geospatial Information Office. Data sets for Figures 2(a)–2(c) are from the University of the Philippines Training Center for Applied Geodesy and Photogrammetry (TCAGP), Noh and Howat [2015], and Freie und Hansestadt Hamburg (from 2014), respectively. Data sets for Figure 3 are from Centre National d’Études Spatiales (CNES©), France, Distribution Airbus DS. Data sets for Figure 4 are from the HJ Andrews Experimental Forest research program, National Science Foundation’s Long-Term Ecological Research Program (NSF LTER, DEB 08-23380), U.S. Forest Service (USFS) Pacific Northwest Research Station, and Oregon State University (OSU).
    References

    Brown, A. G., et al. (2017), The geomorphology of the Anthropocene: Emergence, status and implications, Earth Surf. Processes Landforms, 42, 71–90, https://doi.org/10.1002/esp.3943.

    Butler, D. (2006), Virtual globes: The web-wide world, Nature, 439, 776–778, https://doi.org/10.1038/439776a.

    DeFries, R. S., et al. (2012), Planetary opportunities: A social contract for global change science to contribute to a sustainable future, BioScience, 62, 603–606, https://doi.org/10.1525/bio.2012.62.6.11.

    Ellis, E. C. (2015), Ecology in an anthropogenic biosphere, Ecol. Monogr., 85, 287–331, https://doi.org/10.1890/14-2274.1.

    Eltner, A., et al. (2016), Image-based surface reconstruction in geomorphometry—Merits, limits and developments, Earth Surf. Dyn., 4, 359–389, https://doi.org/10.5194/esurf-4-359-2016.

    Famiglietti, J. S., et al. (2015), Satellites provide the big picture, Science, 349, 684–685, https://doi.org/10.1126/science.aac9238.

    Guthrie, R. (2015), The catastrophic nature of humans, Nat. Geosci., 8, 421–422, https://doi.org/10.1038/ngeo2455.

    Noh, M. J., and I. M. Howat (2015), Automated stereo-photogrammetric DEM generation at high latitudes: Surface Extraction with TIN-based Search-space Minimization (SETSM) validation and demonstration over glaciated regions, GIScience Remote Sens., 52(2), 198–217, https://doi.org/10.1080/15481603.2015.1008621.

    Sidle, R. C., and A. D. Ziegler (2012), The dilemma of mountain roads, Nat. Geosci., 5, 437–438, https://doi.org/10.1038/ngeo1512.

    Sofia, G., F. Marinello, and P. Tarolli (2014), A new landscape metric for the identification of terraced sites: The slope local length of auto-correlation (SLLAC), ISPRS J. Photogramm. Remote Sens., 96, 123–133, https://doi.org/10.1016/j.isprsjprs.2014.06.018.

    Tarolli, P. (2014), High-resolution topography for understanding Earth surface processes: Opportunities and challenges, Geomorphology, 216, 295–312, https://doi.org/10.1016/j.geomorph.2014.03.008.

    Tarolli, P., and G. Sofia (2016), Human topographic signatures and derived geomorphic processes across landscapes, Geomorphology, 255, 140–161, https://doi.org/10.1016/j.geomorph.2015.12.007.

    Tarolli, P., et al. (2013), Recognition of surface flow processes influenced by roads and trails in mountain areas using high-resolution topography, Eur. J. Remote Sens., 46, 176–197.

    Tarolli, P., F. Preti, and N. Romano (2014), Terraced landscapes: From an old best practice to a potential hazard for soil degradation due to land abandonment, Anthropocene, 6, 10–25, https://doi.org/10.1016/j.ancene.2014.03.002.

    Verburg, P. H., et al. (2015), Land system science and sustainable development of the Earth system: A global land project perspective, Anthropocene, 12, 29–41, https://doi.org/10.1016/j.ancene.2015.09.004.

    Waters, C. N., et al. (2016), The Anthropocene is functionally and stratigraphically distinct from the Holocene, Science, 351, aad2622, https://doi.org/10.1126/science.aad2622.

    Wulder, M. A., and N. C. Coops (2014), Satellites: Make Earth observations open access, Nature, 513, 30–31, https://doi.org/10.1038/513030a.

    —Paolo Tarolli (email: paolo.tarolli@unipd.it; @TarolliP) and Giulia Sofia (@jubermensch2), Department of Land, Environment, Agriculture, and Forestry, University of Padova, Legnaro, Italy; and Erle Ellis (@erleellis), Department of Geography and Environmental Systems, University of Maryland, Baltimore County, Baltimore
    Citation: Tarolli, P., G. Sofia, and E. Ellis (2017), Mapping the topographic fingerprints of humanity across Earth, Eos, 98, https://doi.org/10.1029/2017EO069637. Published on 16 March 2017.
    © 2017. The authors. CC BY-NC-ND 3.0

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 11:17 am on March 14, 2017
    Tags: Applied Research & Technology, Deccan Traps, Earth’s lost history of planet-altering eruptions revealed, Enormous volcanoes vomited lava over the ancient Earth, Venus Mars Mercury and the Moon all show signs of enormous eruptions

    From Nature: “Earth’s lost history of planet-altering eruptions revealed” 

    14 March 2017
    Alexandra Witze

    India’s Western Ghats mountains contain igneous rock deposited 66 million years ago by a volcanic eruption in the Deccan Traps. Credit: Dinodia Photos/Getty

    Enormous volcanoes vomited lava over the ancient Earth much more often than geologists had suspected. Eruptions as big as the biggest previously known ones happened at least 10 times in the past 3 billion years, an analysis of the geological record shows.

    Such eruptions are linked with some of the most profound changes in Earth’s history. These include the biggest mass extinction, which happened 252 million years ago when volcanoes blanketed Siberia with molten rock and poisonous gases.

    “As we go back in time, we’re discovering events that are every bit as big,” says Richard Ernst, a geologist at Carleton University in Ottawa, Canada, and Tomsk State University in Russia, who led the work. “These are magnificent huge things.”

    Knowing when and where such eruptions occurred can help geologists to pinpoint ore deposits, reconstruct past supercontinents and understand the birth of planetary crust. Studying this type of volcanic activity on other planets can even reveal clues to the geological history of the early Earth.

    Ernst presented the findings this month to an industry consortium that funded the work (see ‘Earth’s biggest eruptions’). He expects to make the data public by the end of the year, through a map from the Commission for the Geological Map of the World in Paris.

    “This will probably be the defining database for the next decade,” says Mike Coffin, a marine geophysicist at the University of Tasmania in Hobart, Australia.

    Surprisingly, the ancient eruptions lurk almost in plain sight. The lava they spewed has long since eroded away, but the underlying plumbing that funnelled molten rock from deep in the Earth up through the volcanoes is still there.

    Telltale tips

    Ernst and his colleagues scoured the globe for traces of this plumbing. It usually appears as radial spokes of ancient squirts of lava, fanned out around the throat of a long-gone volcano. The geologists mapped these features, known as dyke swarms, and used uranium–lead dating to pinpoint the age of the rock in each dyke. By matching the ages of the dykes, the researchers could connect those that came from a single huge eruption. During their survey, they found evidence of many of these major volcanic events.
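
    The age-matching step can be pictured as clustering U–Pb dates: dykes whose ages agree within tolerance are grouped as one eruptive event. A hedged sketch, with invented dyke names, ages, and tolerance:

    ```python
    # Sketch of grouping dykes by U-Pb age so that dykes from one eruptive
    # event cluster together. All names, ages, and the 2 Ma tolerance are
    # illustrative, not real measurements.
    dykes = [
        ("Australia-A", 1320.4, 1.2),    # (name, age in Ma, 2-sigma in Ma)
        ("NorthChina-B", 1321.1, 0.9),
        ("Siberia-C", 2216.0, 3.0),
    ]

    def group_by_age(dykes, tol=2.0):
        """Single-linkage clustering: chain together ages within `tol` Ma."""
        events, current = [], []
        for name, age, err in sorted(dykes, key=lambda d: d[1]):
            if current and age - current[-1][1] > tol:
                events.append(current)
                current = []
            current.append((name, age, err))
        if current:
            events.append(current)
        return events

    for event in group_by_age(dykes):
        print([name for name, age, err in event])
    # -> ['Australia-A', 'NorthChina-B'] then ['Siberia-C']
    ```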

    Each of those newly identified eruptions goes into Ernst’s database. “We’ve got about 10 or 15 so far that are probably comparable to the Siberian event,” Ernst says, “that we either didn’t know about or had a little taste, but no idea of their true extent.”

    They include a 1.32-billion-year-old eruption in Australia that connects to one in northern China. By linking dyke swarms across continents, scientists can better understand how Earth’s crust has shuffled around over time, says Nasrrddine Youbi, a geologist at Cadi Ayyad University in Marrakesh.

    Technically, the eruptions are known as ‘large igneous provinces’ (LIPs). They can spew more than one million cubic kilometres of rock in a few million years. By comparison, the 1980 eruption of Mount St Helens in Washington state put out just 10 cubic kilometres.

    These large events also emit gases that can change atmospheric temperature and ocean chemistry in a geological blink of an eye. A modelling study published last month suggests that global temperatures could have soared by as much as 7 °C per year at the height of the Siberian eruptions (F. Stordal et al. Palaeogeogr. Palaeoclimatol. Palaeoecol. 471, 96–107; 2017). Sulfur particles from the eruptions would have soon led to global cooling and acid rain; more than 96% of marine species went extinct.

    But the picture of how LIPs affected the global environment gets murkier the further back in time you get, says Morgan Jones, a volcanologist at the University of Oslo. Uncertainties in dating grow, and it becomes hard to correlate individual eruptions with specific environmental impacts. “It’s at the limit of our understanding,” he says.

    On average, LIPs occur every 20 million years or so. The most recent one was the Columbia River eruption 17 million years ago, in what is now the northwestern United States.

    Discovering more LIPs on Earth helps to put the geological history of neighbouring planets in perspective, says Tracy Gregg, a volcanologist at the University at Buffalo in New York. She and Ernst will lead a meeting on LIPs across the Solar System at a planetary-science meeting in Texas next week.

    Venus, Mars, Mercury and the Moon all show signs of enormous eruptions, Gregg notes. On the Moon, LIP-style volcanism started as early as 3.8 billion years ago; on Mars, possibly 3.5 billion years ago. But without plate tectonics to keep the surface active, those eruptions eventually ceased.

    “Other planetary bodies retain information about the earliest parts of planetary evolution, information that we’ve lost on Earth,” Gregg says. “They can give us a window into the early history of our own planet.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 10:41 am on March 14, 2017
    Tags: Applied Research & Technology, dark web, Wrangling crime in the deep

    From Science Node: “Wrangling crime in the deep, dark web” 

    06 Mar, 2017
    Jorge Salazar

    Much of the internet hides like an iceberg below the surface.

    This so-called ‘deep web’ is estimated to be 500 times bigger than the ‘surface web’ seen through search engines like Google. For scientists and others, the deep web holds important computer code and licensing agreements.

    Nestled further inside the deep web, one finds the ‘dark web,’ a place where images and video are used by traders in illicit drugs, weapons, and human lives.

    “Behind forms and logins, there are bad things,” says Chris Mattmann, chief architect in the instrument and science data systems section of the NASA Jet Propulsion Laboratory (JPL) at the California Institute of Technology.

    “Behind the dynamic portions of the web, people are doing nefarious things, and on the dark web, they’re doing even more nefarious things. They traffic in guns and human organs. They’re doing these activities and then they’re tying them back to terrorism.”

    In 2014, the Defense Advanced Research Projects Agency (DARPA) started a program called Memex to make the deep web accessible. “The goal of Memex was to provide search engines the retrieval capacity to deal with those situations and to help defense and law enforcement go after the bad guys on the deep web,” Mattmann says.

    At the same time, the US National Science Foundation (NSF) invested $11.2 million in a first-of-its-kind data-intensive supercomputer – the Wrangler supercomputer, now housed at the Texas Advanced Computing Center (TACC). The NSF asked engineers and computer scientists at TACC, Indiana University, and the University of Chicago if a computer could be built to handle massive amounts of input and output.


    TACC Wrangler

    Wrangler does just that, enabling the speedy file transfers needed to fly past big data bottlenecks that can slow down even the fastest computers. It was built to work in tandem with number crunchers such as TACC’s Stampede, which in 2013 was the sixth fastest computer in the world.


    Dell PowerEdge Stampede supercomputer at the Texas Advanced Computing Center, University of Texas at Austin (9.6 petaflops)

    “Although we have a lot of search-based queries through different search engines like Google, it’s still a challenge to query the system in a way that answers your questions directly,” says Karanjeet Singh.

    Singh is a University of Southern California graduate student who works with Chris Mattmann on Memex and other projects.

    “The objective is to get more and more domain-specific information from the internet and to associate facts from that information.”

    Once the Memex user extracts the information they need, they can apply tools such as named entity recognition, sentiment analysis, and topic summarization. This can help law enforcement agencies find links between different activities, such as illegal weapon sales and human trafficking.

    The problem is that even the fastest computers like Stampede weren’t designed to handle the input and output of the millions of files needed for the Memex project.

    “Let’s say that we have one system directly in front of us, and there is some crime going on,” Singh says. “What the JPL is trying to do is automate a lot of domain-specific query processes into a system where you can just feed in the questions and receive the answers.”

    For that, he works with an open source web crawler called Apache Nutch. It retrieves and collects web page and domain information of the deep web. The MapReduce framework powers those crawls with a divide-and-conquer approach to big data that breaks it up into small pieces that run simultaneously.
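
    To see the divide-and-conquer idea in miniature, here is a toy map-reduce over a handful of crawled URLs, counting pages per domain in parallel. It mimics the MapReduce pattern only; Apache Nutch's internals are far more elaborate, and the URLs are fabricated.

    ```python
    # Toy map-reduce: "map" each crawled URL to its domain in parallel,
    # then "reduce" by summing per-domain counts.
    from collections import defaultdict
    from multiprocessing import Pool
    from urllib.parse import urlparse

    def map_phase(url):
        """Map: emit a (domain, 1) pair for one crawled URL."""
        return (urlparse(url).netloc, 1)

    def reduce_phase(pairs):
        """Reduce: sum the counts for each domain."""
        totals = defaultdict(int)
        for domain, n in pairs:
            totals[domain] += n
        return dict(totals)

    if __name__ == "__main__":
        crawled = [
            "http://example.onion/market/page1",
            "http://example.onion/market/page2",
            "http://forum.example/thread/42",
        ]
        with Pool(2) as pool:          # small pieces run simultaneously
            pairs = pool.map(map_phase, crawled)
        print(reduce_phase(pairs))     # {'example.onion': 2, 'forum.example': 1}
    ```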

    Wrangler avoids data overload by virtue of its 600 terabytes of speedy flash storage. What’s more, Wrangler supports the Hadoop framework, which runs using MapReduce.

    Together, Wrangler and Memex constitute a powerful crime-fighting duo. NSF investment in advanced computation has placed powerful tools in the hands of public defense agencies, moving law enforcement beyond the limitations of commercial search engines.

    “Wrangler is a fantastic tool that we didn’t have before as a mechanism to do research,” says Mattmann. “It has been an amazing resource that has allowed us to develop techniques that are helping save people, stop crime, and stop terrorism around the world.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 10:02 am on March 13, 2017
    Tags: Applied Research & Technology, Stromboli volcano

    From Eos: “Tracking Volcanic Bombs in Three Dimensions” 

    13 March 2017
    Leah Crane

    Off the north coast of Sicily, an eruption of the Stromboli volcano sends decametric lava fragments flying into the air. A new method allows researchers to track these “bombs” and to reconstruct their flight trajectories in three dimensions. Credit: Florian Becker/Vulkankultour

    In explosive volcanic eruptions, bits of fragmented magma can be shot through the air by the release and expansion of pressurized gas. The trajectory map of these centimetric to decametric fragments, called “bombs,” is an important parameter in the study of explosive eruptions and the dangers that they present: Understanding how fast the debris is moving, how far it moves, and in which direction pieces travel could help scientists assess the hazards of volcanic eruptions or man-made explosions. In a new paper, Gaudin et al. present a method for studying the motion of volcanic bombs in three dimensions, allowing for more precise trajectory reconstructions.

    Several conditions make observing active volcanic vents and bombs difficult, including the obvious challenge of getting cameras close to the vents. The most significant problem is the sheer number of bombs produced by each explosive event, which may change shape in flight and whose flight paths overlap with one another.

    When observing a bubble bursting in Halema‘uma‘u lava lake in Hawaii, researchers manually tracked selected pieces of debris on stills of a video. These two images of the resulting set of trajectories could then be combined to produce a three-dimensional map. Credit: Gaudin et al. [2016]

    These limitations make any automatic tracking difficult or impossible, so the scientists simplified their procedure by relying on manual tracking of a few representative bombs rather than a computerized account of every single one. By placing two or more high-speed video cameras at well-documented positions around the volcanic vent, they were able to manually determine an object’s location in all of the images, computing the object’s position in three dimensions.
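
    The geometric core of that step is classical two-view triangulation. Below is a minimal sketch using the linear (DLT) method, assuming known projection matrices for two toy cameras; the authors' actual calibration and manual-tracking pipeline is considerably more involved.

    ```python
    # Minimal two-view triangulation via the linear (DLT) method: recover
    # a bomb's 3-D position from its pixel coordinates in two calibrated
    # cameras. The projection matrices below are toy values.
    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        """Least-squares 3-D point from two 3x4 projection matrices and the
        (u, v) image coordinates of the same object in each view."""
        rows = []
        for P, (u, v) in ((P1, uv1), (P2, uv2)):
            rows.append(u * P[2] - P[0])
            rows.append(v * P[2] - P[1])
        _, _, vt = np.linalg.svd(np.array(rows))
        X = vt[-1]                        # null vector of the 4x4 system
        return X[:3] / X[3]               # dehomogenize

    # Two toy cameras watching the vent, 5 m apart.
    K = np.array([[1000.0, 0, 320], [0, 1000, 240], [0, 0, 1]])
    P1 = K @ np.hstack([np.eye(3), [[0.0], [0.0], [0.0]]])
    P2 = K @ np.hstack([np.eye(3), [[-5.0], [0.0], [0.0]]])

    X_true = np.array([1.0, 2.0, 30.0, 1.0])    # a fragment 30 m away
    uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
    uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
    print(triangulate(P1, P2, uv1, uv2))        # ~ [ 1.  2. 30.]
    ```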

    The human component of this manual process can be a major source of error since the person tracking the bombs makes a series of subjective choices, like deciding where exactly on the object to select as a representative point in each frame. If the cameras are tilted at all, that can also be a significant component of uncertainty in the measurements.

    In the new study, the team was able to reduce uncertainty to 10° in angle and 20% in speed of the bombs. They used three events as examples: a bursting bubble at the Halema‘uma‘u lava lake in Hawaii, an in-flight bomb collision, and an explosive ejection event at Stromboli volcano in Italy. A video showing the bursting bubble followed by the explosive ejection, with the model in action, accompanies the full article.


    In Stromboli’s case, the reliability of the trajectory reconstruction was demonstrated by comparing the 3-D reconstruction with footage from the low-speed, low-resolution cameras of the Stromboli permanent monitoring network. These case studies demonstrated just a few of the numerous contexts in which this 3-D tracking method could be useful, both within and beyond the study of volcanic vents and magma. (Geochemistry, Geophysics, Geosystems, https://doi.org/10.1002/2016GC006560, 2016)

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:10 am on March 13, 2017
    Tags: Applied Research & Technology, Stanford engineers use soup additive to create a stretchable plastic electrode

    From Stanford: “So long stiffness: Stanford engineers use soup additive to create a stretchable plastic electrode” 

    March 10, 2017
    Shara Tonn

    Courtesy Bao Research Group
    Access the mp4 video here.
    A robotic test instrument stretches a nearly transparent, flexible electrode over a curved surface; the electrode is based on a special plastic developed in the lab of Stanford chemical engineer Zhenan Bao.

    Conventional electronics are stiff and brittle. Chemical engineer Zhenan Bao is trying to change that. For more than a decade, her lab has been working to make electronics soft and flexible so that they feel and operate almost like a second skin. Along the way, the team has started to focus on making brittle plastics that can conduct electricity more elastic.

    Now in Science Advances, Bao’s team describes how they took one such brittle plastic and modified it chemically to make it as bendable as a rubber band, while slightly enhancing its electrical conductivity. The result is a soft, flexible electrode that is compatible with our supple and sensitive nerves.

    “This flexible electrode opens up many new, exciting possibilities down the road for brain interfaces and other implantable electronics,” said Bao, a professor of chemical engineering. “Here, we have a new material with uncompromised electrical performance and high stretchability.”

    The material is still a laboratory prototype, but the team hopes to develop it as part of their long-term focus on creating flexible materials that interface with the human body.

    A printed electrode pattern of the new polymer being stretched to several times its original length (top), and a transparent, highly stretchy “electronic skin” patch forming an intimate interface with human skin to potentially measure various biomarkers (bottom). (Image credit: Bao Lab)

    Flexible interface

    Electrodes are fundamental to electronics. Conducting electricity, these wires carry back and forth signals that allow different components in a device to work together. In our brains, special thread-like fibers called axons play a similar role, transmitting electric impulses between neurons. Bao’s stretchable plastic is designed to make a more seamless connection between the stiff world of electronics and the flexible organic electrodes in our bodies.

    “One thing about the human brain that a lot of people don’t know is that it changes volume throughout the day,” says postdoctoral research fellow Yue Wang, the first author on the paper. “It swells and deswells.” The current generation of electronic implants can’t stretch and contract with the brain, which makes it complicated to maintain a good connection.

    “If we have an electrode with a similar softness as the brain, it will form a better interface,” said Wang.

    To create this flexible electrode, the researchers began with a plastic that had two essential qualities: high conductivity and biocompatibility, meaning that it could be safely brought into contact with the human body. But this plastic had a shortcoming: It was very brittle. Stretching it even 5 percent would break it.

    Tightly wound and brittle

    As Bao and her team sought to preserve conductivity while adding flexibility, they worked with scientists at the SLAC National Accelerator Laboratory to use a special type of X-ray to study this material at the molecular level. All plastics are polymers; that is, chains of molecules strung together like beads. The plastic in this experiment was actually made up of two different polymers that were tightly wound together. One was the electrical conductor. The other polymer was essential to the process of making the plastic. When these two polymers combined, they created a plastic that was like a string of brittle, sphere-like structures. It was conductive, but not flexible.

    The researchers hypothesized that if they could find the right molecular additive to separate these two tightly wound polymers, they could prevent this crystallization and give the plastic more stretch. But they had to be careful – adding material to a conductor usually weakens its ability to transmit electrical signals.

    After testing more than 20 different molecular additives, they finally found one that did the trick. It was a molecule similar to the sort of additives used to thicken soups in industrial kitchens. This additive transformed the plastic’s chunky and brittle molecular structure into a fishnet pattern with holes in the strands to allow the material to stretch and deform. When they tested their new material’s elasticity, they were delighted to find that it became slightly more conductive when stretched to twice its original length. The plastic remained very conductive even when stretched to 800 percent of its original length.

    “We thought that if we add insulating material, we would get really poor conductivity, especially when we added so much,” said Bao. But thanks to their precise understanding of how to tune the molecular assembly, the researchers got the best of both worlds: the highest possible conductivity for the plastic while at the same time transforming it into a very robust and stretchy substance.

    “By understanding the interaction at the molecular level, we can develop electronics that are soft and stretchy like skin, while remaining conductive,” Wang says.

    Other authors include postdoctoral fellows Chenxin Zhu, Francisco Molina-Lopez, Franziska Lissel and Jia Liu; graduate students Shucheng Chen and Noelle I. Rabiah; Hongping Yan and Michael F. Toney, staff scientists at SLAC National Accelerator Laboratory; Christian Linder, an assistant professor of civil and environmental engineering who is also a member of Stanford Bio-X and of the Stanford Neurosciences Institute; Boris Murmann, a professor of electrical engineering and a member of the Stanford Neurosciences Institute; Lihua Jin, now an assistant professor of mechanical and aerospace engineering at the University of California, Los Angeles; Zheng Chen, now an assistant professor of nano engineering at the University of California, San Diego; and colleagues from the Materials Science Institute of Barcelona, Spain, and Samsung Advanced Institute of Technology.

    This work was funded by Samsung Electronics and the Air Force Office of Science Research.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.


     
  • richardmitnick 8:35 am on March 10, 2017
    Tags: Applied Research & Technology, Hyak and Lolo supercomputing, Seeking to unravel DNA, U Washington

    From U Washington: “Seeking to unravel DNA” 

    10 March 2017
    No writer credit

    Tim Durham, a graduate student in the William Stafford Noble Lab in the UW Department of Genome Sciences, adopted cloud computing to do his research.

    When Timothy Durham looks at the human genome, he sees an encyclopedia of precise instructions that tell approximately 31 trillion cells in the human body how to do their jobs.

    Figuring out how cells read and interpret these instructions—and how they can misread them—could help researchers unravel the mysteries of what leads to disease and point to cures. This complicated, ongoing work is being performed by thousands of researchers across the globe.

    Over the past decade, their efforts have produced large amounts of rich data. So when Durham, a graduate student and researcher in the William Stafford Noble Lab in the UW Department of Genome Sciences, decided to join the research, he found that a desktop computer and small department servers would not be up to the task.

    That’s why he turned to University of Washington Information Technology’s Research Computing experts, who recommended a cloud computing solution to do his work. The cloud, Durham said, provided him with virtually unlimited resources for computation, storage, networking and data management, the sort of tools he needed to build a complex three-dimensional model that would capture the state of the genome in different cell types. The model, he hopes, will help other researchers advance the field of genomics.

    Interpreting the human genome has been a tremendous challenge. It is like looking at a cookbook written in a foreign language with its own unique rules of grammar. In this cookbook, Durham said, genes are like “recipes” that cells use to construct the machinery they need to function.

    “Now, we are starting to learn the language and the grammar of the genome, which is like learning to read the recipes and to understand which ones work well together and how the cell decides what to make,” he said.

    The ultimate goal is to be able to understand how the genome is used in different types of cells in the body to answer questions such as, “Which genes are important to the function of skin cells versus liver cells?”

    And in the same way that a cook doesn’t make every recipe in a cookbook when planning a meal, specific kinds of cells only care about certain subsets of genes when they are doing their work.

    “If we can understand how cells pick the genes they need out of all 20,000 genes in the genome cookbook, it will have a profound impact on the way we understand human biology and disease,” Durham said.

    Noble’s Lab is a perfect place for Durham’s work. The lab develops and applies computational techniques for modeling and understanding biological processes at the molecular level. Machine learning, a subfield of computer science focused on the study and construction of algorithms that can learn from and make predictions on data, is an important area for research, and Durham relied on its principles to develop his model.

    “I am developing a model that captures the state of the genome across 127 different cell types. The full data set is more than 2 TB, which is more than the memory capacity of our entire lab cluster,” Durham said.
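
    A standard way around such a memory ceiling, whether on a lab cluster or in the cloud, is to stream the data in fixed-size chunks and keep only running statistics. A minimal sketch; the file name and flat float32 layout are hypothetical stand-ins for the real data:

    ```python
    # Streaming pattern for data bigger than memory: map the file, walk it
    # in chunks, and keep only running statistics.
    import numpy as np

    def chunked_mean(path, dtype=np.float32, chunk_elems=50_000_000):
        """Mean of a huge flat binary array without loading it all at once."""
        data = np.memmap(path, dtype=dtype, mode="r")
        total, count = 0.0, 0
        for start in range(0, data.shape[0], chunk_elems):
            chunk = np.asarray(data[start:start + chunk_elems])  # ~200 MB in RAM
            total += chunk.sum(dtype=np.float64)
            count += chunk.size
        return total / count

    # chunked_mean("genome_state_matrix.f32")   # hypothetical 2 TB file
    ```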

    UW-IT set up Durham with Microsoft Azure and Amazon Web Services, which offer cloud services to the University of Washington. To help fund this, Durham applied for awards from Amazon’s Cloud Credits for Research program and from Microsoft’s Azure for Research program, and was granted $30,000 in cloud research credits, an extremely valuable contribution that helped accelerate his work.

    “Research funding is not easy to come by, so the credit program is really valuable,” Durham said. “It helps you through the initial learning curve involved in moving to the cloud by removing some of the risk of adopting a new technology and allowing you an extended trial period in which you can really dive deep to see how well it works for your application.”

    Before Durham moved to the cloud, he was using lab servers, and even one of his smallest processing runs would take up to two full weeks to complete, said Rob Fatland, a UW-IT Research Computing Director who offers consulting and support to researchers looking at cloud computing solutions or other innovative tools offered at the UW, such as Hyak, the University’s shared cluster supercomputer.

    U Washington Hyak and Lolo

    Rob Fatland, UW-IT Research Computing Director, helps researchers navigate the cloud.

    “When he was using the department servers for his work, no one else could use them,” Fatland said. “In the cloud, he reduced processing time to hours without the restrictions that come with shared resources.”

    Large-scale cloud computing for research is relatively new to the University, but it is quickly establishing itself as a valuable tool, Fatland said. When talking to researchers, he discusses security, management and cost to operate in the cloud.

    Fatland said many researchers who have switched to the cloud have found that it is more cost effective for many types of computing, with costs decreasing over time. It is also extremely secure, so they don’t have to worry about losing their work. And it offers an elastic environment, easily allowing researchers to scale up their work instantly.

    “It’s an empowering technology,” Fatland said.

    That has been the case for Durham, whose goal for his three-dimensional model is to predict what parts of the genome are most important in a particular cell type, such as a liver or a heart cell.

    “It is challenging to train one of these computing models,” he said. “You have to do a lot of fine tuning and it takes a lot of computing time to optimize it, a lot of trial and error.” But with the cloud, he doesn’t have to wait for anyone to get done with their work. It is always available when he needs it.

    “In the end, if we can predict the most relevant portions of the genome in a particular cell type, this can help us zero in on specific regions of the genome that might, for example, harbor mutations that can contribute to disease,” he said.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 10:13 am on March 9, 2017
    Tags: Applied Research & Technology, Embryos can be repaired, in vitro fertilization, Triple helix

    From Yale: “Gene editing opens the door to a “revolution” in treating and preventing disease” 

    March 8, 2017
    John Dent Curtis

    Today, in vitro fertilization provides a way for couples to avoid passing potentially disease-causing genes to their offspring. A couple will undergo genetic screening. Tests will determine whether their unborn children are at risk. If embryos created through IVF show signs of such a genetic mutation, they can be discarded.

    Flash forward a few years, and, instead of being discarded, those embryos can be repaired with new gene editing technologies. And those repairs will affect not only those children, but all their descendants.

    “This is definitely new territory,” said Pasquale Patrizio, M.D., director of the Yale Fertility Center and Fertility Preservation Program. “We are at the verge of a huge revolution in the way disease is treated.”

    In a move that seems likely to help clear the path for the use of gene editing in the clinical setting, on February 14 the Committee on Human Gene Editing, formed by the National Academy of Medicine and the National Academy of Sciences, recommended that research into human gene editing should go forward under strict ethical and safety guidelines. Among their concerns were ensuring that the technology be used to treat only serious diseases for which there is no other remedy, that there be broad oversight, and that there be equal access to the treatment. These guidelines provide a framework for discussion of technology that has been described as an “ethical minefield” and for which there is no government support in the United States.

    A main impetus for the committee’s work appears to be the discovery and widespread use of CRISPR-Cas9, a defense that bacteria use against viral infection. Scientists including former Yale faculty member Jennifer Doudna, Ph.D., now at the University of California, Berkeley, and Emmanuelle Charpentier, Ph.D., of the Max Planck Institute for Infection Biology in Berlin, discerned that the CRISPR enzyme could be harnessed to make precision cuts and repairs to genes. Faster, easier, and cheaper than previous gene editing technologies, CRISPR was declared the breakthrough of the year in 2015 by Science magazine, and has become a basic and ubiquitous laboratory research tool. The committee’s guidelines, said scientists, physicians, and ethicists at Yale, could pave the way for thoughtful and safe use of this and other human gene editing technologies. In addition to CRISPR, the committee described three other commonly used gene editing techniques: zinc finger nucleases, meganucleases, and transcription activator-like effector nucleases.

    Patrizio, professor of obstetrics, gynecology, and reproductive sciences, said the guidelines are on the mark, especially because they call for editing only in circumstances where the diseases or disabilities are serious and where there are no alternative treatments. He and others cited such diseases as cystic fibrosis, sickle cell anemia, and thalassemia as targets for gene editing. Because each is caused by mutations in a single gene, repairing that one gene could prevent disease.

    Peter Glazer, M.D. ’87, Ph.D. ’87, HS ’91, FW ’91, chair and the Robert E. Hunter Professor of Therapeutic Radiology and professor of genetics, said, “The field will benefit from guidelines that are thoughtfully developed. This was a step in the right direction.”

    The panel recommended that gene editing techniques should be limited to deal with genes proven to cause or predispose to specific diseases. It should be used to convert mutated genes to versions that are already prevalent in the population. The panel also called for stringent oversight of the process and for a prohibition against use of the technology for “enhancements,” rather than to treat disease. “As physicians, we understand what serious diseases are. Many of them are very well known and well characterized on a genetic level,” Glazer said. “The slippery slope is where people start thinking about modifications in situations where people don’t have a serious disorder or disease.”

    Mark Mercurio, M.D., professor of pediatrics (neonatology), and director of the Program for Biomedical Ethics, echoed that concern. While he concurs with the panel’s recommendations, he urged a clear definition of disease prevention and treatment. “At some point we are not treating, but enhancing.” This in turn, he said, conjures up the nation’s own medical ethical history, which includes eugenics policies in the early 20th century that were later adopted in Nazi Germany. “This has the potential to help a great many people, and is a great advance. But we need to be cognizant of the history of eugenics in the United States and elsewhere, and need to be very thoughtful in how we use this technology going forward,” he said.

    The new technology, he said, can lead to uncharted ethical waters. “Pediatric ethics are more difficult,” Mercurio said. “It is one thing to decide for yourself–is this a risk I’m willing to take—and another thing to decide for a child. It is another thing still further, which we have never had to consider, to decide for future generations.”

    Myron Genel, M.D., emeritus professor of pediatrics and senior research scientist, served on Connecticut’s stem cell commission and four years on the Health and Human Services Secretary’s Advisory Committee on Human Research Protections. He believes that Connecticut’s guidelines on stem cell research provide a framework for addressing the issues associated with human gene editing. “There is a whole regulatory process that has been evolved governing the therapeutic use of stem cells,” he said. “There are mechanisms that have been put in place for effective local oversight and national oversight for stem cell research.”

    Although CRISPR has been the subject of a bitter patent dispute between Doudna and Charpentier and The Broad Institute in Cambridge, Mass., a recent decision by the U.S. Patent Trial and Appeal Board in favor of Broad is unlikely to affect research at Yale and other institutions. Although Broad, an institute of Harvard and the Massachusetts Institute of Technology, can now claim the patent, universities do not typically enforce patent rights against other universities over research uses.

    At Yale, scientists and physicians noted that gene editing is years away from human trials, and that risks remain. The issue now, said Glazer, is “How do we do it safely? It is never going to be risk-free. Many medical therapies have side effects and we balance the risks and benefits.” Despite its effectiveness, CRISPR is also known for what’s called “off-target risk,” imprecise cutting and splicing of genes that could lead to unforeseen side effects that persist in future generations. “CRISPR is extremely potent in editing the gene it is targeting,” Glazer said. “But it is still somewhat promiscuous and will cut other places. It could damage a gene you don’t want damaged.”
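
    The off-target concern can be illustrated with a toy scan: look for sequences that nearly match a 20-nucleotide guide and sit next to SpCas9's "NGG" PAM motif. Real off-target predictors are far more sophisticated, and both the guide and the "genome" below are made up.

    ```python
    # Toy illustration of the off-target problem: scan a sequence for
    # sites that nearly match a 20-nt guide and carry an "NGG" PAM.
    def off_target_sites(genome, guide, max_mismatches=3):
        """Return (position, mismatches, site) for near-matches with an NGG PAM."""
        hits, n = [], len(guide)
        for i in range(len(genome) - n - 2):
            site, pam = genome[i:i + n], genome[i + n:i + n + 3]
            if pam[1:] == "GG":                  # NGG protospacer-adjacent motif
                mm = sum(a != b for a, b in zip(site, guide))
                if mm <= max_mismatches:
                    hits.append((i, mm, site))
        return hits

    guide = "GACGTTACCGGATCAATGCA"               # made-up 20-nt guide
    genome = "TT" + guide + "TGG" + "AC" + guide[:-2] + "CT" + "AGG"
    for pos, mm, site in off_target_sites(genome, guide):
        print(pos, mm, site)   # the mm > 0 hit is a potential off-target cut
    ```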

    Glazer has been working with a gene editing technology called triple helix that hijacks DNA’s own repair mechanisms to fix gene mutations. Triple helix, as its name suggests, adds a third strand to the double helix of DNA. That third layer, a peptide nucleic acid, binds to DNA and provokes a natural repair process that copies a strand of DNA into a target gene. Unlike CRISPR and other editing techniques, it does not use nucleases that cut DNA. “This just recruits a process that is natural. Then you give the cell this piece of DNA, this template that has a new sequence,” Glazer said, adding that triple helix is more precise than CRISPR and leads to fewer off-target effects, but is a more complex technology that requires advanced synthetic chemistry.

    Along with several scientists across Yale, Glazer is studying triple helix as a potential treatment for cystic fibrosis, HIV/AIDS, spherocytosis, and thalassemia.

    Adele Ricciardi, a student in her sixth year of the M.D./Ph.D. program, is working with Glazer and other faculty on use of triple helix to make DNA repairs in utero. She also supports the panel’s decision, but believes that more public discussion is needed to allay fears of misuse of the technology. In a recent presentation to her lab mates, she noted that surveys show widespread public concern about such biomedical advances. One study found that most of those surveyed felt it should be illegal to change the genes of unborn babies, even to prevent disease.

    “There is, I believe, a misconception of what we are using gene editing for,” Ricciardi said. “We are using it to edit disease-causing mutations, not to improve the intelligence of our species or get favorable characteristics in babies. We can improve quality of life in kids with severe genetic disorders.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Yale University comprises three major academic components: Yale College (the undergraduate program), the Graduate School of Arts and Sciences, and the professional schools. In addition, Yale encompasses a wide array of centers and programs, libraries, museums, and administrative support offices. Approximately 11,250 students attend Yale.

     
  • richardmitnick 9:54 am on March 9, 2017
    Tags: Applied Research & Technology, Personal Genome Project

    From Wyss: “Wyss Institute and Lumos Labs Launch Research Collaboration on Memory of High Performing Individuals” 

    Harvard bloc tiny
    Wyss Institute bloc
    Wyss Institute

    March 9, 2017
    Eriona Hysolli

    The Personal Genome Project will integrate brain training tests to help identify key memory genes, a step toward understanding neurodegeneration.

    Researchers at the Wyss Institute for Biologically Inspired Engineering and the Personal Genome Project (PGP) at Harvard Medical School (HMS) announced today a new collaboration with Lumos Labs, maker of the brain training program Lumosity. The PGP-Lumosity memory project aims to leverage the PGP’s and Lumos Labs’ unique resources and expertise to investigate the relationship between genetics and memory, attention, and reaction speed.

    Wyss scientists plan to recruit 10,000 members from the PGP, which started in 2005 in the laboratory of George Church, PhD, a founding Core Faculty member of the Wyss Institute and Professor of Genetics at Harvard Medical School. Participants in the PGP publicly share their genome sequences, biospecimens, and healthcare data for unrestricted research on genetic and environmental relationships to disease and wellness. Wyss Institute researchers will use a select set of cognitive tests from Lumos Labs’ NeuroCognitive Performance Test (NCPT), a brief, repeatable, web-based alternative to traditional pencil-and-paper cognitive assessments, to evaluate participants’ memory functions, including their ability to recall objects and memorize object patterns, as well as their response times.

    Church’s research team at the Wyss Institute and HMS, including Postdoctoral Fellows Elaine Lim, Ph.D., and Rigel Chan, Ph.D., will correlate extremely high performance scores with naturally occurring variations in the participants’ genomes. “Our goal is to get people who have remarkable memory traits and engage them in the PGP. If you are exceptional in any way, you should share it, not hoard it,” said Church.
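    The announcement does not spell out the statistics behind “correlate,” but the core move for any one variant is a comparison of scores between carriers and non-carriers. The Python sketch below shows that single-variant comparison using Welch’s t statistic; all scores and the grouping are invented, and a real analysis would add covariates and genome-wide multiple-testing correction.

```python
# Toy sketch of the core association question: do carriers of a given
# genetic variant score differently on a memory test than non-carriers?
# A real analysis would fit genome-wide models with covariates and
# correct for multiple testing; this shows only the single-variant
# comparison. All scores and the grouping below are invented.

from math import sqrt
from statistics import mean, stdev

def welch_t(group_a, group_b):
    """Welch's t statistic for two independent samples of unequal variance."""
    m1, m2 = mean(group_a), mean(group_b)
    v1, v2 = stdev(group_a) ** 2, stdev(group_b) ** 2
    n1, n2 = len(group_a), len(group_b)
    return (m1 - m2) / sqrt(v1 / n1 + v2 / n2)

# Hypothetical memory-test scores split by carrier status of one variant.
carriers = [92, 88, 95, 91, 97, 89]
noncarriers = [78, 85, 80, 83, 76, 81, 84]

t = welch_t(carriers, noncarriers)
print(f"carrier mean {mean(carriers):.1f}, "
      f"non-carrier mean {mean(noncarriers):.1f}, t = {t:.2f}")
```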

    To validate their findings, the team will take advantage of the Wyss Institute’s exceptional abilities to sequence, edit and visualize DNA, model neuronal development in 3D brain organoids ex vivo, and, ultimately, to test emerging hypotheses in experimental models of neurodegeneration.

    “The Wyss Institute’s extraordinary scientific program and the Personal Genome Project’s commitment to research that is both pioneering and responsible make them ideal collaborators,” said Bob Schafer, Ph.D., Director of Research at Lumos Labs. “Combining Lumosity’s potential as a research tool could help us learn more about how our online assessment can help power innovative, large-scale studies.”

    Drs. Church, Lim and Chan plan to begin recruitment for this study in early March.

    The PGP-Lumosity memory project is the latest in a long line of research collaborations supported by each platform. Through its Human Cognition Project, Lumos Labs is currently working with independent researchers at more than 60 institutions, investigating a range of topics including normal aging, certain clinical conditions, and the relationship between exercise and Lumosity training. Existing collaborative projects available to PGP participants include stem cell banking with the New York Stem Cell Foundation, “Go Viral” real-time cold and flu surveillance, the biology of Circles with Harvard Medical School, the genetics of perfect pitch with the Feinstein Institute for Medical Research, characterization of the human microbiome in collaboration with American Gut, and discounted whole genome sequencing.

    The PGP aims to serve as a portal that empowers the public to drive scientific discovery through participation; this collaboration brings together two uniquely positioned organizations that combine science with broad outreach.

    “What excites us about this project is opening up groundbreaking technologies developed at the Wyss Institute to explore the relationship between genetics and memory with possible implications for Alzheimer’s and other diseases,” said Wyss Institute Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children’s Hospital, and Professor of Bioengineering at Harvard SEAS.

    For more information or to register in the study, please visit: https://wyss.harvard.edu/pgp-lumosity

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Wyss Institute campus

    The Wyss (pronounced “Veese”) Institute for Biologically Inspired Engineering uses Nature’s design principles to develop bioinspired materials and devices that will transform medicine and create a more sustainable world.

    Working as an alliance among Harvard’s Schools of Medicine, Engineering, and Arts & Sciences, and in partnership with Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, Dana Farber Cancer Institute, Massachusetts General Hospital, the University of Massachusetts Medical School, Spaulding Rehabilitation Hospital, Tufts University, and Boston University, the Institute crosses disciplinary and institutional barriers to engage in high-risk research that leads to transformative technological breakthroughs.

     
  • richardmitnick 9:39 am on March 9, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Autism Spectrum Disorder (ASD), Big data reveals more suspect autism genes

    From COSMOS: “Big data reveals more suspect autism genes” 

    Cosmos Magazine bloc

    COSMOS

    09 March 2017
    Paul Biegler

    Deep data dives are revealing more complexities in the autism story. Credit: luckey_sun

    Researchers have isolated 18 new genes believed to increase risk for Autism Spectrum Disorder (ASD), a finding that may pave the way for earlier diagnosis and possible future drug treatments for the disorder.

    The study, published this week in Nature Neuroscience, used a technique called whole genome sequencing (WGS) to map the genomes of 5193 people with ASD.

    WGS goes beyond traditional analyses that look at the roughly 1% of DNA that makes up our genes to take in the remaining “noncoding” or “junk” DNA once thought to have little biological function.

    The study, led by Ryan Yuen of the Hospital for Sick Children in Toronto, Canada, used a cloud-based “big data” approach to link genetic variations with participants’ clinical data.

    Researchers identified 18 genes that increased susceptibility to ASD, noting people with mutations in those genes had reduced “adaptive functioning”, including the ability to communicate and socialise.

    “Detection of the mutation would lead to prioritisation of these individuals for comprehensive clinical assessment and referral for earlier intervention and could end long-sought questions of causation,” the authors write.

    But the study also found increased variations in the noncoding DNA of people with ASD, including so-called “copy number variations” where stretches of DNA are repeated. The finding highlights the promise of big data to link fine-grained genetic changes with real world illness, something the emerging discipline of precision medicine will harness to better target treatments.
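    One standard way such copy number variations show up in whole genome sequencing data is through read depth: sequencing coverage is averaged over fixed windows, and windows far above or below the genome-wide median suggest duplicated or deleted stretches. The Python sketch below illustrates only that idea, with invented depths and an arbitrary threshold; production CNV callers also draw on read-pair and split-read evidence.

```python
# Toy sketch of read-depth CNV detection: average sequencing depth over
# fixed windows and flag windows far from the genome-wide median as
# candidate duplications (extra copies) or deletions (missing copies).
# Real callers also use read-pair and split-read evidence; the depths
# and thresholds below are invented.

from statistics import median

def flag_cnv_windows(depths, window=4, factor=1.5):
    """Return (start, call, mean_depth) for windows deviating from baseline."""
    baseline = median(depths)
    calls = []
    for start in range(0, len(depths) - window + 1, window):
        mean_depth = sum(depths[start:start + window]) / window
        if mean_depth > factor * baseline:
            calls.append((start, "possible duplication", mean_depth))
        elif mean_depth < baseline / factor:
            calls.append((start, "possible deletion", mean_depth))
    return calls

# Hypothetical per-position read depths along one stretch of genome.
depths = [30, 32, 29, 31, 62, 60, 63, 59, 30, 28, 31, 30,
          14, 13, 15, 12, 29, 30, 31, 28]

for start, call, mean_depth in flag_cnv_windows(depths):
    print(f"window at position {start}: {call} (mean depth {mean_depth:.1f})")
```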

    Commenting on the study, Dr Jake Gratten from the Institute for Molecular Bioscience at the University of Queensland said, “whole genome sequencing holds real promise for understanding the genetics of ASD, but establishing the role of noncoding variation in the disorder is an enormous challenge.”

    “This study is a good first step but we’re not there yet – much larger studies will be needed,” he said.

    ASD affects around 1% of the population and is characterised by impaired social and emotional communication, something poignantly depicted by John Elder Robison in his 2016 memoir Switched On.

    But the study findings went beyond autism, isolating ASD-linked genetic changes that increase risk for heart problems and diabetes, raising the possibility of preventative screening for participants and relatives.

    The authors note that 80% of the 61 ASD-risk genes discovered so far by the project (a collaboration between the advocacy group Autism Speaks and Verily Life Sciences, known as MSSNG) are potential research targets for new drug treatments.

    But the uncomfortable nexus between scientific advances and public policy is also highlighted this week in an editorial in the New England Journal of Medicine. Health policy researchers David Mandell and Colleen Barry argue that planned Trump administration rollbacks threaten services to people with autism.

    Any repeal of the Affordable Care Act (“Obamacare”), they write, could include cuts to the public insurer Medicaid and subsequent limits on physical, occupational, and language therapy for up to 250,000 children with autism.

    The authors also warn that comments made by US Attorney General Jeff Sessions bode ill for the Individuals with Disabilities Education Act (IDEA), legislation that guarantees free education for children with disabilities such as autism. Sessions has reportedly said the laws “may be the single most irritating problem for teachers throughout America today.”

    The authors also voice concern that the Trump administration’s embrace of debunked links between vaccination and autism is a major distraction from these “growing threats to essential policies that support the health and well-being of people with autism or other disabilities”.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 11:02 am on March 8, 2017 Permalink | Reply
    Tags: "Gulden Othman, Applied Research & Technology, , , ,   

    From UNC: Women in STEM – “Gulden Othman” 

    UNC bloc

    University of North Carolina

    Gulden Othman is a third-year graduate student in the Department of Physics and Astronomy within the UNC College of Arts & Sciences. She currently works in the Experimental Nuclear and Astroparticle Physics group and is also on the executive board of UNC Women in Science and Engineering (WISE). Her research focuses on observing the interactions of the building blocks of matter to understand how the universe has evolved from the Big Bang to the present day.

    March 8th, 2017

    When you were a child, what was your response to this question: “What do you want to be when you grow up?”

    I always wanted to be an astronaut. I grew up in west Texas in an area where there was not very much light pollution. I spent a lot of time musing at the stars, imagining the vast unknown. I decided that, one day, I would go to the stars and discover the unknown myself.

    Share the pivotal moment in your life that helped you choose research as a career path.

    When I began my involvement in research as a sophomore undergraduate, I was astonished at how much work was still being done to understand physics. What was even more amazing was that I was able to make a contribution, although small, to this field — to working toward a better understanding of our universe. The more I progressed through my undergraduate coursework, the more certain I was that I would not be done learning by the time I graduated. Now that I am in graduate school, I know that I am still not done learning and never will be.

    What’s an interesting thing that’s happened during your research?

    I spent a few months designing a large electromagnet that will be used in an experiment I am no longer involved in. Because of the complexity of the design, we could not build it on campus and needed to submit the design to a vendor. I was very worried that the magnet would be built and come nowhere near the specifications I intended it for. After double-checking my design, though, the vendor believed it would meet the specifications we desired. It’s surreal to think that my design will actually be built and functional someday soon.

    In honor of Women’s History Month, share an anecdote that shows why women need to continue breaking barriers.

    When I began doing research my sophomore year, an upperclassman tutored me on advanced physics topics that I had not yet taken courses on but that would be necessary for my research. He quizzed me on courses I had already taken and asked me to write down equations from memory. Being put on the spot was difficult, and I could not write down most of what he asked for. He responded by telling me I wasn’t smart enough to be a physicist and that I should consider other career options in sciences that are “less difficult.” He believed he was being helpful. I was distraught and, for one night, considered changing my major. But I couldn’t think of any subject I wanted to study more than physics.

    The next day, I talked to the professor who was advising my research about what happened. I told him that I was fine with not being the best physicist, as long as I could study physics, and that I would work hard and not give up or change my major. He was very supportive of me, even after I chose to leave his group and transition into the field of research I am in now — experimental nuclear and particle physics. Almost six years later, I now have a prestigious fellowship and am working toward my PhD in physics. I am glad I did not let someone else’s view of me discourage me from reaching my goals.

    What advice would you give to up-and-coming female researchers in your field?

    If you love science, never give up pursuing it. You may at times, as I did, feel like everyone around you is so naturally brilliant, and that you will never be able to be as smart or talented as them. That is never the case. Hard work means a lot more than you might think. Always have a support group. At UNC, two great places to find support are the local Women in Physics group and the Society of Physics Students chapter.

    UNC Research is proud of every scientist on this campus, but we are especially excited to promote our female researchers in 2017. Each week this year, we will publish a short Q&A feature on one of them — whether she is an undergrad, PhD candidate, or full professor.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UNC campus

    Carolina’s vibrant people and programs attest to the University’s long-standing place among leaders in higher education since it was chartered in 1789 and opened its doors for students in 1795 as the nation’s first public university. Situated in the beautiful college town of Chapel Hill, N.C., UNC has earned a reputation as one of the best universities in the world. Carolina prides itself on a strong, diverse student body, academic opportunities not found anywhere else, and a value unmatched by any public university in the nation.

     