Tagged: Applied Research & Technology Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 10:42 am on March 23, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Scientists switch on 'artificial sun' in German lab

    From DLR via phys.org: “Scientists switch on ‘artificial sun’ in German lab” 

    German Aerospace Center

    phys.org

    March 23, 2017

    In this March 21, 2017 photo, engineer Volkmar Dohmen stands in front of xenon short-arc lamps at the DLR German national aeronautics and space research center in Juelich, western Germany. The lights are part of an artificial sun that will be used for research purposes. (Caroline Seidel/dpa via AP)

    Scientists in Germany are flipping the switch on what’s being described as “the world’s largest artificial sun,” hoping it will help shed light on new ways of making climate-friendly fuel.

    The “Synlight” experiment in Juelich, about 30 kilometers (19 miles) west of Cologne, consists of 149 giant spotlights normally used for film projectors.

    Starting Thursday, scientists from the German Aerospace Center will begin experimenting with this dazzling array to try to find ways of tapping the enormous amount of energy that reaches Earth in the form of light from the sun.

    One area of research will focus on how to efficiently produce hydrogen, a first step toward making artificial fuel for airplanes.

    The experiment uses as much electricity in four hours as a four-person household would in a year.
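
The quoted consumption gives a quick way to estimate Synlight's electrical power draw. As a back-of-the-envelope sketch (the ~3,500 kWh annual figure for a four-person household is an assumption, not from the article):

```python
# Rough estimate of the lamp array's power draw, from the article's claim
# that it uses a household-year of electricity in four hours.
ANNUAL_HOUSEHOLD_KWH = 3500.0  # assumed annual use of a four-person household
RUN_TIME_HOURS = 4.0           # stated run time consuming that much energy

power_kw = ANNUAL_HOUSEHOLD_KWH / RUN_TIME_HOURS
print(f"Estimated power draw: {power_kw:.0f} kW")  # 875 kW, i.e. roughly 1 MW
```

Whatever the exact household figure, the claim implies a draw on the order of a megawatt while the lamps are running.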


    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    DLR is the national aeronautics and space research centre of the Federal Republic of Germany. Its extensive research and development work in aeronautics, space, energy, transport and security is integrated into national and international cooperative ventures. In addition to its own research, as Germany’s space agency, DLR has been given responsibility by the federal government for the planning and implementation of the German space programme. DLR is also the umbrella organisation for the nation’s largest project management agency.

    DLR has approximately 8000 employees at 16 locations in Germany: Cologne (headquarters), Augsburg, Berlin, Bonn, Braunschweig, Bremen, Goettingen, Hamburg, Juelich, Lampoldshausen, Neustrelitz, Oberpfaffenhofen, Stade, Stuttgart, Trauen, and Weilheim. DLR also has offices in Brussels, Paris, Tokyo and Washington D.C.

     
  • richardmitnick 7:58 am on March 21, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Incomplete descriptions are a waste of healthcare research

    From Nature Index: “Incomplete descriptions are a waste of healthcare research” 

    Nature

    10 March 2017
    Tammy Hoffmann


    Most of us have probably tried to recreate a meal we’ve enjoyed in a restaurant. But would you attempt it without a recipe? And if you had to guess most of the ingredients, how confident would you be about the end result?

    Worryingly, a similar situation frequently occurs in healthcare. Health professionals often have to guess the details of the intervention that will help their patients. The intervention might be a drug or a non-drug treatment, such as exercise, psychosocial support, or dietary advice.

    Part of this problem lies with researchers who inadequately describe interventions in research reports, one of the contributors to waste in research. Estimates suggest up to 85% of health research is wasted.

    For an intervention to be useful in practice, clinicians need to know details such as: when and how much of the intervention to deliver (e.g. intensity, number and schedule of sessions), details (including any training) of the intervention provider, the actual steps involved in providing the intervention, and any materials (informational or physical) needed as part of it.
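
To make the gap concrete, here is a toy sketch (not from the article) of how a reviewer might check a trial report against such a list of essential elements. The field names are illustrative placeholders, not items from any published checklist:

```python
# Toy completeness check for an intervention description.
# The required fields below are illustrative, not an official checklist.
REQUIRED_FIELDS = {
    "dose_or_intensity",   # when and how much (e.g. number/schedule of sessions)
    "provider_details",    # who delivered it, including any training
    "procedure_steps",     # the actual steps involved in providing it
    "materials",           # informational or physical materials needed
}

def missing_elements(report: dict) -> set:
    """Return the essential elements absent from a trial report."""
    return {field for field in REQUIRED_FIELDS if not report.get(field)}

incomplete = {"dose_or_intensity": "3 sessions/week for 6 weeks"}
print(sorted(missing_elements(incomplete)))
# → ['materials', 'procedure_steps', 'provider_details']
```

A report passing such a check is one another clinician could act on without guessing the "recipe."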

    But these crucial details are missing in up to 60% of trials of non-drug interventions. While this problem is more common in studies of non-drug interventions, the problem also occurs in drug studies. In a 2015 study of cancer chemotherapy trials, only 11% reported all the essential elements of the interventions.

    Why does this problem occur?

    Authors of research reports are often unaware of what a complete description of an intervention means. Upon request, they will often provide missing details.

    But deficiencies in intervention reporting are often not detected by peer reviewers or editors. Some authors do not provide intervention details as they may not have access to all the details, or have concerns about word limits or the copyright implications of sharing intervention materials.

    For such a prevalent problem, the issue of inadequate intervention reporting has received little attention until recently.

    In 2014, my colleagues and I published in BMJ a reporting guide to increase authors’ awareness of the problem and provide them with a checklist of the essential elements of an intervention description.

    The guide makes it easier for authors to describe their interventions, for reviewers and editors to assess the descriptions, and for readers to use the information.

    When including all intervention details in the main paper itself isn’t possible, the guide encourages authors to use other means to provide this information and to state where it can be located. This includes online supplementary materials, permitted in about 75% of journals, additional journal articles, study or university websites, or online repositories such as Figshare.

    While the guide implores researchers to include intervention details in new papers, the information remains missing in many studies that have already been published. For drug interventions, doctors at least have access to drug formularies, where they can look up basic information, such as the active ingredient, dose, and route of administration.

    Equivalent resources for non-drug interventions typically do not exist. One exception is a new handbook recently developed by the Royal Australian College of General Practitioners, which gives general practitioners the details they need to provide evidence-based non-drug interventions.

    Authors, reviewers, and editors have a responsibility to improve the comprehensiveness of intervention reporting. Without adequate details, clinicians, patients, and policymakers cannot reliably use effective interventions, and researchers are unable to replicate or build upon the research. Given the cost of conducting trials, this is an enormous waste.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 10:33 am on March 20, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Sardines in Space

    From astrobites: “Sardines in Space: The Intensely Densely-Packed Planets Orbiting Kepler-11” 

    Astrobites

    Title: A Closely-Packed System of Low-Mass, Low-Density Planets Transiting Kepler-11
    Authors: Jack J. Lissauer, Daniel C. Fabrycky, Eric B. Ford, et al.
    Lead Author’s Institution: NASA Ames Research Center, Moffett Field, CA, 94035, USA

    Status: Published in Nature 2011 [open access]

    Data from the Kepler Space Telescope have unearthed a treasure trove of new and unusual celestial objects. Among these new discoveries is the planetary system Kepler-11. The system contains six transiting planets that are packed incredibly close around a Sun-like star, much as sardines are packed in a can. The first five of these planets fall within the orbit of Mercury, and the sixth falls well within the orbit of Venus. Few systems like this have been discovered; most planetary systems have much larger separations between their planets, yet this system has its planets arranged in an extremely packed, yet extraordinarily stable, way.

    Figure 1: This figure from the NASA website is a visual representation of the Kepler-11 system, overlaid with the orbits of Mercury and Venus.

    When a single planet orbits a star, its period follows Kepler’s laws to a tee; however, when other planets are introduced into the system, the orbiting bodies perturb each other’s orbits. Each transit then arrives slightly early or late relative to a strictly periodic schedule, and this deviation is called a transit timing variation (TTV). Since Kepler-11’s inner five planets orbit in extreme proximity to one another, the system is a perfect illustration of what can be measured from transit timing variations.
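
For context, Kepler's third law alone pins down the unperturbed period; TTVs are small deviations on top of it. A quick sketch, using approximate literature values for Kepler-11b's semi-major axis and the stellar mass (assumed here, not taken from this post), recovers a period of roughly ten days:

```python
import math

# Kepler's third law in solar units: P[yr]^2 = a[AU]^3 / M[Msun]
a_au = 0.091    # approximate semi-major axis of Kepler-11b (assumed value)
m_star = 0.95   # approximate stellar mass in solar masses (assumed value)

period_days = 365.25 * math.sqrt(a_au**3 / m_star)
print(f"Unperturbed period: {period_days:.1f} days")  # ~10.3 days
```

The measured TTVs are then the minutes-scale wobbles of each transit time around this baseline period.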


    Planet transit. NASA/Ames

    The photometric Kepler data marked the discovery of this system. The transits for each of the planets appeared separately in the light curve of the system. The light curve is just a measurement of the brightness of the star over time, so when a planet passes in front of the star, the brightness decreases, causing the dip in the light curve. The shape varies with each planet based on differences in size of the planet and orbital radius. From this data, it is possible to measure the radius of the transiting planet. This team followed up their photometric data with spectroscopic analysis from the Keck I telescope. This additional data allowed for the precise measurements of transit-timing variations, which yielded mass measurements for the inner five planets.
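
The geometry behind that radius measurement is simple: the fractional dip in brightness is roughly the ratio of the planet's disk area to the star's, so depth ≈ (Rp/Rs)². A minimal sketch with illustrative numbers (both the depth and the stellar radius below are assumptions for demonstration):

```python
import math

# For a central transit, ignoring limb darkening:
# depth ~ (R_planet / R_star)^2, so R_planet = R_star * sqrt(depth)
depth = 3.0e-4                # fractional dip in the light curve (illustrative)
r_star_rearth = 1.1 * 109.2   # assumed stellar radius: 1.1 R_sun, in Earth radii

r_planet_rearth = math.sqrt(depth) * r_star_rearth
print(f"Inferred planet radius: {r_planet_rearth:.1f} Earth radii")
```

A dip of a few hundred parts per million thus corresponds to a planet around twice Earth's radius, which is why Kepler's photometric precision was so critical.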

    For the first five planets, the TTVs were successfully measured, and with this information the research team found the densities of the inner five planets, which yielded a surprising result. These planets, despite being densely packed, are not made of very dense material. Kepler-11b is both the closest planet to its star and the densest, yet its overall density is only 3.31 g/cm3. For comparison, Earth has an overall density of about 5.5 g/cm3. The densities of the planets orbiting Kepler-11 are depicted in Figure 2.
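
That density figure follows directly from the TTV mass and the transit radius. Using approximate literature values for Kepler-11b (assumed here: a mass near 4.3 Earth masses and a radius near 1.97 Earth radii), the bulk density comes out in the same ballpark as the quoted 3.31 g/cm3:

```python
# Bulk density from mass and radius, scaled to Earth's bulk density.
EARTH_DENSITY = 5.51   # g/cm^3
mass_mearth = 4.3      # assumed TTV-derived mass of Kepler-11b, Earth masses
radius_rearth = 1.97   # assumed transit-derived radius, Earth radii

density = EARTH_DENSITY * mass_mearth / radius_rearth**3
print(f"Bulk density: {density:.2f} g/cm^3")  # ~3.1 g/cm^3
```

Because density scales as mass over radius cubed, a planet only twice Earth's size but four times its mass still ends up markedly less dense than Earth.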

    Figure 2: This shows the mass versus radius of the planets in the Kepler-11 system. The planets orbiting Kepler-11 are represented by the filled-in circles. The other markings on the graph indicate planets in our solar system, shown for comparison. Figure 5 from today’s paper.

    While transit timing variations worked like a charm for the inner five planets, the sixth planet (Kepler-11g) was too distant from the others for this method to work well, so another method was employed to confirm this planet. The team used several simulations to rule out alternative scenarios, including a chance alignment of the Kepler-11 system with an eclipsing star or with another star-planet system. This analysis successfully confirmed Kepler-11g, but because no TTVs could be measured for this particular planet, its mass remains unknown.

    Even though this system has been more closely studied than most, the measurements have raised nearly as many questions as they have answered. The inner five have small inclinations and eccentricities, which implies some planetary migration process. However, since the periods of these planets are not in resonance, slow and convergent migration theories—which would naturally force the planets into resonant orbits—seem unlikely to be at play in this system. Formation of such a system is still a bit of a mystery. After all, such low-density planets are unusual and do not completely fit within the current understanding of planet formation.

    Kepler-11 continues to be one of the more intriguing planetary systems discovered, and its formation is not fully understood. Systems like this extend our understanding of astrophysics, perhaps in a bit of an unexpected way; these closely packed planets have much more to teach us about how their system formed.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    What do we do?

    Astrobites is a daily astrophysical literature journal written by graduate students in astronomy. Our goal is to present one interesting paper per day in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.
    Why read Astrobites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.
    Our goal is to solve this problem, one paper at a time. In 5 minutes a day reading Astrobites, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in a new area of astronomy.

     
  • richardmitnick 10:15 am on March 20, 2017 Permalink | Reply
    Tags: Angelman’s syndrome, Applied Research & Technology, Cerebellin 1 (CBLN1), Chemogenetics, Circuit Breaker, Isodicentric chromosome 15q, The gene UBE3A

    From HMS: “Circuit Breaker” Autism Studies 

    Harvard University

    Harvard Medical School

    March 16, 2017
    JACQUELINE MITCHELL

    ktsimage/Getty Images

    Harvard Medical School researchers at Beth Israel Deaconess Medical Center have gained new insight into the genetic and neuronal circuit mechanisms that may contribute to impaired sociability in some forms of autism spectrum disorder.

    Led by Matthew Anderson, HMS associate professor of pathology and director of neuropathology at Beth Israel Deaconess, the scientists determined how a gene linked to one common form of autism works in a specific population of brain cells to impair sociability.

    The research, published today in the journal Nature, reveals the neurobiological control of sociability and could represent important first steps toward interventions for patients with autism.

    Anderson and colleagues focused on the gene UBE3A, multiple copies of which cause a form of autism in humans (called isodicentric chromosome 15q). Conversely, the lack of this same gene in humans leads to a developmental disorder called Angelman’s syndrome, characterized by increased sociability.

    In previous work, Anderson’s team demonstrated that mice engineered with extra copies of the UBE3A gene show impaired sociability, heightened repetitive self-grooming, and reduced vocalizations with other mice.

    “In this study, we wanted to determine where in the brain this social behavior deficit arises and where and how increases of the UBE3A gene repress it,” said Anderson, who is also director of the Autism BrainNET, Boston Node.

    “We had tools in hand that we built ourselves. We not only introduced the gene into specific brain regions of the mouse, but we could also direct it to specific cell types to test which ones played a role in regulating sociability,” Anderson said.

    When Anderson and colleagues compared the brains of the mice engineered to model autism to those of normal—or wild type—mice, they observed that the increased UBE3A gene copies interacted with nearly 600 other genes.

    After analyzing and comparing protein interactions between the UBE3A-regulated genes and genes altered in human autism, the researchers noticed that increased doses of UBE3A repressed the Cerebellin genes.

    Cerebellin is a family of genes that physically interact with other autism genes to form glutamatergic synapses, the junctions where neurons communicate with each other via the neurotransmitter glutamate.

    The researchers chose to focus on one of them, Cerebellin 1 (CBLN1), as the potential mediator of UBE3A’s effects. When they deleted CBLN1 in glutamate neurons, they recreated the same impaired sociability produced by increased UBE3A.

    “Selecting Cerebellin 1 out of hundreds of other potential targets was something of a leap of faith,” Anderson said. “When we deleted the gene and were able to reconstitute the social deficits, that was the moment we realized we’d hit the right target. Cerebellin 1 was the gene repressed by UBE3A that seemed to mediate its effects,” he said.

    In another series of experiments, Anderson and colleagues demonstrated an even more definitive link between UBE3A and CBLN1. Seizures are a common symptom among people with autism, including those with this genetic form.

    Seizures themselves, when sufficiently severe, also impaired sociability.

    Anderson’s team suspected this seizure-induced impairment of sociability was the result of repressing the Cerebellin genes. Indeed, the researchers found that deleting UBE3A, which acts upstream of the Cerebellin genes, prevented the seizure-induced social impairments and blocked seizures’ ability to repress CBLN1.

    “If you take away UBE3A, seizures can’t repress sociability or Cerebellin,” said Anderson. “The flip side is, if you have just a little extra UBE3A—as a subset of people with autism do—and you combine that with less severe seizures, you can get a full-blown loss of social interactions.”

    The researchers next conducted a variety of brain-mapping experiments to locate where in the brain these crucial seizure-gene interactions take place.

    “We mapped this seat of sociability to a surprising location,” Anderson explained. “Most scientists would have thought these interactions take place in the cortex—the area of the brain where sensory processing and motor commands take place—but, in fact, they take place in the brain stem, in the reward system.”

    Then the researchers used their engineered mouse model to confirm the precise location as the ventral tegmental area, part of the midbrain that plays a role in the reward system and addiction.

    Anderson and colleagues used chemogenetics—an approach that makes use of modified receptors introduced into neurons that respond to drugs but not to naturally occurring neurotransmitters—to switch this specific group of neurons on or off.

    Turning these neurons on could magnify sociability and rescue seizure and UBE3A-induced sociability deficits.

    “We were able to abolish sociability by inhibiting these neurons, and we could magnify and prolong sociability by turning them on,” said Anderson. “So we have a toggle switch for sociability. It has a therapeutic flavor; someday, we might be able to translate this into a treatment that will help patients.”

    The researchers thank Oriana DiStefano, Greg Salimando and Rebecca Broadhurst for colony work and the HMS Neurobiology Imaging Facility (NINDS P30 Core Center Grant #NS07203).

    This work was supported by an American Academy of Neurology Research Training Fellowship, the National Institutes of Health (grants 1R25NS070682, 1R01NS08916, 1R21MH100868 and 1R21HD079249), the Nancy Lurie Marks Family Foundation, the Landreth Family Foundation, the Simons Foundation, Autism Speaks/National Alliance for Autism Research and the Klarman Family Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Established in 1782, Harvard Medical School began with a handful of students and a faculty of three. The first classes were held in Harvard Hall in Cambridge, long before the school’s iconic quadrangle was built in Boston. With each passing decade, the school’s faculty and trainees amassed knowledge and influence, shaping medicine in the United States and beyond. Some community members—and their accomplishments—have assumed the status of legend. We invite you to access the following resources to explore Harvard Medical School’s rich history.

    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 9:48 am on March 20, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Crystallites, Disorder can be good, Pyrolysis, Vickers hardness test

    From MIT: “Disorder can be good” 

    MIT News

    March 17, 2017
    Denis Paiste

    MIT aerospace researchers have demonstrated that some randomness in the arrangement of carbon atoms makes materials that are lighter and stronger, shown at lower right in the illustration, compared to a more densely packed and tightly ordered structure, shown at lower left. They formed a type of disordered graphite-like carbon material, often called glassy carbon, by “baking” a phenol-formaldehyde hydrocarbon precursor at high temperature in inert gas, a process commonly known as pyrolysis. Illustration: Itai Stein

    Researchers discover that chaos makes carbon materials lighter and stronger.

    In the quest for more efficient vehicles, engineers are using harder, lower-density carbon materials, such as carbon fibers, which can be manufactured sustainably by “baking” naturally occurring soft hydrocarbons in the absence of oxygen. However, the optimal “baking” temperature for these hardened, charcoal-like carbon materials had remained a mystery since the 1950s, when British scientist Rosalind Franklin, perhaps better known for providing critical evidence of DNA’s double-helix structure, discovered how the carbon atoms in sugar, coal, and similar hydrocarbons react to temperatures approaching 3,000 degrees Celsius (5,432 degrees Fahrenheit) in oxygen-free processing. Confusion over whether disorder makes these graphite-like materials stronger or weaker prevented researchers from identifying the ideal “baking” temperature for more than 40 years.

    Fewer, more chaotically arranged carbon atoms produce higher-strength materials, MIT researchers report in the journal Carbon. They find a tangible link between the random ordering of carbon atoms within a phenol-formaldehyde resin, which was “baked” at high temperatures, and the strength and density of the resulting graphite-like carbon material. Phenol-formaldehyde resin is a hydrocarbon commonly known as “SU-8” in the electronics industry. Additionally, by comparing the performance of the “baked” carbon material, the MIT researchers identified a “sweet spot” manufacturing temperature: 1,000 C (1,832 F).

    “These materials we’re working with, which are commonly found in SU-8 and other hydrocarbons that can be hardened using ultraviolet [UV] light, are really promising for making strong and light lattices of beams and struts on the nanoscale, which only recently became possible due to advances in 3-D printing,” says MIT postdoc Itai Stein SM ’13, PhD ’16. “But up to now, nobody really knew what happens when you’re changing the manufacturing temperature, that is, how the structure affects the properties. There was a lot of work on structure and a lot of work on properties, but there was no connection between the two. … We hope that our study will help to shed some light on the governing physical mechanisms that are at play.”

    Stein, who is the lead author of the paper published in Carbon, led a team under professor of aeronautics and astronautics Brian L. Wardle, consisting of MIT junior Chlöe V. Sackier, alumni Mackenzie E. Devoe ’15 and Hanna M. Vincent ’14, and undergraduate Summer Scholars Alexander J. Constable and Naomi Morales-Medina.

    “Our investigations into this carbon material as a matrix for nanocomposites kept leading to more questions making this topic increasingly interesting in and of itself. Through a series of contributions, notably from MIT undergraduate researchers and Summer Scholars, a sustained investigation of several years resulted, allowing some paradoxical results in the extant literature to be resolved,” Wardle says.

    By “baking” the resin at high temperature in inert gas, a process commonly known as pyrolysis, the researchers formed a type of disordered graphite-like carbon material that is often called glassy carbon. Stein and Wardle showed that when it is processed at temperatures higher than 1,000 C, the material becomes more ordered but weaker. They estimated the strength of their glassy carbon by applying a local force and measuring their material’s ability to resist deformation. This type of measurement, which is known to engineers as the Vickers hardness test, is a highly versatile technique that can be used to study a wide variety of materials, such as metals, glasses, and plastics, and enabled the researchers to compare their findings to many well-known engineering materials that include diamond, carbon fiber composites, and metal carbides.
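
The Vickers number itself comes from a simple formula: the test load divided by the sloped contact area of the pyramidal indent, HV ≈ 1.8544·F/d², with F the load in kilograms-force and d the mean indent diagonal in millimetres. A quick sketch with illustrative numbers (not measurements from the paper):

```python
# Vickers hardness: HV = 2*F*sin(136deg/2) / d^2 ~= 1.8544 * F / d^2,
# where F is the load in kgf and d is the mean indentation diagonal in mm.
def vickers_hardness(load_kgf: float, diagonal_mm: float) -> float:
    return 1.8544 * load_kgf / diagonal_mm**2

# Illustrative test: a 0.5 kgf load leaving a 50-micrometre diagonal indent.
hv = vickers_hardness(load_kgf=0.5, diagonal_mm=0.05)
print(f"HV ~= {hv:.0f}")  # ~371
```

Because only a load and an optically measured indent size are needed, the same test can rank glassy carbon against metals, glasses, and plastics on one scale.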

    The carbon atoms within the MIT researchers’ material were more chaotically organized than is typical for graphite, because the phenol-formaldehyde with which they started is a complicated mix of carbon-rich compounds. “Because the hydrocarbon was disordered to begin with, a lot of the disorder remains in your crystallites, at least at this temperature,” Stein explains. In fact, the presence of more complex carbon compounds in the material strengthens it by leading to three-dimensional connections that are hard to break. “Basically you get pinned at the crystallite interface, and that leads to enhanced performance,” he says.

    These high-temperature baked materials have only one carbon atom in their structure for every three in a diamond structure. “When you’re using these materials to make nanolattices, you can make the overall lattice even less dense. Future studies should be able to show how to make lighter and cheaper materials,” Stein suggests. Hydrocarbons similar to the phenol-formaldehyde studied here can also be sourced in an environmentally friendly way, he says.

    “Up until now there wasn’t really consensus about whether having a low density was good or bad, and we’re showing in this work, that having a low density is actually good,” Stein says. That’s because low density in these crystallites means more molecular connections in three dimensions, which helps the material resist shearing, or sliding apart. Because of its low density, this material compares favorably to diamond and boron nitrides for aerospace uses. “Essentially, you can use a lot more of this material and still end up saving weight overall,” Stein says.

    “This study represents sound materials science — connecting all three facets of synthesis, structure, and property — toward elucidating poorly understood scaling laws for mechanical performance of pyrolytic carbon,” says Eric Meshot, a staff scientist at Lawrence Livermore National Laboratory, who was not involved in this research. “It is remarkable that by employing routinely available characterization tools, the researchers pieced together both the molecular and nanoscale structural pictures and deciphered this counterintuitive result that more graphitization does not necessarily equal a harder material. It is an intriguing concept in and of itself that a little structural disorder can enhance the hardness.”

    “Their structural characterization proves how and why they achieve high hardness at relatively low synthesis temperatures,” Meshot adds. “This could be impactful for industries seeking to scale up production of these types of materials since heating is a seriously costly step.” The study also points to new directions for making low-density composite structures with truly transformative properties, he suggests. “For example, by incorporating the starting SU-8 resin in, on, or around other structures (such as nanotubes as the authors suggest), can we synthesize materials that are even harder or more resistant to shear? Or composites that possibly embed additional functionality, such as sensing?” Meshot asks.

    The new research has particular relevance now because a group of German researchers showed last year in a Nature Materials paper how these materials can form highly structured nanolattices that are strong, lightweight, and are outperformed only by diamond. Those researchers processed their material at 900 C, Stein notes. “You can do a lot more optimization, knowing what the scaling is of the mechanical properties with the structure, then you can go ahead and tune the structure accordingly, and that’s where we believe there is broad implication for our work in this study,” he says.

    This work was partly supported by MIT’s Nano-Engineered Composite aerospace STructures (NECST) Consortium members Airbus Group, Boeing, Embraer, Lockheed Martin, Saab AB, ANSYS, Hexcel, and TohoTenax. Stein was supported, in part, by a National Defense Science and Engineering Graduate Fellowship.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
  • richardmitnick 8:12 pm on March 19, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Hydrogen On Demand

    From Technion: “Hydrogen On Demand” 

    Technion

    March 19, 2017

    Technion-Israel Institute of Technology researchers have developed a new approach to the production of hydrogen from water using solar energy. In findings published yesterday in Nature Materials, the researchers explain that this approach will make it possible to produce hydrogen cost-effectively, safely, and efficiently at a centralized point of sale (for example, at a filling station for electric cars fueled by hydrogen) located far from the solar farm. The new technology is expected to significantly reduce the cost of producing the hydrogen and shipping it to the customer.

    The study was led by Avigail Landman, a doctoral student in the Nancy & Stephen Grand Technion Energy Program (GTEP), and Dr. Hen Dotan from the Electrochemical Materials & Devices Lab. Ms. Landman is working on her doctorate under the guidance of Prof. Avner Rothschild from the Faculty of Materials Science and Engineering, and Prof. Gideon Grader, Dean of the Faculty of Chemical Engineering.

    The study published in Nature Materials was supported by the Israeli Centers of Research Excellence (I-CORE) for Solar Fuel Research (funded by the Planning and Budgeting Committee of the Council for Higher Education of Israel), the Ministry of National Infrastructures, Energy and Water, the European Fuel Cells and Hydrogen Joint Undertaking (FCH JU), the Grand Technion Energy Program (GTEP), donor Ed Satell and the Adelis Foundation.

    Hydrogen can be produced from water, and therefore production does not depend on access to non-renewable natural resources.
Using hydrogen fuel would reduce the dependence on fossil fuels such as oil and natural gas, whose availability depends on geographical, political and other factors, and would increase the energy available to Earth’s population. Unlike diesel and gasoline engines, which emit considerable pollution into the air, hydrogen fuel produces only water as a byproduct.

Because of the advantages of hydrogen fuel, many countries – led by Japan, Germany and the United States – are investing vast sums of money in programs for the development of environmentally friendly (“green”) technologies for the production of hydrogen. Most hydrogen is currently produced from natural gas in a process that emits carbon dioxide into the air, but it is also possible to produce hydrogen from water by splitting the water molecules into hydrogen and oxygen in a process called electrolysis. However, since electricity production itself is an expensive and polluting process, researchers at the Technion and around the world are developing a photoelectrochemical (PEC) cell that utilizes solar energy to split water into hydrogen and oxygen directly, without the need for an external power source.
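For scale, the thermodynamic floor on the energy needed to split water follows directly from the standard cell voltage. The values below are standard textbook constants, not figures from the Technion study; real electrolyzers operate above this floor:

```python
# Back-of-the-envelope minimum electrical energy to split water
# (illustrative; practical systems need more than this).
F = 96485.0      # Faraday constant, C/mol
E_rev = 1.23     # reversible cell voltage for water splitting, V
n = 2            # electrons transferred per H2 molecule

dG_per_mol = n * F * E_rev       # Gibbs free energy, ~237 kJ per mol of H2
M_H2 = 2.016e-3                  # molar mass of H2, kg/mol

kwh_per_kg = dG_per_mol / M_H2 / 3.6e6   # convert J/kg to kWh/kg
print(f"Thermodynamic minimum: {kwh_per_kg:.1f} kWh per kg of H2")  # ~33 kWh/kg
```

This roughly 33 kWh per kilogram minimum is why harvesting the input energy directly from sunlight, rather than from grid electricity, is attractive.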

    The main challenges in the development of PEC solar farms for the production of hydrogen are 1.) keeping the hydrogen and the oxygen separate from each other, 2.) collecting the hydrogen from millions of PEC cells, and 3.) transporting the hydrogen to the point of sale. The Technion team solved these challenges by developing a new method for PEC water splitting. With this method, the hydrogen and oxygen are formed in two separate cells – one that produces hydrogen, and another that produces oxygen. This is in contrast to the conventional method, in which the hydrogen and oxygen are produced within the same cell, and separated by a thin membrane that prevents them from intermixing and forming a flammable and explosive mixture.

The process allows geographic separation between the solar farm consisting of millions of PEC cells that produce oxygen exclusively, and the site where the hydrogen is produced in a centralized, cost-effective and efficient manner. The researchers accomplished this with a pair of auxiliary electrodes made of nickel hydroxide, an inexpensive material used in rechargeable batteries, and a metal wire connecting them.
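One way to read the scheme described above (a sketch based on standard alkaline electrochemistry; the exact cell chemistry is detailed in the Nature Materials paper and may differ) is that each cell pairs one gas-evolving half-reaction with charging or discharging of a nickel hydroxide auxiliary electrode, so that no membrane is needed:

```latex
\begin{align*}
% Hydrogen cell: water is reduced at the cathode while the auxiliary
% electrode is oxidized in place of oxygen evolution.
\text{H$_2$ cell, cathode:}\quad & 2\,\mathrm{H_2O} + 2e^- \rightarrow \mathrm{H_2} + 2\,\mathrm{OH^-}\\
\text{H$_2$ cell, auxiliary:}\quad & \mathrm{Ni(OH)_2} + \mathrm{OH^-} \rightarrow \mathrm{NiOOH} + \mathrm{H_2O} + e^-\\
% Oxygen cell (at the solar farm): the photoanode evolves oxygen while the
% auxiliary electrode is reduced; swapping electrodes between the two
% sites closes the cycle.
\text{O$_2$ cell, photoanode:}\quad & 4\,\mathrm{OH^-} \rightarrow \mathrm{O_2} + 2\,\mathrm{H_2O} + 4e^-\\
\text{O$_2$ cell, auxiliary:}\quad & \mathrm{NiOOH} + \mathrm{H_2O} + e^- \rightarrow \mathrm{Ni(OH)_2} + \mathrm{OH^-}
\end{align*}
```

Because the electrons exchanged with the missing gas electrode are stored in the Ni(OH)₂/NiOOH couple, swapping the auxiliary electrodes between sites effectively transports the reaction rather than the gas.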
    “In the present article, we describe a new method for producing hydrogen through the physical separation of hydrogen production and oxygen production,” says Ms. Landman. “According to our cost estimate, our method could successfully compete with existing water splitting methods and serve as a cheap and safe platform for the production of hydrogen.”

The vision of the Technion researchers is geographic separation between the sites where the oxygen and hydrogen are produced: at one site, there will be a solar farm that will collect the sun’s energy and produce oxygen, while hydrogen is produced in a centralized manner at another site, miles away. Thus, instead of transporting compressed hydrogen from the production site to the sales point, it will only be necessary to swap the auxiliary electrodes between the two sites. Economic calculations performed in collaboration with research fellows from Evonik Creavis GmbH and the Institute of Solar Research at the German Aerospace Center (DLR) indicate the potential for significant savings in the setup and operating costs of hydrogen production.

    The method developed at the Technion for separating hydrogen production and oxygen production was the basis for the development of new two-stage electrolysis technology. This technology, which was developed by Dr. Hen Dotan, enables hydrogen production at high pressure and with unprecedented efficiency, thus significantly reducing hydrogen production costs. The new technology is now in its pre-industrial development stage.

    See the full article here.


    Technion Campus

    A science and technology research university, among the world’s top ten,
    dedicated to the creation of knowledge and the development of human capital and leadership,
    for the advancement of the State of Israel and all humanity.

     
  • richardmitnick 12:18 pm on March 18, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Autism and Genius,   

    From NOVA: “A Genetic Link Between Autism and Prodigy?” 

    PBS NOVA

    NOVA

    MAR 15 2017
    Renee Morad

    A psychologist looks for a link between autism and child geniuses in families.

    1
    [Photo: Shutterstock; App Photo: Wikimedia]

    2
    Autism. Illustration by Bobby Lopez

    Ohio State University psychologist Joanne Ruthsatz, author of The Prodigy’s Cousin, once tested a child prodigy’s IQ. In the middle of her assessment, the child asked for a break at McDonald’s. As they were eating, the child genius’s autistic cousin walked in, and the coincidence made Ruthsatz wonder: What are the chances of having a child prodigy and an autistic child in the same family?

The question motivated her to find some answers. So she went on to study prodigies who reached a professional level before age 10. After examining their DNA and that of their families, she discovered that half of the prodigies had an autistic relative as close as a grandparent or niece. She also found that both the prodigies and their autistic relatives seemed to have evidence of a genetic mutation or mutations on the short arm of chromosome 1 that was not shared by their neurotypical relatives. The two also shared some characteristics.

    We caught up with Ruthsatz to talk about how her findings might help answer questions about autism. Our condensed and edited conversation follows.

    How might the link between prodigy and autism help us better understand both traits?

    Ruthsatz: Well, if we could find how they are different from their neurotypical relatives, that would lead the way to better medicine for autism. What we’re looking for is a genetic marker that prodigies have that their neurotypical or autistic relatives do not have. More than 50 percent of children who are prodigies have autistic first or second relatives. That’s way too much. It’s a big marker, a big flag. Now we’re working to find out where the difference is, since we know where the similarity is.

    What strengths did you find among prodigies who excel in math and science?

    This group had huge visual-spatial skills. They were able to see visually and report the difference, telling exactly how to get from point A to point B and, miraculously, whether it was northeast, left, right or so on. I didn’t cue them; they just knew. But artistic prodigies were below average on this skill. Some of the artistic prodigies couldn’t have told me left from right.

    What did the music geniuses excel in?

    The music prodigies had the strongest memories. In fact, all music geniuses had a score above 99 percent on working memory. They had significantly better working memories than the other types of prodigies.

    When you compared prodigies and those with autism, what similarities did you find?

    They all have an obsession in something, or what we’d call a “rage to master” in prodigies. They both have strong working memories. They all usually come from families that have engineers or scientists or professors. Well, not all of them, but more than you’d expect. Some come from very normal families, some working-class — and many have autistic relatives.

    You uncovered evidence that prodigies have a very extreme sense of empathy. Can you explain?

    One of the prodigies started a charity that raised $8 million for children with neuro diseases. He was so in tune with these patients that he used to play little concerts for them in the hospital, and his efforts got bigger and bigger. He raised a lot of money for research. Another one focused on feeding starving children. They are very sensitive to the human condition. Now, with autistic individuals, there’s this misunderstanding that they don’t care, but I think they care so much that they don’t know what to do with it — they’re super sensitive.

    What do you find most interesting about child geniuses?

    They are just so extremely rare, and we’re almost seeing an evolution in genetic research that shows that as the world goes on, the gene pool changes. You can go back to Mozart, and he certainly had an autistic background, but we’re finding that more and more. I think we’re seeing an evolution of extreme talent.

    What do you suspect your latest research might lead to?

    We are hoping to arrive at the prodigy gene that allows all the deficits in autism to be put at bay, letting the talent shine through. We think it’s going to be one or two genes. We don’t think they will be massive genes that are different. We think it’s going to be a moderator that lets prodigies be social and live their lives functionally where autistic savants cannot … and finding that difference might lead to better medicine for people with autism.

    See the full article here.


    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 4:35 pm on March 17, 2017 Permalink | Reply
    Tags: Applied Research & Technology, , , , Scientists make microscopes from droplets, Tunable microlenses   

    From MIT: “Scientists make microscopes from droplets” 

    MIT News

    MIT Widget

    MIT News

    March 10, 2017
    Jennifer Chu

    1
    Researchers at MIT have devised tiny “microlenses” from complex liquid droplets, such as these pictured here, that are comparable in size to the width of a human hair. Courtesy of the researchers

    With chemistry and light, researchers can tune the focus of tiny beads of liquid.

    Liquid droplets are natural magnifiers. Look inside a single drop of water, and you are likely to see a reflection of the world around you, close up and distended as you’d see in a crystal ball.

    Researchers at MIT have now devised tiny “microlenses” from complex liquid droplets comparable in size to the width of a human hair. They report the advance this week in the journal Nature Communications.

    Each droplet consists of an emulsion, or combination of two liquids, one encapsulated in the other, similar to a bead of oil within a drop of water. Even in their simple form, these droplets can magnify and produce images of surrounding objects. But now the researchers can also reconfigure the properties of each droplet to adjust the way they filter and scatter light, similar to adjusting the focus on a microscope.

    The scientists used a combination of chemistry and light to precisely shape the curvature of the interface between the internal bead and the surrounding droplet. This interface acts as a kind of internal lens, comparable to the compounded lens elements in microscopes.

    “We have shown fluids are very versatile optically,” says Mathias Kolle, the Brit and Alex d’Arbeloff Career Development Assistant Professor in MIT’s Department of Mechanical Engineering. “We can create complex geometries that form lenses, and these lenses can be tuned optically. When you have a tunable microlens, you can dream up all sorts of applications.”

    For instance, Kolle says, tunable microlenses might be used as liquid pixels in a three-dimensional display, directing light to precisely determined angles and projecting images that change depending on the angle from which they are observed. He also envisions pocket-sized microscopes that could take a sample of blood and pass it over an array of tiny droplets. The droplets would capture images from varying perspectives that could be used to recover a three-dimensional image of individual blood cells.

    “We hope that we can use the imaging capacity of lenses on the microscale combined with the dynamically adjustable optical characteristics of complex fluid-based microlenses to do imaging in a way people have not done yet,” Kolle says.

    Kolle’s MIT co-authors are graduate student and lead author Sara Nagelberg, former postdoc Lauren Zarzar, junior Natalie Nicolas, former postdoc Julia Kalow, research affiliate Vishnu Sresht, professor of chemical engineering Daniel Blankschtein, professor of mechanical engineering George Barbastathis, and John D. MacArthur Professor of Chemistry Timothy Swager. Moritz Kreysing and Kaushikaram Subramanian of the Max Planck Institute of Molecular Cell Biology and Genetics are also co-authors.

    Shaping a curve

    The group’s work builds on research by Swager’s team, which in 2015 reported a new way to make and reconfigure complex emulsions. In particular, the team developed a simple technique to make and control the size and configuration of double emulsions, such as water that was suspended in oil, then suspended again in water. Kolle and his colleagues used the same techniques to make their liquid lenses.

    They first chose two transparent fluids, one with a higher refractive index (a property that relates to the speed at which light travels through a medium), and the other with a lower refractive index. The contrast between the two refractive indices can contribute to a droplet’s focusing power. The researchers poured the fluids into a vial, heated them to a temperature at which the fluids would mix, then added a water-surfactant solution. When the liquids were mixed rapidly, tiny emulsion droplets formed. As the mixture cooled, the fluids in each of the droplets separated, resulting in droplets within droplets.

To manipulate the droplets’ optical properties, the researchers added certain concentrations and ratios of various surfactants — chemical compounds that lower the interfacial tension between two liquids. In this case, one of the surfactants the team chose was a light-sensitive molecule. When exposed to ultraviolet light, this molecule changes its shape, which modifies the tension at the droplet-water interfaces and the droplet’s focusing power. This effect can be reversed by exposure to blue light.

    “We can change focal length, for example, and we can decide where an image is picked up from, or where a laser beam focuses to,” Kolle says. “In terms of light guiding, propagation, and tailoring of light flow, it’s really a good tool.”
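The tuning described here can be sketched with first-order optics: a single refracting spherical interface has optical power (n₂ − n₁)/R, so changing either the interface curvature (via surfactants and light) or the index contrast retunes the lens. The indices and radii below are hypothetical, not the fluids used in the MIT study:

```python
# Paraxial focal length of a single spherical interface between two media.
# Illustrative values only; not the droplet chemistry from the study.
def interface_focal_length(n1, n2, radius_m):
    """Image-side focal length of a spherical interface of radius R
    between media of refractive index n1 -> n2: P = (n2 - n1)/R, f = n2/P."""
    power = (n2 - n1) / radius_m   # optical power in diopters (1/m)
    return n2 / power

# Hypothetical droplet with an index contrast of 1.33 vs 1.50:
f_gentle = interface_focal_length(1.33, 1.50, 100e-6)  # shallow curvature
f_tight = interface_focal_length(1.33, 1.50, 50e-6)    # surfactant-steepened
print(f_gentle, f_tight)  # halving the radius halves the focal length
```

Steepening the internal curvature (smaller R) shortens the focal length, which is the knob the light-sensitive surfactants turn.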

    Optics on the horizon

Kolle and his colleagues tested the properties of the microlenses through a number of experiments, including one in which they poured droplets into a shallow plate placed under a stencil, or “photomask,” with a cutout of a smiley face. When they turned on an overhead UV lamp, the light filtered through the holes in the photomask, activating the surfactants in the droplets underneath. Those droplets, in turn, switched from their original flat interface to a more curved one, which strongly scattered light, thereby generating a dark pattern in the plate that resembled the photomask’s smiley face.

    The researchers also describe their idea for how the microlenses might be used as pocket-sized microscopes. They propose forming a microfluidic device with a layer of microlenses, each of which could capture an image of a tiny object flowing past, such as a blood cell. Each image would be captured from a different perspective, ultimately allowing recovery of information about the object’s three-dimensional shape.

    “The whole system could be the size of your phone or wallet,” Kolle says. “If you put some electronics around it, you have a microscope where you can flow blood cells or other cells through and visualize them in 3-D.”

    He also envisions screens, layered with microlenses, that are designed to refract light into specific directions.

    “Can we project information to one part of a crowd and different information to another part of crowd in a stadium?” Kolle says. “These kinds of optics are challenging, but possible.”

    This research was supported, in part, by the National Science Foundation, the Natural Sciences and Engineering Research Council of Canada, and the Max Planck Society.

    See the full article here.



     
  • richardmitnick 1:59 pm on March 17, 2017 Permalink | Reply
    Tags: , Applied Research & Technology, , , Mapping the Topographic Fingerprints of Humanity Across Earth   

    From Eos: “Mapping the Topographic Fingerprints of Humanity Across Earth” 

    AGU bloc

    AGU
    Eos news bloc

    Eos

    16 March 2017
    Paolo Tarolli
    Giulia Sofia
    Erle Ellis

    1
    Fig. 1. Three-dimensional view of Bingham Canyon Mine, Utah, a human-made topographic signature, based on a free, open-access high-resolution data set. Credit: Data from Utah AGRC

    Since geologic time began, Earth’s surface has been evolving through natural processes of tectonic uplift, volcanism, erosion, and the movement of sediment. Now a new force of global change is altering Earth’s surface and morphology in unprecedented ways: humanity.

    Human activities are leaving their fingerprints across Earth (Figure 1), driven by increasing populations, technological capacities, and societal demands [e.g., Ellis, 2015; Brown et al., 2017; Waters et al., 2016]. We have altered flood patterns, created barriers to runoff and erosion, funneled sedimentation into specific areas, flattened mountains, piled hills, dredged land from the sea, and even triggered seismic activity [Tarolli and Sofia, 2016]. These and other changes can pose broad threats to the sustainability of human societies and environments.

    If increasingly globalized societies are to make better land management decisions, the geosciences must globally evaluate how humans are reshaping Earth’s surface. A comprehensive mapping of human topographic signatures on a planet-wide scale is required if we are to understand, model, and forecast the geological hazards of the future.

    Understanding and addressing the causes and consequences of anthropogenic landform modifications are a worldwide challenge. But this challenge also poses an opportunity to better manage environmental resources and protect environmental values [DeFries et al., 2012].

    The Challenge of Three Dimensions

    “If life happens in three dimensions, why doesn’t science?” This question, posed more than a decade ago in Nature [Butler, 2006], resonates when assessing human reshaping of Earth’s landscapes.

    Landforms are shaped in three dimensions by natural processes and societal demands [e.g., Sidle and Ziegler, 2012; Guthrie, 2015]; societies in turn are shaped by the landscapes they alter. Understanding and modeling these interacting forces across Earth are no small challenge.

    For example, observing and modeling the direct effects of some of the most widespread forms of human topographic modification, such as soil tillage and terracing [Tarolli et al., 2014], are possible only with very fine spatial resolutions (i.e., ≤1 meter). Yet these features are common all over the world. High-resolution three-dimensional topographic data at global scales are needed to observe and appraise them.

    The Need for a Unified, Global Topographic Data Set

High-resolution terrain data such as lidar [Tarolli, 2014], aerial photogrammetry [Eltner et al., 2016], and satellite observations [Famiglietti et al., 2015] are increasingly available to the scientific community. These data sets are also becoming available to land planners and the public, as governments, academic institutions, and others in the remote sensing community seize the opportunity for high-resolution topographic data sharing (Figure 2) [Wulder and Coops, 2014; Verburg et al., 2015].

    2
    Fig. 2. High-resolution geodata reveal the topographic fingerprints of humanity: (a) terraces in the Philippines, (b) agricultural practices in Germany, and (c) roads in Antarctica. The bottom images are lidar images of the same landscapes. Credit: Data from University of the Philippines TCAGP/Freie und Hansestadt Hamburg/Noh and Howat [2015]. Top row: © Google, DigitalGlobe

    Thanks to these geodata, anthropogenic signatures are widely observable across the globe, under vegetation cover (Figure 2a), at very fine spatial scales (e.g., agricultural practices and plowing; Figure 2b) and at large spatial scales (e.g., major open pit mines; Figure 3), and far from contemporary human settlements (Figure 2c). So the potential to assess the global topographic fingerprints of humanity using high-resolution terrain data is a tantalizing prospect.

However, despite a growing number of local projects at fine scales, a global data set remains elusive. This lack of global data is largely the result of technical challenges to sharing very large data sets and issues of data ownership and permissions.

    But once a global database exists, advances in the technical capacity to handle and analyze large data sets could be utilized to map anthropogenic signatures in detail (e.g., using a close-range terrestrial laser scanner) and across larger areas (e.g., using satellite data). Together with geomorphic analyses, the potential is clear for an innovative, transformative, and global-scale assessment of the extent to which humans shape Earth’s landscapes.

    For example, a fine-scale analysis of terrain data can detect specific anthropogenic configurations in the organization of surface features (Figure 3b) [Sofia et al., 2014], revealing modifications that humans make across landscapes (Figure 3c). Such fine-scale geomorphic changes are generally invisible to coarser scales of observation and analysis, making it appear that natural landforms and natural hydrological and sedimentary processes are unaltered. Failure to observe such changes misrepresents the true extent and form of human modifications of terrain, with huge consequences when inaccurate data are used to assess risks from runoff, landslides, and other geologic hazards to society [Tarolli, 2014].

    3
Fig. 3. Anthropogenic topographic signatures can be detected from satellite data. (a) This satellite image shows an open-pit mine in North Korea. (b) That image has been processed in an autocorrelation analysis, a measure of the organization of the topography (slope local length of autocorrelation, SLLAC [Sofia et al., 2014]). The variation in the natural landscape is noisy (e.g., top right corner), whereas anthropogenic structures are more organized and leave a clear topographic signature. (c) The degree of landscape organization can be empirically related to the amount of human-made alterations to the terrain, as demonstrated by Sofia et al. [2014]. Credit: Data from CNES© Distribution Airbus DS
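The intuition behind the autocorrelation approach in Figure 3 can be sketched in a few lines. This is a toy illustration, not the published SLLAC metric, and the two surfaces are invented: organized, human-made structure keeps a slope map correlated over long lags, while rough natural terrain decorrelates quickly.

```python
# Toy slope-map autocorrelation: organized terrain vs. random roughness.
import numpy as np

def slope_autocorrelation(dem):
    """Normalized 2-D circular autocorrelation of a DEM's slope magnitude."""
    gy, gx = np.gradient(dem.astype(float))
    slope = np.hypot(gx, gy)
    s = slope - slope.mean()            # remove the mean before correlating
    spec = np.fft.fft2(s)
    ac = np.fft.ifft2(spec * np.conj(spec)).real
    return np.fft.fftshift(ac) / ac.flat[0]   # zero-lag peak normalized to 1

rng = np.random.default_rng(0)
x = np.arange(64)
terraced = np.tile(np.sin(x / 4.0), (64, 1))   # hypothetical "terraced" surface
natural = rng.standard_normal((64, 64))        # hypothetical rough natural terrain

ac_t = slope_autocorrelation(terraced)
ac_n = slope_autocorrelation(natural)
# The organized surface stays strongly correlated away from zero lag:
print(np.abs(ac_t).mean(), np.abs(ac_n).mean())
```

The published metric quantifies this organization locally across a landscape; the point of the sketch is only that regular, engineered geometry leaves a measurably more coherent slope signal than natural variability.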

    Topography for Society

    A global map of the topographic signatures of humanity would create an unparalleled opportunity to change both scientific and public perspectives on the human role in reshaping Earth’s land surface. A worldwide inventory of anthropogenic geomorphologies would enable geoscientists to assess the extent to which human societies have reshaped geomorphic processes globally and provide a tool for monitoring these changes over time.

    Such monitoring would facilitate unprecedented insights into the dynamics and sensitivity of landscapes and their responses to human forcings at global scale. In turn, these insights would help cities, resource managers, and the public better understand and mediate their social and environmental actions.

    As we move deeper into the Anthropocene, a comprehensive mapping of human topographic signatures will be increasingly necessary to understand, model, and forecast the geological hazards of the future. These hazards will likely be manifold.

    4
    Fig. 4. (a) This road, in the HJ Andrews Experimental Forest in Oregon’s Cascade Range, was constructed in 1952. A landslide occurred in 1964, and its scar was still visible in 1994, when the image was acquired. The landslide starts from the road and flows toward the top right corner of the image. (b) An index called the relative path impact index (RPII) [Tarolli et al., 2013] is evaluated here using a lidar data set from 2008. The RPII analyzes the potential water surface flow accumulation based on the lidar digital terrain model, and the index is highest where the flows are increased because of the presence of anthropogenic features. High values beyond one standard deviation (σ) highlight potential road-induced erosion. Credit: Data from NSF LTER, USFS Research, OSU; background image © Google, USGS.

    For example, landscapes across the world face altered flooding regimes in densely populated floodplains, erosion rates associated with road networks, altered runoff and erosion due to agricultural practices, and sediment release and seismic activity from mining [Tarolli and Sofia, 2016]. Modifications in land use (e.g., urbanization and changes in agricultural practices) alter water infiltration and runoff production, increasing flooding risks in floodplains. Increases in road density cause land degradation and erosion (Figure 4), especially when roads are poorly planned and constructed without well-designed drainage systems, leading to destabilized hillslopes and landslides. Erosion from agricultural fields can exceed rates of soil production, causing soil degradation and reducing crop yields, water quality, and food production. Mining areas, even years after reclamation, can induce seismicity, landslides, soil erosion, and terrain collapse, damaging environments and surface structures.
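The kind of terrain analysis behind indices like the RPII in Figure 4, routing surface flow over a digital terrain model, can be illustrated with a toy D8 flow accumulation. This is a sketch under simple assumptions (synthetic terrain, steepest-descent routing), not the published implementation:

```python
# Toy D8 flow accumulation: each cell drains to its steepest-descent
# neighbor; the count of upslope cells through each cell shows where
# flow (and potential erosion) concentrates.
import numpy as np

def d8_flow_accumulation(dem):
    """Return the number of cells (self included) draining through each cell."""
    rows, cols = dem.shape
    acc = np.ones(dem.shape, dtype=float)
    # Visit cells from highest to lowest, so each cell's count is final
    # before it is passed downslope.
    for idx in np.argsort(dem, axis=None)[::-1]:
        r, c = divmod(idx, cols)
        best_drop, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    drop = (dem[r, c] - dem[nr, nc]) / np.hypot(dr, dc)
                    if drop > best_drop:
                        best_drop, target = drop, (nr, nc)
        if target is not None:
            acc[target] += acc[r, c]
    return acc

# Hypothetical terrain: a tilted plane with a central valley; all flow
# funnels to the valley outlet at dem[4, 2].
dem = np.add.outer(np.arange(5, 0, -1.0), np.abs(np.arange(-2, 3)) * 0.5)
print(d8_flow_accumulation(dem))
```

An anthropogenic feature such as a road cut changes the elevations in such a grid and therefore redirects where accumulation concentrates, which is the effect indices like the RPII are designed to flag.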

    Without accurate data on anthropogenic topography, communities will find it difficult to develop and implement strategies and practices aimed at reducing or mitigating the social and environmental impacts of anthropogenic geomorphic change.

    Earth Science Community’s Perspective Needed

    Technological advances in Earth observation have made possible what might have been inconceivable just a few years ago. A global map and inventory of human topographic signatures in three dimensions at high spatial resolution can now become a reality.

    Collecting and broadening access to high spatial resolution (meter to submeter scale), Earth science–oriented topography data acquired with lidar and other technologies would promote scientific discovery while fostering international interactions and knowledge exchange across the Earth science community. At the same time, enlarging the search for humanity’s topographical fingerprints to the full spectrum of environmental and cultural settings across Earth’s surface will require a more generalized methodology for discovering and assessing these signatures.

    These two parallel needs are where scientific efforts should focus. It is time for the Earth science community to come together and bring the topographic fingerprints of humanity to the eyes and minds of the current and future stewards, shapers, curators, and managers of Earth’s land surface.
    Acknowledgments

    Data sets for Figure 1 are from Utah Automated Geographic Reference Center (AGRC), Geospatial Information Office. Data sets for Figures 2(a)–2(c) are from the University of the Philippines Training Center for Applied Geodesy and Photogrammetry (TCAGP), Noh and Howat [2015], and Freie und Hansestadt Hamburg (from 2014), respectively. Data sets for Figure 3 are from Centre National d’Études Spatiales (CNES©), France, Distribution Airbus DS. Data sets for Figure 4 are from the HJ Andrews Experimental Forest research program, National Science Foundation’s Long-Term Ecological Research Program (NSF LTER, DEB 08-23380), U.S. Forest Service (USFS) Pacific Northwest Research Station, and Oregon State University (OSU).
    References

    Butler, D. (2006), Virtual globes: The web-wide world, Nature, 439, 776–778, https://doi.org/10.1038/439776a.

    Brown, A. G., et al. (2017), The geomorphology of the Anthropocene: Emergence, status and implications, Earth Surf. Processes Landforms, 42, 71–90, https://doi.org/10.1002/esp.3943.

    DeFries, R. S., et al. (2012), Planetary opportunities: A social contract for global change science to contribute to a sustainable future, BioScience, 62, 603–606, https://doi.org/10.1525/bio.2012.62.6.11.

    Ellis, E. C. (2015), Ecology in an anthropogenic biosphere, Ecol. Monogr., 85, 287–331, https://doi.org/10.1890/14-2274.1.

    Eltner, A., et al. (2016), Image-based surface reconstruction in geomorphometry—Merits, limits and developments, Earth Surf. Dyn., 4, 359–389, https://doi.org/10.5194/esurf-4-359-2016.

    Famiglietti, J. S., et al. (2015), Satellites provide the big picture, Science, 349, 684–685, https://doi.org/10.1126/science.aac9238.

Guthrie, R. (2015), The catastrophic nature of humans, Nat. Geosci., 8, 421–422, https://doi.org/10.1038/ngeo2455.

    Noh, M. J., and I. M. Howat (2015), Automated stereo-photogrammetric DEM generation at high latitudes: Surface Extraction with TIN-based Search-space Minimization (SETSM) validation and demonstration over glaciated regions, GIScience Remote Sens., 52(2), 198–217, https://doi.org/10.1080/15481603.2015.1008621.

Sidle, R. C., and A. D. Ziegler (2012), The dilemma of mountain roads, Nat. Geosci., 5, 437–438, https://doi.org/10.1038/ngeo1512.

    Sofia, G., F. Marinello, and P. Tarolli (2014), A new landscape metric for the identification of terraced sites: The slope local length of auto-correlation (SLLAC), ISPRS J. Photogramm. Remote Sens., 96, 123–133, https://doi.org/10.1016/j.isprsjprs.2014.06.018.

    Tarolli, P. (2014), High-resolution topography for understanding Earth surface processes: Opportunities and challenges, Geomorphology, 216, 295–312, https://doi.org/10.1016/j.geomorph.2014.03.008.

    Tarolli, P., and G. Sofia (2016), Human topographic signatures and derived geomorphic processes across landscapes, Geomorphology, 255, 140–161, https://doi.org/10.1016/j.geomorph.2015.12.007.

    Tarolli, P., et al. (2013), Recognition of surface flow processes influenced by roads and trails in mountain areas using high-resolution topography, Eur. J. Remote Sens., 46, 176–197.

    Tarolli, P., F. Preti, and N. Romano (2014), Terraced landscapes: From an old best practice to a potential hazard for soil degradation due to land abandonment, Anthropocene, 6, 10–25, https://doi.org/10.1016/j.ancene.2014.03.002.

    Verburg, P. H., et al. (2015), Land system science and sustainable development of the Earth system: A global land project perspective, Anthropocene, 12, 29–41, https://doi.org/10.1016/j.ancene.2015.09.004.

    Waters, C. N., et al. (2016), The Anthropocene is functionally and stratigraphically distinct from the Holocene, Science, 351, aad2622, https://doi.org/10.1126/science.aad2622.

    Wulder, M. A., and N. C. Coops (2014), Satellites: Make Earth observations open access, Nature, 513, 30–31, https://doi.org/10.1038/513030a.

    —Paolo Tarolli (email: paolo.tarolli@unipd.it; @TarolliP) and Giulia Sofia (@jubermensch2), Department of Land, Environment, Agriculture, and Forestry, University of Padova, Legnaro, Italy; and Erle Ellis (@erleellis), Department of Geography and Environmental Systems, University of Maryland, Baltimore County, Baltimore
    Citation: Tarolli, P., G. Sofia, and E. Ellis (2017), Mapping the topographic fingerprints of humanity across Earth, Eos, 98, https://doi.org/10.1029/2017EO069637. Published on 16 March 2017.
    © 2017. The authors. CC BY-NC-ND 3.0

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 11:17 am on March 14, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Deccan Traps, Earth’s lost history of planet-altering eruptions revealed, Enormous volcanoes vomited lava over the ancient Earth, Venus Mars Mercury and the Moon all show signs of enormous eruptions

    From Nature: “Earth’s lost history of planet-altering eruptions revealed” 

    Nature

    14 March 2017
    Alexandra Witze

    India’s Western Ghats mountains contain igneous rock deposited 66 million years ago by a volcanic eruption in the Deccan Traps. Dinodia Photos/Getty

    Enormous volcanoes vomited lava over the ancient Earth much more often than geologists had suspected. Eruptions as big as the biggest previously known ones happened at least 10 times in the past 3 billion years, an analysis of the geological record shows.

    Such eruptions are linked with some of the most profound changes in Earth’s history. These include the biggest mass extinction, which happened 252 million years ago when volcanoes blanketed Siberia with molten rock and poisonous gases.

    “As we go back in time, we’re discovering events that are every bit as big,” says Richard Ernst, a geologist at Carleton University in Ottawa, Canada, and Tomsk State University in Russia, who led the work. “These are magnificent huge things.”

    Knowing when and where such eruptions occurred can help geologists to pinpoint ore deposits, reconstruct past supercontinents and understand the birth of planetary crust. Studying this type of volcanic activity on other planets can even reveal clues to the geological history of the early Earth.

    Ernst presented the findings this month to an industry consortium that funded the work (see ‘Earth’s biggest eruptions’). He expects to make the data public by the end of the year, through a map from the Commission for the Geological Map of the World in Paris.


    “This will probably be the defining database for the next decade,” says Mike Coffin, a marine geophysicist at the University of Tasmania in Hobart, Australia.

    Surprisingly, the ancient eruptions lurk almost in plain sight. The lava they spewed has long since eroded away, but the underlying plumbing that funnelled molten rock from deep in the Earth up through the volcanoes is still there.

    Telltale tips

    Ernst and his colleagues scoured the globe for traces of this plumbing. It usually appears as radial spokes of ancient squirts of lava, fanned out around the throat of a long-gone volcano. The geologists mapped these features, known as dyke swarms, and used uranium–lead dating to pinpoint the age of the rock in each dyke. By matching the ages of the dykes, the researchers could connect those that came from a single huge eruption. During their survey, they found evidence of many of these major volcanic events.
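The age-matching step described above can be sketched as a simple clustering problem: dykes whose uranium–lead ages agree within dating uncertainty are grouped as candidates for a single large eruption. This is a hypothetical illustration, not the team's actual method; the dyke names, ages, and tolerance are invented for the example (the ~1,320-million-year ages echo the Australia–China match mentioned below).

```python
# Hypothetical sketch: group dykes whose U-Pb ages agree within a tolerance,
# treating each group as a candidate single large eruption. Names, ages, and
# the tolerance are illustrative values, not real survey data.

def group_dykes_by_age(dykes, tolerance_myr=2.0):
    """Cluster (name, age_myr) pairs whose ages fall within tolerance_myr."""
    if not dykes:
        return []
    ordered = sorted(dykes, key=lambda d: d[1])
    groups = [[ordered[0]]]
    for dyke in ordered[1:]:
        # Ages are sorted, so compare against the last dyke in the open group.
        if dyke[1] - groups[-1][-1][1] <= tolerance_myr:
            groups[-1].append(dyke)
        else:
            groups.append([dyke])
    return groups

# Two swarms on different continents with matching ~1320-Myr ages cluster
# together; an unrelated younger swarm stays separate.
dykes = [("Australia-A", 1320.5), ("NorthChina-B", 1321.2), ("Siberia-C", 252.3)]
for group in group_dykes_by_age(dykes):
    print([name for name, _ in group])
```

In practice each U–Pb date carries its own uncertainty, so a real analysis would compare overlapping error bars rather than use a single fixed tolerance.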

    Each of those newly identified eruptions goes into Ernst’s database. “We’ve got about 10 or 15 so far that are probably comparable to the Siberian event,” Ernst says, “that we either didn’t know about or had a little taste, but no idea of their true extent.”

    They include a 1.32-billion-year-old eruption in Australia that connects to one in northern China. By linking dyke swarms across continents, scientists can better understand how Earth’s crust has shuffled around over time, says Nasrrddine Youbi, a geologist at Cadi Ayyad University in Marrakesh.

    Technically, the eruptions are known as ‘large igneous provinces’ (LIPs). They can spew more than one million cubic kilometres of rock in a few million years. By comparison, the 1980 eruption of Mount St Helens in Washington state put out just 10 cubic kilometres.
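The scale gap in the figures quoted above is worth making explicit. A quick back-of-the-envelope calculation:

```python
# Comparison from the figures quoted in the article: a large igneous province
# can erupt more than 1,000,000 km^3 of rock, versus about 10 km^3 for the
# 1980 Mount St Helens eruption.
lip_volume_km3 = 1_000_000
st_helens_km3 = 10
ratio = lip_volume_km3 / st_helens_km3
print(f"A LIP emits at least {ratio:,.0f} times the 1980 Mount St Helens volume")
```

That is a factor of at least 100,000, which is why these events can reshape atmosphere and ocean chemistry while ordinary eruptions cannot.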

    These large events also emit gases that can change atmospheric temperature and ocean chemistry in a geological blink of an eye. A modelling study published last month suggests that global temperatures could have soared by as much as 7 °C per year at the height of the Siberian eruptions (F. Stordal et al. Palaeogeogr. Palaeoclimatol. Palaeoecol. 471, 96–107; 2017). Sulfur particles from the eruptions would have soon led to global cooling and acid rain; more than 96% of marine species went extinct.

    But the picture of how LIPs affected the global environment gets murkier the further back in time you get, says Morgan Jones, a volcanologist at the University of Oslo. Uncertainties in dating grow, and it becomes hard to correlate individual eruptions with specific environmental impacts. “It’s at the limit of our understanding,” he says.

    On average, LIPs occur every 20 million years or so. The most recent one was the Columbia River eruption 17 million years ago, in what is now the northwestern United States.
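A rough expectation follows from the quoted average rate. Note this is a naive extrapolation for illustration only: the "10 or 15" events Ernst counts are only the Siberian-scale giants, not all LIPs.

```python
# Naive arithmetic from the quoted average rate: one LIP roughly every
# 20 million years over the 3-billion-year span the survey covers.
span_years = 3e9
mean_interval_years = 20e6
expected_lips = span_years / mean_interval_years
print(f"~{expected_lips:.0f} LIPs expected over 3 billion years at that rate")
```

The gap between that expectation and the handful of giants identified so far hints at how much of the older record has eroded away or remains unrecognized.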

    Discovering more LIPs on Earth helps to put the geological history of neighbouring planets in perspective, says Tracy Gregg, a volcanologist at the University at Buffalo in New York. She and Ernst will lead a meeting on LIPs across the Solar System at a planetary-science meeting in Texas next week.

    Venus, Mars, Mercury and the Moon all show signs of enormous eruptions, Gregg notes. On the Moon, LIP-style volcanism started as early as 3.8 billion years ago; on Mars, possibly 3.5 billion years ago. But without plate tectonics to keep the surface active, those eruptions eventually ceased.

    “Other planetary bodies retain information about the earliest parts of planetary evolution, information that we’ve lost on Earth,” Gregg says. “They can give us a window into the early history of our own planet.”

    See the full article here.


    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     