Tagged: Applied Research & Technology

  • richardmitnick 3:41 pm on April 26, 2016
    Tags: Applied Research & Technology

    From LBL: “Seeing Atoms and Molecules in Action with an Electron ‘Eye’ “ 

    Berkeley Logo

    Berkeley Lab

    April 26, 2016
    Glenn Roberts Jr.
    510-486-5582
    geroberts@lbl.gov

    Daniele Filippetto, a Berkeley Lab scientist, works on the High-Repetition-rate Electron Scattering apparatus (HiRES), which will function like an ultrafast electron camera. HiRES is a new capability that builds on the Advanced Photo-injector Experiment (APEX), a prototype electron source for advanced X-ray lasers. (Roy Kaltschmidt/Berkeley Lab)

    A unique rapid-fire electron source—originally built as a prototype for driving next-generation X-ray lasers—will help scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) study ultrafast chemical processes and changes in materials at the atomic scale. This could provide new insight into how to make materials with custom, controllable properties and improve the efficiency and output of chemical reactions.

    This newly launched setup, dubbed HiRES (for High Repetition-rate Electron Scattering apparatus), will function like an ultrafast electron camera, potentially producing images that can pinpoint defects and their effects, track electronic and superconducting properties in exotic materials, and detail chemical reactions in gases, liquids and biological samples that are difficult to study using more conventional, X-ray-based experiments.

    The new research tool produces highly focused electron bunches, each containing up to 1 million electrons. The electrons stream at a rate of up to 1 million bunches per second, or 1 trillion electrons per second.
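
    Those two figures multiply out to the quoted rate, and they also imply an average beam current. A quick back-of-envelope check (the current figure is our own arithmetic, not stated in the article):

```python
# Sanity check of the quoted beam numbers, plus the average current they imply.
ELECTRON_CHARGE = 1.602e-19      # coulombs per electron
electrons_per_bunch = 1e6
bunches_per_second = 1e6

electrons_per_second = electrons_per_bunch * bunches_per_second
average_current_uA = electrons_per_second * ELECTRON_CHARGE * 1e6  # in microamps

print(f"{electrons_per_second:.1e} electrons per second")      # 1.0e+12, i.e. one trillion
print(f"~{average_current_uA:.2f} microamps of average current")
```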

    Electrons will be used as a fast camera shutter to capture snapshots of samples as they change over femtoseconds, or quadrillionths of a second. An initial laser pulse will trigger a reaction in the sample that is followed an instant later by an electron pulse to produce an image of that reaction.

    HiRES delivered its first electron beam March 28 and experiments are set to begin in May.

    Daniele Filippetto, a Berkeley Lab scientist who is leading HiRES, has for much of his scientific career focused on building electron sources, also called “electron guns,” that can drive advanced X-ray lasers known as “free-electron lasers.” These electron guns are designed to produce a chain of high-energy electron pulses that are accelerated and then forced by powerful magnetic fields to give up some of their energy in the form of X-ray light.

    SACLA Free-Electron Laser Riken Japan

    Free-electron lasers have opened new frontiers in studying materials and chemistry at the nanoscale and beyond, and Filippetto said he hopes to pave new ground with HiRES, too, using a technique known as “ultrafast electron diffraction,” or UED, that is similar to X-ray diffraction.

    In these techniques, a beam of X-rays or electrons hits a sample, and the scattering of X-rays or electrons is collected on a detector. This pattern, known as a diffraction pattern, provides structural information about the sample. X-rays and electrons interact differently: electrons scatter from a sample’s electrons and the atoms’ nuclei, for example, while X-rays scatter only from the electrons.
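
    Electron diffraction works because fast electrons have de Broglie wavelengths far shorter than the spacing between atoms. The sketch below is a back-of-envelope illustration, not something from the article; the 750 keV beam energy is an assumed value (typical for electron guns of the APEX type), since the article does not quote one.

```python
import math

m_e_c2 = 511.0   # electron rest energy, keV
hc = 1.23984     # Planck's constant times c, keV*nm

def de_broglie_wavelength_nm(kinetic_energy_kev):
    """Relativistic de Broglie wavelength (in nm) of an electron with the given kinetic energy."""
    pc = math.sqrt(kinetic_energy_kev**2 + 2.0 * kinetic_energy_kev * m_e_c2)  # momentum times c, in keV
    return hc / pc

for e_k in (100.0, 750.0):  # keV; 750 keV is an assumed, HiRES-like beam energy
    wavelength_pm = de_broglie_wavelength_nm(e_k) * 1e3
    print(f"{e_k:6.0f} keV electrons: wavelength ~ {wavelength_pm:.2f} pm, far below atomic spacings (~100-400 pm)")
```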

    The unique electron gun that Filippetto and his team are using is a part of Berkeley Lab’s APEX (Advanced Photo-injector EXperiment), which has served as a prototype system for LCLS-II, a next-generation X-ray laser project underway at SLAC National Accelerator Laboratory in Menlo Park, Calif. Berkeley Lab is a member of the LCLS-II project collaboration.

    “The APEX gun is a unique source of ultrafast electrons, with the potential to reach unprecedented precision and stability in timing—ultimately at or below 10 femtoseconds,” Filippetto said. “With HiRES, the time resolution will be about 100 femtoseconds, or the time it takes for chemical bonds to form and break. So you can look at the same kinds of processes that you can look at with an X-ray free-electron laser, but with an electron eye.”

    He added, “You can see the structure and the relative distances between atoms in a molecule changing over time across the whole structure. You need fewer electrons than X-rays to get an image, and in principle there can be much less damage to the sample with electrons.”

    This computerized rendering shows the layout of the HiRES ultrafast electron diffraction beamline, which is located in the domed Advanced Light Source building at Berkeley Lab. At left (on blue base) is APEX, the electron source for HiRES. (Courtesy of Daniele Filippetto/Berkeley Lab)

    Filippetto in 2014 received a five-year DOE Early Career Research Program award that is supporting his work on HiRES. The work is also supported by the Berkeley Lab Laboratory Directed Research and Development Program.

    Already, Berkeley Lab has world-class research capabilities in other electron-beam microscopic imaging techniques, in building nanostructures, and in a range of X-ray experimental techniques, Filippetto noted. All of these capabilities are accessible to the world’s scientists via the lab’s Molecular Foundry and Advanced Light Source (ALS).

    “If we couple all of these together with the power of HiRES, then you basically can collect full information from your samples,” he said. “You can get static images with subatomic resolution, the ultrafast structural response, and chemical information about a sample—in the same lab and in the same week.”

    A view of the HiRES ultrafast electron diffraction (UED) beamline at Berkeley Lab’s APEX. (Roy Kaltschmidt/Berkeley Lab)

    Filippetto’s goal is to improve the focus of the HiRES electron beam from microns (millionths of a meter) in diameter to the nanometer scale (billionths of a meter), and to improve its timing from hundreds of femtoseconds to tens of femtoseconds, both to boost the quality of the images it produces and to study even faster processes at the atomic scale.

    Andrew Minor, director of the Molecular Foundry’s National Center for Electron Microscopy, said he is excited about the potential for HiRES to ultimately study the structure of single molecules and to explore the propagation of microscopic defects in materials at the speed of sound.

    “We want to study nanoscale processes such as the structural changes in a material as a crack moves through it at the speed of sound,” he said. Also, the timing of HiRES may allow scientists to study real-time chemical reactions in an operating battery, he added.

    “What is really interesting to me is that you can potentially focus the beam down to a small size, and then you would really have a system that competes with X-ray free-electron lasers,” Minor said, adding that this opens up the possibility of electron imaging of single biological particles.

    He added, “I think there is a very large unexplored space in terms of using electrons at the picosecond (trillionths of a second) and nanosecond (billionths of a second) time scales to directly image materials.”

    There are tradeoffs in using X-rays vs. electrons to study ultrafast processes at ultrasmall scales, he noted, though “even if the capabilities are similar, it’s worth pursuing” because of the smaller size and lesser cost of machines like APEX and HiRES compared to X-ray free-electron lasers.

    Scientists from Berkeley Lab’s Materials Sciences Division and from UC Berkeley will conduct the first set of experiments using HiRES, Filippetto said, including studies of the structural and electronic properties of single-layer and multilayer graphene, as well as other materials with semiconductor and superconductor properties.

    There are also some clear uses for HiRES in chemistry and biology experiments, Filippetto noted. “The idea is to push things to see ever-more-complicated structures and to open the doors to all of the possible applications,” he said.

    There are plans to forge connections between HiRES and other lab facilities, like the ALS, where HiRES is located, and the lab’s National Center for Electron Microscopy at the Molecular Foundry.

    “Already, we are working with the microscopy center on the first experiments,” Filippetto added. “We are adapting the microscope’s sample holder so that one can easily move samples from one instrument to another.”

    Filippetto said there are discussions with ALS scientists on the possibility of gathering complementary information from the same samples using both X-rays from the ALS and electrons from HiRES.

    “This would make HiRES more accessible to a larger scientific community,” he added.

    The Molecular Foundry and Advanced Light Source are DOE Office of Science User Facilities. HiRES is supported by the U.S. Department of Energy Office of Science.

    LBL Advanced Light Source

    A labeled diagram showing the components of the HiRES beamline at Berkeley Lab. (Courtesy of Daniele Filippetto/Berkeley Lab)

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 1:01 pm on April 26, 2016
    Tags: Applied Research & Technology, Rising carbon dioxide is greening the Earth – but it’s not all good news

    From CSIRO: “Rising carbon dioxide is greening the Earth – but it’s not all good news” 

    CSIRO bloc

    Commonwealth Scientific and Industrial Research Organisation

    26th April 2016
    Pep Canadell
    Yingping Wang

    Green planet: tropical rainforests have produced more growth in response to rising carbon dioxide. NASA Goddard Space Flight Center/Flickr, CC BY

    Dried lake beds, failed crops, flattened trees: when we think of global warming we often think of the impacts of droughts and extreme weather. While there is truth in this image, a rather different picture is emerging.

    In a paper published* in Nature Climate Change, we show that the Earth has been getting greener over the past 30 years. As much as half of all vegetated land is greener today, and remarkably, only 4% of land has become browner.

    Our research shows this change has been driven by human activities, particularly the rising concentration of carbon dioxide (CO₂) in the atmosphere. This is perhaps the strongest evidence yet of how people have become a major force in the Earth’s functioning.

    We are indeed in a new age, the Anthropocene.

    How do you measure green?

    Plants play a vital role in maintaining Earth as a habitable place, not least through absorbing CO₂. We wanted to know how people are affecting this ability.

    To do this, we needed to know how much plants are growing. We couldn’t possibly measure all the plants on Earth, so we used satellite observations to measure light reflected and absorbed from the Earth’s surface. This is a good indicator of leaf area, and therefore of how plants are growing.
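
    The study itself works with satellite-derived leaf area index, but the basic idea of turning reflected light into a “greenness” measure can be illustrated with a common vegetation index (NDVI). This is a generic sketch with invented reflectance values, not data or code from the paper:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index: near +1 for dense green vegetation, near 0 for bare ground."""
    return (nir - red) / (nir + red)

# Invented red / near-infrared reflectances for the same patch of land in two different years
print(round(ndvi(nir=0.42, red=0.10), 2))  # 0.62 - moderately vegetated
print(round(ndvi(nir=0.50, red=0.08), 2))  # 0.72 - the same patch has become "greener"
```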

    We found consistent trends in greening across Australia, central Africa, the Amazon Basin, southeast United States, and Europe. We found browning trends in northwest North America and central South America. We then used models to figure out what was driving the trends in different regions.

    Updated figure to 2015. Source: http://sites.bu.edu/cliveg/files/2016/04/LAI-Change.png

    A CO₂-richer world

    Plants need CO₂ to grow through photosynthesis. We found that the biggest factor in driving the global greening trend is the fertilisation effect of rising atmospheric CO₂ due to human activity (atmospheric concentration grew by 46 parts per million during the period studied).

    This effect is well known and has been used in agricultural production for decades to achieve larger and faster yields in greenhouses.

    In the tropics, the CO₂ fertilisation effect led to faster growth in leaf area than in most other vegetation types, and made this effect the overwhelming driver of greening there.

    A warmer world

    Climate change is also playing a part in driving the overall greening trend, although not as much as CO₂ fertilisation.

    But at a regional scale, climate change, and particularly increasing temperature, is a dominant factor in northern high latitudes and the Tibetan Plateau, driving increased photosynthesis and lengthening the growing season.

    Greening of the Sahel and South Africa is primarily driven by increased rainfall, while Australia shows consistent greening across the north of the continent, with some areas of browning in interior arid regions and the Southeast. The central part of South America also shows consistent browning.

    A nitrogen-richer world

    We know that heavy use of chemical nitrogen fertilisers leads to pollution of waterways, and that excess nitrogen in soils can lead to declining plant growth. In fact, our analysis attributes small browning trends in North America and Europe to a long-term cumulative excess of nitrogen in soils.

    But, by and large, nitrogen is a driver of greening. For most plants, particularly in the temperate and boreal regions of the Northern Hemisphere, there is not enough nitrogen in soils. Overall, increasing nitrogen in soils has a positive effect on greening, similar to that of climate change.

    A more intensively managed world

    The final set of drivers of the global greening trend relates to changes in land cover and land management. Land management includes forestry, grazing, and the way cropland is becoming more intensively managed with multiple crops per year, increasing use of fertilisers and irrigation.

    All of this affects how intensely, and for how long, the land surface is green.

    Perhaps surprisingly, felled forests don’t show up as browning, because they are typically replaced by pastures and crops, although this change has profound effects on ecosystems.

    The greening trends in southeast China and the southeastern United States are clearly dominated by land cover and management changes, both regions having intensive cropping areas and also reforestation.

    Although this management effect has the smallest impact on the greening trend presented in this study, the models we used are not well suited to assessing the influence of human management globally.

    The fact that people are making parts of the world greener and browner, and the world greener overall, constitutes some of the most compelling evidence of human domination of planet Earth. And it could be good news: a greening world is associated with more positive outcomes for society than a browning one.

    For instance, a greener world is consistent with, although it does not fully explain, the fact that land plants have been removing more CO₂ from the atmosphere, therefore slowing down the pace of global warming.

    But don’t get your hopes up. We don’t know how far into the future the greening trend will continue, because the CO₂ concentration will ultimately peak while the global warming it has already set in motion continues for decades after. Regardless, it is clear that the benefits of a greening Earth fall well short of the estimated negative impacts of extreme weather events (such as droughts, heat waves, and floods), sea level rise, and ocean acidification.

    Humans have shown their capacity to (inadvertently) affect the world’s entire biosphere; it is now time to (advertently) use this knowledge to mitigate climate change and ameliorate its impacts.

    Pep Canadell, CSIRO Scientist, and Executive Director of the Global Carbon Project, CSIRO and Yingping Wang, Chief research scientist, CSIRO

    *Science Paper:
    Greening of the Earth and its drivers

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 3:38 pm on April 25, 2016
    Tags: Applied Research & Technology

    From phys.org: “Scientists take next step towards observing quantum physics in real life” 

    phys.org

    April 25, 2016

    An artist’s impression of the membrane coupled to a laser beam. The periodic pattern makes the device highly reflective, while the thin tethers allow for ultra-low mechanical dissipation. Credit: Felix Fricke

    Small objects like electrons and atoms behave according to quantum mechanics, with quantum effects like superposition, entanglement and teleportation. One of the most intriguing questions in modern science is whether large objects – like a coffee cup – could also show this behavior. Scientists at TU Delft have taken the next step towards observing quantum effects at everyday temperatures in large objects. They created a highly reflective membrane, visible to the naked eye, that can vibrate with hardly any energy loss at room temperature. The membrane is a promising candidate for researching quantum mechanics in large objects.

    The team has reported their results* in Physical Review Letters.

    Swing

    “Imagine you’re given a single push on a playground swing. Now imagine this single push allows you to gleefully swing non-stop for nearly a decade. We have created a millimeter-sized version of such a swing on a silicon chip”, says Prof. Simon Gröblacher of the Kavli Institute of Nanoscience at TU Delft.

    Tensile stress

    “In order to do this, we deposit ultra-thin films of ceramic onto silicon chips. This allows us to engineer a million psi of tensile stress, which is the equivalent of 10,000 times the pressure in a car tire, into millimeter-sized suspended membranes that are only eight times thicker than the width of DNA”, explains Dr. Richard Norte, lead author of the publication. “Their immense stored energies and ultra-thin geometry mean that these membranes can oscillate for tremendously long times by dissipating only small amounts of energy.”

    Super-mirrors

    To efficiently monitor the motion of the membranes with a laser, they need to be extremely reflective. In such a thin structure, this can only be achieved by creating a meta-material through etching a microscopic pattern into the membrane. “We actually made the thinnest super-mirrors ever created, with a reflectivity exceeding 99%. In fact, these membranes are also the world’s best force sensors at room temperature, as they are sensitive enough to measure the gravitational pull between two people 100 km apart from each other”, Richard Norte says.
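
    That sensitivity claim is easier to appreciate with a rough number attached. Here is a back-of-envelope estimate of the gravitational force in question, assuming two 70 kg people (the masses are our assumption, not from the article):

```python
# Newtonian gravitational attraction between two ~70 kg people 100 km apart.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
m1 = m2 = 70.0  # kg, assumed body masses
r = 100e3       # separation in metres

force = G * m1 * m2 / r**2
print(f"{force:.1e} N")  # ~3.3e-17 newtons - the scale of force the membranes can reportedly sense
```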

    Room temperature

    “The high-reflectivity, in combination with the extreme isolation, allows us to overcome a major hurdle towards observing quantum physics with massive objects, for the first time, at room temperature”, says Gröblacher. Because even a single quantum of vibration is enough to heat up and destroy the fragile quantum nature of large objects (in a process called decoherence), researchers have relied on large cryogenic systems to cool and isolate their quantum devices from the heat present in our everyday environments. Creating massive quantum oscillators which are robust to decoherence at room temperature has remained an elusive feat for physicists.

    This is extremely interesting from a fundamental theoretical point of view. One of the strangest predictions of quantum mechanics is that things can be in two places at the same time. Such quantum ‘superpositions’ have now been clearly demonstrated for tiny objects such as electrons or atoms, where we now know that quantum theory works very well.

    Coffee cup

    But quantum mechanics also tells us that the same rules should apply to macroscopic objects: a coffee cup can be on the table and in the dishwasher at the same time, or Schrödinger’s cat can be in a quantum superposition of being dead and alive. This is, however, not something we see in our daily lives: the coffee cup is either clean or dirty and the cat is either dead or alive. Experimentally demonstrating a proverbial cat that is simultaneously dead and alive at ambient temperatures is still an open question in quantum mechanics. The steps taken in this research might eventually allow us to observe ‘quantum cats’ at everyday scales and temperatures.

    *Science paper:
    Mechanical Resonators for Quantum Optomechanics Experiments at Room Temperature

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quancast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 2:08 pm on April 25, 2016
    Tags: Applied Research & Technology

    From The Conversation: “Has China’s coal use peaked? Here’s how to read the tea leaves” 

    The Conversation

    April 12, 2016
    Valerie J. Karplus

    Because China is the largest emitter of carbon dioxide in the world, how much coal the country is burning is of global interest.

    In March, the country’s National Bureau of Statistics said the tonnage of coal consumed has fallen for the second year in a row. Indeed, there are reports that China will stop construction of new plants as the country grapples with overcapacity, and efforts to phase out inefficient and outdated coal plants are expected to continue.

    A sustained reduction in coal, the main fuel used to generate electricity in China, will be good news for the local environment and global climate. But it also raises questions: what is driving the drop? And can we expect this nascent trend to continue?

    It appears many of the forces that led coal use to slow down in recent years are here to stay. Nevertheless, uncertainties abound.

    The future of coal in China will depend on economic factors, including whether alternatives are cheaper and whether a return to high oil prices will encourage production of liquid fuels from coal. Also crucial to coal’s future trajectory are the pace of China’s economic growth and the country’s national climate and air pollution policies.

    Overcapacity

    First, let’s consider how certain we are that the rise in China’s coal use has reversed course. Unpacking that requires understanding the context in which the data is produced.

    China’s national energy statistics are subject to ongoing adjustments. The most recent one, in 2014, revised China’s energy use upward, mainly as a result of adjustments to coal use. The revisions follow the Third National Economic Census, a comprehensive survey of energy use and economic activity that better represents the energy use of small and medium-sized enterprises.

    There is good reason to believe these revised figures better reflect reality, because they help to explain a well-recognized gap between previously published national totals and the sum of provincial energy statistics, and because these regular revisions capture more sources of energy consumption.

    In short, the latest numbers show China is using more coal and energy than previously thought, but the last two years of data suggest China’s coal use may be peaking earlier than expected.

    China’s latest available energy statistics, based on a more thorough accounting of energy use in China, show the country’s coal use, measured in energy terms, plateauing and declining over the past two years. Valerie Karplus. Data: China Statistical Yearbook 2015 and 2014. Author provided.

    Working from the revised numbers, the observed leveling off of China’s coal use may be both real and sustainable. Efforts to eliminate overcapacity of coal and raise energy efficiency in electric power and heavy industries are biting: in 2014, coal use in electric power fell by 7.8 percent year-on-year, while annual growth in coal consumption in manufacturing fell to 1.6 percent from 4 percent in the previous year, according to data from CEIC.

    The drop in coal use is partly due to structural shifts and partly to good fortune. In electric power, a shift to larger, more efficient power plants and industrial boilers, as well as a reduction in operating hours, has reduced overall coal use.

    The contribution of hydro, wind, solar, nuclear and natural gas in the power generation mix also continues to expand. Abundant rainfall in 2014 allowed the contribution of hydroelectric power to total generation to increase significantly as well.

    Also, the government-led war on air pollution is giving new impetus to clean up or shut down the oldest, dirtiest plants, and bring online new plants with air pollution controls in place.

    These trends, as well as slower economic growth that is increasingly driven by domestic consumption and less by expansion of heavy industry, suggest that coal demand is likely to continue leveling off. This is true even though prices for coal are falling because of overcapacity.

    The impending launch of a national carbon market in 2017 will further penalize coal in several sectors that use it intensively. The carbon market will require heavy emitters to either reduce carbon dioxide emissions by using less coal or purchase credits for emissions reductions from other market participants.

    In this scenario, China’s coal is likely to instead be increasingly exported to other energy-hungry nations less focused on air quality and climate concerns.

    A role for oil prices and air pollution policy

    However, there is at least one scenario in which coal use could easily reverse its downward trend: a return to high oil and natural gas prices.

    Globally, oil prices have plummeted in the past two years, while natural gas prices in China are relatively low for domestic users. High prices for oil and natural gas would make it attractive to convert coal into products that can be used in place of oil, natural gas or chemicals.

    China already has facilities for producing these products, often referred to as synthetic fuels. Plans to substantially expand these activities have been postponed or scuttled due to the lack of an economic rationale.

    But a return to high oil and gas prices would give new life to these projects, which even with a modest price on carbon emissions are likely to be economically viable. The scale of existing synthetic fuels capacity is sufficient to reverse the downward trend in coal use, should it become economic to bring it online.

    Solar, wind and hydro are making a bigger contribution to China’s power generation mix but given the country’s huge energy needs, it is too early to say the country’s carbon emissions are going down. Jason Lee/Reuters

    Meanwhile, China’s carbon and air pollution rules are starting to have an impact on coal use, although the ultimate size of any reduction will depend on resources and incentives to implement policies at the local level.

    Under the National Air Pollution Action Plan, three major urban regions on China’s populous east coast face significant pressure to reduce the concentration of ambient particulate matter pollution by 20-25 percent before 2017, while a 10 percent reduction target is set for the nation as a whole. Cleaning up the air will involve mobilizing an enormous number of local actors on the ground, financing technology upgrades, and introducing policies to reduce pollution-intensive fuels through efficiency and substitution.

    Yet coal use reductions are unlikely to follow in lock step with air pollution reductions. Slower economic growth would be expected to reduce energy demand. But reducing the energy intensity of growth – the amount of energy needed to produce a unit of GDP – will likely become harder over time.

    As economic growth slows, localities will be under pressure to expand opportunities for existing businesses and create new ones. If this pressure leads local officials to resort to expanding energy-intensive activities, such as iron and steel, cement, and heavy manufacturing, to boost local GDP, it will become more difficult to continue reducing China’s coal use per unit of economic output.

    Carbon market in 2017

    Whether coal use continues to decline or goes back up has implications for the timing of China’s emissions peak. At the Paris Climate Summit, China pledged to peak its emissions by 2030 at the latest, meaning emissions of carbon dioxide would start to fall in absolute terms.

    While it is too early to say that China’s carbon dioxide emissions will continue to fall, it is unlikely that they will rise much further even if the country’s economic aspirations are realized.

    The expansion of hydro and other forms of low- or zero-carbon energy will help. If the challenges of integrating the energy generated from solar and wind – less predictable sources of energy compared to dispatchable sources such as coal and gas – can be solved, renewable energy that is already installed has the potential to displace significant additional coal use as well, while contributing to the reduction in air pollution and carbon dioxide emissions.

    So the answer to the question of whether or not China’s coal use has peaked is: perhaps. China’s coal cap of 4.2 billion tons in 2020 is set roughly 7% higher than the 2013 peak, suggesting that even if the decline reverses course, at least use will not rise much higher. (Note that China’s coal cap sets a limit on coal use on a mass basis, while the above figure reports coal use on an energy basis, and is therefore not directly comparable.)
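
    For readers who want the implied number: working backwards from the article’s own figures, the 2013 peak consistent with a 4.2-billion-tonne cap set roughly 7% higher would be about 3.9 billion tonnes (on a mass basis). A one-line check:

```python
# Back-of-envelope from the article's own figures (not an independent statistic).
cap_2020 = 4.2e9                    # tonnes of coal, the stated 2020 cap
implied_2013_peak = cap_2020 / 1.07  # cap is said to be "roughly 7%" above the 2013 peak
print(f"{implied_2013_peak/1e9:.2f} billion tonnes")  # ~3.93 billion tonnes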

    This is good news because China’s carbon dioxide emissions have already reached levels in line with previously projected peak levels for 2030, prior to the data revisions. So earlier peaks in both coal use and carbon dioxide emissions now look not only desirable, but possible.

    Sustaining a commitment in China to cultivating cleaner forms of energy production and use will be challenging in the current economic headwinds, but the potential benefits to human health are great, especially in the medium to longer term.

    The country’s national carbon market, planned to launch in 2017, is an important step in the right direction. A sufficiently high and stable carbon price could form the cornerstone of a sustained transition away from coal in favor of clean and renewable energy, developments that would be consistent with existing targets and air quality goals.

    Any transition away from coal in China has the potential to help the world curb its carbon dioxide emissions and to improve domestic air quality – something that will allow us all to breathe a little easier.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 3:55 pm on April 24, 2016
    Tags: Applied Research & Technology, New Maps Chart Greenland Glaciers' Melting Risk

    From JPL: “New Maps Chart Greenland Glaciers’ Melting Risk”

    NASA JPL Banner

    JPL-Caltech

    April 21, 2016
    Alan Buis
    Jet Propulsion Laboratory, Pasadena, California
    818-354-0474
    Alan.buis@jpl.nasa.gov

    Brian Bell
    University of California, Irvine
    949-824-8249
    bpbell@uci.edu

    Written by Carol Rasmussen
    NASA Earth Science News Team

    The new maps show that the seafloor under Store Glacier, shown here, is almost 2,000 feet (600 meters) deeper than previously thought. Credits: NASA/JPL-Caltech/Ian Fenty

    Many large glaciers in Greenland are at greater risk of melting from below than previously thought, according to new maps of the seafloor around Greenland created by an international research team. Like other recent research findings, the maps highlight the critical importance of studying the seascape under Greenland’s coastal waters to better understand and predict global sea level rise.

    Researchers from the University of California, Irvine; NASA’s Jet Propulsion Laboratory, Pasadena, California; and other research institutions combined all the observations their various groups had made during shipboard surveys of the seafloors in the Uummannaq and Vaigat fjords in west Greenland between 2007 and 2014 with related data from NASA’s Operation IceBridge and the NASA/U.S. Geological Survey Landsat satellites.

    NASA/Landsat 8

    They used the combined data to generate comprehensive maps of the ocean floor around 14 Greenland glaciers. Their findings show that previous estimates of ocean depth in this area were as much as several thousand feet too shallow.

    Why does this matter? Because glaciers that flow into the ocean melt not only from above, as they are warmed by sun and air, but from below, as they are warmed by water.

    A comparison of the newly compiled map of the Uummannaq fjord area (left) and an older map (right). Red areas indicate shallower depths, blues and purples deeper.
    Credits: UCI/NASA/JPL-Caltech

    In most of the world, a deeper seafloor would not make much difference in the rate of melting, because typically ocean water is warmer near the surface and colder below. But Greenland is exactly the opposite. Surface water down to a depth of almost a thousand feet (300 meters) comes mostly from Arctic river runoff. This thick layer of frigid, fresher water is only 33 to 34 degrees Fahrenheit (1 degree Celsius). Below it is a saltier layer of warmer ocean water. This layer is currently more than 5 degrees F (3 degrees C) warmer than the surface layer, and climate models predict its temperature could increase another 3.6 degrees F (2 degrees C) by the end of this century.

    About 90 percent of Greenland’s glaciers flow into the ocean, including the newly mapped ones. In generating estimates of how fast these glaciers are likely to melt, researchers have relied on older maps of seafloor depth that show the glaciers flowing into shallow, cold seas. The new study shows that the older maps were wrong.

    “While we expected to find deeper fjords than previous maps showed, the differences are huge,” said Eric Rignot of UCI and JPL, lead author of a paper on the research. “They are measured in hundreds of meters, even one kilometer [3,300 feet] in one place.” The difference means that the glaciers actually reach deeper, warmer waters, making them more vulnerable to faster melting as the oceans warm.

    Coauthor Ian Fenty of JPL noted that earlier maps were based on sparse measurements mostly collected several miles offshore. Mapmakers assumed that the ocean floor sloped upward as it got nearer the coast. That’s a reasonable supposition, but it’s proving to be incorrect around Greenland.

    Rignot and Fenty are co-investigators in NASA’s five-year Oceans Melting Greenland (OMG) field campaign, which is creating similar charts of the seafloor for the entire Greenland coastline. Fenty said that OMG’s first mapping cruise last summer found similar results. “Almost every glacier that we visited was in waters that were far, far deeper than the maps showed.”

    The researchers also found that besides being deeper overall, the seafloor depth is highly variable. For example, the new map revealed one pair of side-by-side glaciers whose bottom depths vary by about 1,500 feet (500 meters). “These data help us better interpret why some glaciers have reacted to ocean warming while others have not,” Rignot said.

    The lack of detailed maps has hampered climate modelers like Fenty who are attempting to predict the melting of the glaciers and their contribution to global sea level rise. “The first time I looked at this area and saw how few data were available, I just threw my hands up,” Fenty said. “If you don’t know the seafloor depth, you can’t do a meaningful simulation of the ocean circulation.”

    The maps are published in a paper titled “Bathymetry data reveal glaciers vulnerable to ice-ocean interaction in Uummannaq and Vaigat glacial fjords, west Greenland,” in the journal Geophysical Research Letters. The other collaborating institutions are Durham University and the University of Cambridge, both in the U.K.; GEOMAR Helmholtz Center for Ocean Research, Kiel, Germany; and the University of Texas at Austin.

    For more information on OMG, visit:

    https://omg.jpl.nasa.gov/portal/

    NASA uses the vantage point of space to increase our understanding of our home planet, improve lives and safeguard our future. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.

    For more information about NASA’s Earth science activities, visit:

    http://www.nasa.gov/earth

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo
    jpl

    NASA image

     
  • richardmitnick 2:59 pm on April 24, 2016
    Tags: Applied Research & Technology, Massive Coral Reef Discovered in the Amazon River

    From livescience: “Massive Coral Reef Discovered in the Amazon River” 

    Livescience

    The Amazon feeds into the Atlantic Ocean in a plume where salt and freshwater mix. The distinctive pH, salinity, debris and light levels create a unique ecosystem perfect for a massive reef network. Credit: Lance Willis

    Scientists have discovered a huge coral reef system lurking beneath the muddy waters of the Amazon.

    The network of coral reefs, which is about 600 miles (1,000 kilometers) long and is home to a hidden ecosystem of colorful and bizarre creatures, was found at the mouth of the Amazon River, where freshwater from the river empties into the briny waves of the Atlantic Ocean.

    Extraordinary expedition

    The Amazon River is the world’s largest river by volume, harboring 20 percent of the freshwater on Earth. It is also home to a stunning array of bizarre and as-yet-undocumented creatures. [The World’s 10 Longest Rivers]

    Patricia Yager, a marine scientist at the University of Georgia and lead investigator of the River-Ocean Continuum of the Amazon project, and her colleagues had originally set sail on the expedition to sample species from the mouth of the Amazon River, according to a statement.

    But one of the team members, biologist Rodrigo Moura from the Universidade Federal do Rio de Janeiro, had seen a published study from the 1970s “that mentioned catching reef fish along the continental shelf and said he wanted to try to locate these reefs,” Yager said.

    So the team set out on a hunt for mysterious reefs. The first obstacle was finding out exactly where the researchers of that past study had done their surveying. The 1970s journal article didn’t have GPS coordinates, so the team went to the general area and used sound waves to create pictures of the river bottom. Then they pulled up seafloor samples to confirm the presence of the reef.

    Intricate web of life

    It turned out the reef was home to a hidden carnival of life not evident at the murky water’s surface, including loads of reef fish species, a wide variety of sponges (as well as sponge-eating fish), algae and, of course, coral species.

    “We brought up the most amazing and colorful animals I had ever seen on an expedition,” Yager said.

    The team then returned to the site in 2014 to fully catalogue the rainbow of reef species and the reef’s characteristics, which they reported* April 22 in the journal Science Advances.

    The reef changes over its extent. At the southern edge of the reef, sea creatures get more sunlight, and the reef is dominated by traditional coral and creatures that use light to make food.

    “But as you move north, many of those [species] become less abundant, and the reef transitions to sponges and other reef builders that are likely growing on the food that the river plume delivers. So the two systems are intricately linked,” Yager said.

    Yet the amazing Amazon reef system was endangered almost from the moment of its discovery. It turned out that oil exploration is planned on top of the reef, while ocean acidification and warming threaten the coral there just as they do reefs throughout the world’s oceans, Yager said.

    *Science paper:
    An extensive reef system at the Amazon River mouth

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 12:43 pm on April 23, 2016
    Tags: Applied Research & Technology

    From Eos: “Arctic Sea Ice Extent May Shrink Below 2012 Record Low” 

    Eos news bloc

    Eos

    4.22.16
    Megan Gannon

    Sea ice photographed in the central Arctic Ocean during summer 2015, when Arctic sea ice was exceptionally thin, according to satellite data. Credit: Alfred-Wegener-Institut/ Stefan Hendricks

    Recent and current conditions in the Arctic suggest that by the end of this summer, the region’s sea ice could shrink to—or even below—the record low observed in 2012, scientists announced at the 2016 European Geosciences Union (EGU) meeting in Vienna, Austria, on Thursday.

    At a press conference, the researchers discussed signs of impending ice loss that they had noticed while reviewing ice thickness maps from the past several years from the CryoSat-2 satellite.
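
    For context on how such thickness maps are made: CryoSat’s radar altimeter measures freeboard (how high the ice floats above the waterline), which is converted to thickness assuming hydrostatic equilibrium. The sketch below ignores the snow load that real retrievals account for, and uses typical density values rather than anything specific to the AWI analysis:

```python
# Minimal freeboard-to-thickness sketch for freely floating sea ice (no snow load).
RHO_WATER = 1024.0  # kg/m^3, typical sea water density
RHO_ICE = 917.0     # kg/m^3, typical sea ice density

def thickness_from_freeboard(freeboard_m):
    """Ice thickness implied by its freeboard under hydrostatic equilibrium."""
    return freeboard_m * RHO_WATER / (RHO_WATER - RHO_ICE)

print(f"{thickness_from_freeboard(0.15):.1f} m")  # a 15 cm freeboard implies ~1.4 m of ice
```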

    “What was striking to us is that the ice thickness distribution in this spring, or at the end of winter, is very similar to 2012, which was the previous record minimum,” said sea ice physicist Marcel Nicolaus of the Alfred Wegener Institute (AWI) Helmholtz Centre for Polar and Marine Research in Bremerhaven, Germany. The sea ice at that time extended over just 3.41 million square kilometers—roughly half the average annual minimum ice coverage between 1981 and 2010.

    Multiple Factors Signal Sea Ice Shortage

    In this latest evaluation and projection for 2016, the research team factored in the recent history of Arctic warming and regional ice loss and growth. The scientists also considered ice thickness measurements from 2015 and 2016 and expected ocean currents and winds.

    A snow buoy erected on Arctic sea ice near the coast of Alaska. Credit: Alfred-Wegener-Institut/ Stefan Hendricks

    In addition, measurements from a set of seven “snow buoys” contributed to the outlook. The researchers had affixed those sensor towers—which track snow thickness and air temperatures and pressures directly on the sea ice—to ice floes last fall.

    Nicolaus and his colleagues developed the concept for using buoys to measure snow depth; they passed this concept to a Canadian company from which they now buy the devices. Each snow buoy consists of a base unit that’s drilled into ice with a 1.5-meter mast topped by a rack of four sonic ranging sensors that autonomously transmit their data via Iridium satellites. Nicolaus said his colleagues joked that the buoy looks somewhat like a rack for hanging clothing.
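
    In case it helps to see the measurement principle: a downward-looking sonic ranger reports its distance to the snow surface, and accumulation is the change in that (temperature-corrected) distance since installation. The square-root temperature correction below is the standard one for ultrasonic snow sensors; whether these particular buoys apply exactly this form is an assumption, and the readings are invented:

```python
import math

def corrected_range_m(raw_range_m, air_temp_c):
    """Correct a sonic ranger reading for the temperature dependence of the speed of sound."""
    return raw_range_m * math.sqrt((air_temp_c + 273.15) / 273.15)

def snow_accumulation_m(initial_range_m, initial_temp_c, current_range_m, current_temp_c):
    """Accumulated snow = how much closer the surface has come to the sensor since installation."""
    return (corrected_range_m(initial_range_m, initial_temp_c)
            - corrected_range_m(current_range_m, current_temp_c))

# Invented readings: sensor installed 1.50 m above the snow at -5 C, now reading 1.18 m at -25 C
print(f"{snow_accumulation_m(1.50, -5.0, 1.18, -25.0):.2f} m of accumulated snow")  # ~0.36 m
```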

    Balmy Winter Slowed Ice Growth

    The potential replay of 2012 or worse stems in large measure from the exceptional warmth of the 2015–2016 winter, the researchers noted. Readings from the group’s snow buoys in February showed that the central Arctic temperature in that month surpassed the average by up to 8°C, the team reported. Nonetheless, the extreme winter warmth didn’t make ice melt away. “According to our buoy data from the spring, the warm winter air was not sufficient to melt the layer of snow covering the sea ice, let alone the ice itself,” Nicolaus said.

    “Only over the last decade has it become more and more obvious how important the snow cover is for sea ice,” he told Eos. For instance, snow’s high albedo and other physical properties, such as its ability to provide thermal insulation, can affect the way that sea ice forms and melts.

    High 2015–2016 winter temperatures instead significantly slowed the usual seasonal thickening of sea ice, the scientists explained to reporters. Ice extending from the land in areas north of Alaska, for instance, has attained just two thirds of its usual thickness, currently measuring just a meter thick rather than 1.5 meters.

    Prediction Challenge

    Ice researchers have observed an overall decline in Arctic sea ice abundance ever since satellite records for such data became available in the 1970s. But year-to-year predictions for sea ice decline are difficult to make because the internal variability of sea ice thickness is quite large, said Alexandra Jahn, a climate scientist at the University of Colorado Boulder, who was not involved in the projections. Jahn compared the sea ice prediction challenge to making an accurate forecast today of the weather for this coming Christmas. “We have that inherent uncertainty in the system,” she told reporters.

    Weather Matters Most

    Ultimately, the amount of sea ice present when its extent bottoms out at summer’s end depends disproportionately on the patterns of wind and of air and water temperatures in the months just before the minimum is reached, Nicolaus and his colleagues explained. “The Transpolar Drift Stream, a well-known [ocean] current in the Arctic Ocean, will be carrying the majority of the thick, perennial ice currently located off the northern coasts of Greenland and Canada through the Fram Strait to the North Atlantic. These thick floes will then be followed by thin ice,” said sea ice physicist Stefan Hendricks, also of AWI. “If weather conditions turn out to be unfavorable, we might even be facing a new record low.”

    Julienne Stroeve, a senior research scientist with the U.S. National Snow and Ice Data Center (NSIDC) in Boulder, Colo., who wasn’t involved in the AWI analysis, agreed. “We are preconditioned for a big melt year if the weather patterns favor melt,” Stroeve told Eos. “Unfortunately, we cannot predict the weather, so it remains to be seen. However, if we have a relatively warm summer and a circulation pattern that promotes more ice flow out of Fram Strait, we certainly could be looking at a new record low this summer.”

    Ice-Free Arctic?

    Other sea ice research presented at the EGU meeting took a look further ahead to times when the Arctic could see up to seven ice-free months each year if greenhouse gas emissions are not cut.

    One effect of that diminished ice cover could be bigger waves in Arctic seas, according to Mikhail Dobrynin, an oceanographer at Universität Hamburg’s Center for Earth System Research and Sustainability in Germany. Dobrynin presented a new model for wave patterns in the Arctic based on projections of wind and ice conditions.

    His simulation showed that with less ice and longer reaches of open water, winds can build waves to greater size and strength. In turn, those more powerful waves could break up the sea ice faster and further erode already fragile coastal areas in the region.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 12:07 pm on April 23, 2016
    Tags: Applied Research & Technology

    From Physics: “Q&A: Keeping a Watchful Eye on Earth” 

    Physics

    Anna Hogg

    Andrew Shepherd explains how he uses data from satellites to study polar ice and describes what it’s like to work in the politically charged field of climate science.

    From the baking hot savannahs of Africa to the icy cold wastelands of Greenland and Antarctica, Andrew Shepherd has worked in some of the most extreme environments on Earth. In college, he studied astrophysics, and flirted with the idea of pursuing it as a career. But a professor’s warning that few of his classmates would find a permanent job in that field turned him off. Instead he took advantage of a new department of Earth observation science at Leicester University in the UK to follow a career studying our planet’s climate. Now, rather than pointing satellites towards space to observe the stars, Shepherd flips them around to monitor the Earth. He has studied the arid land in Zimbabwe and the ice sheets at Earth’s poles. (From his fieldwork in these places, Shepherd has concluded that it is far better to bundle up warm for the cold than to boil in the heat.) As the director of the Centre for Polar Observation and Modeling in the UK and a professor at Leeds University, Shepherd also has a hand in designing and building new satellites. Physics spoke to Shepherd to learn more about his work.

    –Katherine Wright

    Your current focus is measuring changes in the amount of ice stored in Antarctica and Greenland. How did you get involved in that?

    There are dozens of estimates for how much ice is being lost from the polar ice sheets, some of which my group has produced. But climate scientists and policy makers don’t want to pick and choose between different estimates; they need a single, authoritative one. I worked with the European Space Agency (ESA) and the National Aeronautics and Space Administration (NASA), and the world’s leading experts, to pull together all the satellite measurements and deliver a single assessment of polar ice sheet losses. The project, called IMBIE—the Ice Sheet Mass Balance Inter-comparison Exercise—has been really well received. Now the space agencies want us to generate yearly assessments of ice sheet loss to chart its impact on global sea-level rise.
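
    As an illustration of what “a single, authoritative estimate” can mean statistically, independent estimates of the same mass balance can be combined with an inverse-variance weighted mean. This is a generic sketch with made-up numbers, not the actual IMBIE methodology or its data:

```python
def combine(estimates, uncertainties):
    """Inverse-variance weighted mean of several estimates, and its 1-sigma uncertainty."""
    weights = [1.0 / (s * s) for s in uncertainties]
    total_w = sum(weights)
    mean = sum(w * x for w, x in zip(weights, estimates)) / total_w
    sigma = (1.0 / total_w) ** 0.5
    return mean, sigma

# Hypothetical mass-balance estimates (Gt/yr) from three different satellite techniques
estimates = [-110.0, -145.0, -125.0]
uncertainties = [30.0, 40.0, 25.0]
print(combine(estimates, uncertainties))  # roughly (-124, +/-17) Gt/yr
```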

    What techniques are used to monitor polar ice?

    People have been exploring the polar regions for centuries, but Antarctica and Greenland are simply too large to track on foot. Satellites have solved this problem. We can now measure changes in the flow, thickness, and mass of the polar ice sheets routinely from space. These data have revolutionized our understanding of how Antarctica and Greenland interact with the climate system. Although most satellite orbits don’t cover the highest latitudes, some—such as ESA’s CryoSat—have been specially designed for that purpose.

    ESA/CryoSat 2

    Unfortunately, we can’t measure everything from space. For example, the radio frequencies that we use to survey the bedrock beneath ice sheets can interfere with satellite television and telecommunications, so instead we rely on aircraft measurements.

    What questions about polar ice are you trying to answer?

    The headline science question is, how much ice is being lost from Antarctica and Greenland? It’s an important question, but there are many other things that we are interested in finding out. For example, how fast can ice sheets flow? Ask a glaciologist today and they’ll tell you that some glaciers flow at speeds greater than 15 km per year—you can sit next to Greenland’s Jacobshavn Isbrae glacier during your lunch break and watch it move; it’s that quick. But 10 years ago we thought the maximum speed was only 4 or 5 km per year. The speed is a useful piece of information because it’s an indication of how much ice [is available to] contribute to a future rise in sea level.
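
    The lunch-break remark checks out with a simple unit conversion:

```python
# Unit conversion behind the "watch it move over lunch" remark.
speed_km_per_year = 15.0
metres_per_hour = speed_km_per_year * 1000.0 / (365.25 * 24.0)
print(f"{metres_per_hour:.1f} m per hour")  # ~1.7 m/hour - visible motion over a lunch break
```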

    Your group is part of several international collaborations. What’s your experience of working with so many other people towards a common goal?

    I enjoy it. As scientists, we are able to rely on the expertise of other people; we don’t have to have the answer to every problem. In climate and Earth science, problems are often much larger than any one group, or even institution, can solve alone, so teamwork is important.

    What’s it like to work in a field that’s often in the political and media spotlight?

    This adds excitement to our work: it’s great to know that people are interested in what we do. But it also adds an element of caution. Science moves forward by people challenging what has come before them. It can be daunting to do that in climate science, because it’s easy to be labeled an extremist. If you discover glaciers that aren’t shrinking, people assume you are going against an immense body of science. If you find evidence that the future sea-level rise will be higher than the latest predictions, you get labeled an alarmist. But often the worst option is to adopt a central position. If we assume that everyone else is right, and that there is no need to change the way we look at a problem, then we can rapidly slip into a situation where our knowledge ceases to expand.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).

     
  • richardmitnick 11:58 am on April 23, 2016
    Tags: Applied Research & Technology, Europe to bet up to €1 billion on quantum technology

    From AAAS: “Europe to bet up to €1 billion on quantum technology” 

    AAAS

    Apr. 22, 2016
    Kai Kupferschmidt

    Günther Oettinger, the European commissioner for digital economy, and Henk Kamp, the Dutch minister for economic affairs, visit the QuTech lab, a quantum technology laboratory in Delft, the Netherlands.

    The European Commission has picked a third research area where it hopes to have a major impact by spending a massive amount of cash. Research groups across the continent will receive up to €1 billion over the next 10 years to develop quantum technologies, which could yield anything from faster computers and very secure communication systems to ultrasensitive sensors and more precise atomic clocks.

    The project “should place Europe at the forefront of the second quantum revolution, bringing transformative advances to science, industry and society within the decade to come,” a spokesperson for the European Commission wrote in an email to ScienceInsider. “This is an exciting and ambitious effort to focus the extraordinary scientific accomplishments from Europe to develop fundamentally new technologies based on the quantum state of matter,” says David Awschalom, a physicist at the University of Chicago in Illinois who is not involved in the project.

    Two similarly ambitious schemes showering money on a single topic, called Flagship projects, have been underway in the European Union since 2014. One focuses on the study of graphene, the other on a computer model of the entire human brain. They were selected after an exhaustive high-profile beauty contest and announced with a series of media events. This time, there was no formal competition, and the project’s announcement was hidden in a short sentence in a long document this week describing plans to “digitize European industry.”

    The researchers leading the new project had tried to win a Flagship award during the first round, but didn’t even end up among the six finalists. “This proposal is much better thought through and discussed within the community than the previous one,” says one of its leaders, Ignacio Cirac of the Max Planck Institute of Quantum Optics in Garching, Germany. The time wasn’t quite ripe when the team tried last time, says Tommaso Calarco, a quantum physicist at the University of Ulm in Germany, another scientist behind the project. It was before Edward Snowden made his bombshell revelations about U.S. eavesdropping programs, and before companies like Microsoft and Google announced major investments in quantum technologies.

    At the commission’s request, Cirac, Calarco, and a few other scientists and industry experts wrote a Quantum Manifesto earlier this year making the case for a major European investment and laying out what they would do with the money. The manifesto has been signed by more than 3000 people already, and the commission has accepted the proposal. A formal announcement will be made in May at a meeting in Amsterdam.

    Anton Zeilinger, a quantum physicist at the University of Vienna, says the team has learned from the existing two Flagships—especially the Human Brain Project (HBP), which was heavily criticized for making unrealistic claims and for its autocratic management style. (The HBP was drastically reformed after hundreds of neuroscientists threatened to boycott the project.) The manifesto’s authors were careful not to overpromise, Zeilinger says. Calarco adds that the low-key announcement was deliberate: “We did not want to start off like [the HBP] did.” The Quantum Manifesto does not even mention the authors’ names. “This is not about pushing an individual or an institution into the limelight. This is about a research community,” Calarco says.

    The HBP’s specific problems aside, some have criticized the concept of E.U. Flagships as inefficient, while others have argued that the sums involved receive too little scrutiny. On top of that, concentrating so much money on a single research topic is a gamble. But “the translation from science to technology needs this scale of coordination and funding in order to be competitive and have technological impact,” Awschalom says.

    Zeilinger agrees. “With the Americans investing hundreds of millions of dollars in this area, we will not be able to keep up with a few grants” from the European Research Council, he says. European researchers are doing top-notch basic research, Zeilinger says, but the extra money is needed to make sure they also play a role in applying it. “Otherwise Europe will once again generate ideas and others will generate products out of them.”

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 11:37 am on April 23, 2016 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From Nautilus: “How Big Data Creates False Confidence” 

    Nautilus

    Apr 23, 2016
    Jesse Dunietz

    If I claimed that Americans have gotten more self-centered lately, you might just chalk me up as a curmudgeon, prone to good-ol’-days whining. But what if I said I could back that claim up by analyzing 150 billion words of text? A few decades ago, evidence on such a scale was a pipe dream. Today, though, 150 billion data points is practically passé. A feverish push for “big data” analysis has swept through biology, linguistics, finance, and every field in between.

    Although no one can quite agree how to define it, the general idea is to find datasets so enormous that they can reveal patterns invisible to conventional inquiry. The data are often generated by millions of real-world user actions, such as tweets or credit-card purchases, and they can take thousands of computers to collect, store, and analyze. To many companies and researchers, though, the investment is worth it because the patterns can unlock information about anything from genetic disorders to tomorrow’s stock prices.

    But there’s a problem: It’s tempting to think that with such an incredible volume of data behind them, studies relying on big data couldn’t be wrong. But the bigness of the data can imbue the results with a false sense of certainty. Many of them are probably bogus—and the reasons why should give us pause about any research that blindly trusts big data.

    In the case of language and culture, big data showed up in a big way in 2011, when Google released its Ngrams tool. Announced with fanfare in the journal Science, Google Ngrams allowed users to search for short phrases in Google’s database of scanned books—about 4 percent of all books ever published!—and see how the frequency of those phrases has shifted over time. The paper’s authors heralded the advent of “culturomics,” the study of culture based on reams of data and, since then, Google Ngrams has been, well, largely an endless source of entertainment—but also a goldmine for linguists, psychologists, and sociologists. They’ve scoured its millions of books to show that, for instance, yes, Americans are becoming more individualistic; that we’re “forgetting our past faster with each passing year”; and that moral ideals are disappearing from our cultural consciousness.
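
    Under the hood, an Ngrams curve is a simple normalization: the number of times a phrase appears in books from a given year, divided by the total number of words scanned for that year. A minimal sketch of that calculation, using invented counts rather than the real corpus:

```python
# Illustrative sketch of the normalization behind an Ngrams-style plot.
# All counts below are invented; the real corpus is vastly larger.

# phrase -> {year: occurrences of the phrase in books published that year}
phrase_counts = {
    "hope": {1900: 12_000, 1950: 18_000, 2000: 25_000},
}
# year -> total words scanned from books published that year
total_words = {1900: 80_000_000, 1950: 150_000_000, 2000: 400_000_000}

def relative_frequency(phrase: str, year: int) -> float:
    """Share of all scanned words in `year` accounted for by `phrase`."""
    return phrase_counts[phrase][year] / total_words[year]

for year in sorted(total_words):
    print(f"{year}: {relative_frequency('hope', year):.2e}")

# The raw counts for "hope" rise, but once each year's total is taken into
# account the relative frequency falls, the kind of downward trend the
# chart below shows.
```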

    We’re Losing Hope: An Ngrams chart for the word “hope,” one of many intriguing plots found by xkcd author Randall Munroe. If Ngrams really does reflect our culture, we may be headed for a dark place. No image credit

    The problems start with the way the Ngrams corpus was constructed. In a study published last October, three University of Vermont researchers pointed out that, in general, Google Books includes one copy of every book. This makes perfect sense for its original purpose: to expose the contents of those books to Google’s powerful search technology. From the angle of sociological research, though, it makes the corpus dangerously skewed.

    Some books, for example, end up punching below their true cultural weight: The Lord of the Rings gets no more influence than, say, Witchcraft Persecutions in Bavaria. Conversely, some authors become larger than life. From the data on English fiction, for example, you might conclude that for 20 years in the 1900s, every character and his brother was named Lanny. In fact, the data reflect how immensely prolific (but not necessarily popular) the author Upton Sinclair was: He churned out 11 novels about one Lanny Budd.

    Still more damning is the fact that Ngrams isn’t a consistent, well-balanced slice of what was being published. The same UVM study demonstrated that, among other changes in composition, there’s a marked increase in scientific articles starting in the 1960s. All this makes it hard to trust that Google Ngrams accurately reflects the shifts over time in words’ cultural popularity.

    Go Figure: “Figure” with a capital F, used mainly in captions, rose sharply in frequency through the 20th Century, suggesting that the corpus includes more technical literature over time. That may say something about society, but not much about how most of society uses words.

    Even once you get past the data sources, there’s still the thorny issue of interpretation. Sure, words like “character” and “dignity” might decline over the decades. But does that mean that people care about morality less? Not so fast, cautions Ted Underwood, an English professor at the University of Illinois, Urbana-Champaign. Conceptions of morality at the turn of the last century likely differed sharply from ours, he argues, and “dignity” might have been popular for non-moral reasons. So any conclusions we draw by projecting current associations backward are suspect.

    Of course, none of this is news to statisticians and linguists. Data and interpretation are their bread and butter. What’s different about Google Ngrams, though, is the temptation to let the sheer volume of data blind us to the ways we can be misled.

    This temptation isn’t unique to Ngrams studies; similar errors undermine all sorts of big data projects. Consider, for instance, the case of Google Flu Trends (GFT). Released in 2008, GFT would count words like “fever” and “cough” in millions of Google search queries, using them to “nowcast” how many people had the flu. With those estimates, public health officials could act two weeks before the Centers for Disease Control could calculate the true numbers from doctors’ reports.
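
    In essence this is a regression problem: fit past official flu activity against the frequencies of flu-related search terms, then apply the fitted model to the current week's queries, for which official numbers are still weeks away. The sketch below uses toy numbers and ordinary least squares; Google's actual model tracked many more terms:

```python
# Toy sketch of search-query "nowcasting" with invented numbers.
import numpy as np

# Weekly share of searches containing two flu-related terms (features).
query_freq = np.array([
    [0.010, 0.004],   # week 1: shares for terms like "fever", "cough"
    [0.015, 0.006],   # week 2
    [0.022, 0.009],   # week 3
    [0.030, 0.012],   # week 4
])
# Officially reported % of doctor visits for influenza-like illness, same weeks.
cdc_ili = np.array([1.1, 1.6, 2.4, 3.3])

# Fit a linear model with an intercept by least squares.
X = np.column_stack([np.ones(len(query_freq)), query_freq])
coef, *_ = np.linalg.lstsq(X, cdc_ili, rcond=None)

# "Nowcast" the current week, whose official figure won't arrive for ~2 weeks.
this_week = np.array([1.0, 0.026, 0.011])
print(f"estimated flu activity this week: {this_week @ coef:.2f}%")
```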

    Initially, GFT was claimed to be 97 percent accurate. But as a study out of Northeastern University documents, that accuracy was a fluke. First, GFT completely missed the “swine flu” pandemic in the spring and summer of 2009. (It turned out that GFT was largely predicting winter.) Then, the system began to overestimate flu cases. In fact, it overshot the peak 2013 numbers by a whopping 140 percent. Eventually, Google just retired the program altogether.

    So what went wrong? As with Ngrams, people didn’t carefully consider the sources and interpretation of their data. The data source, Google searches, was not a static beast. When Google started auto-completing queries, users started just accepting the suggested keywords, distorting the searches GFT saw. On the interpretation side, GFT’s engineers initially let GFT take the data at face value; almost any search term was treated as a potential flu indicator. With millions of search terms, GFT was practically guaranteed to over-interpret seasonal words like “snow” as evidence of flu.

    But when big data isn’t seen as a panacea, it can be transformative. Several groups, like Columbia University researcher Jeffrey Shaman’s, for example, have outperformed the flu predictions of both the CDC and GFT by using the former to compensate for the skew of the latter. “Shaman’s team tested their model against actual flu activity that had already occurred during the season,” according to the CDC. By taking the immediate past into consideration, Shaman and his team fine-tuned their mathematical model to better predict the future. All it takes is for teams to critically assess their assumptions about their data.
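
    As a rough illustration of that idea (only an illustration; Shaman's group uses far more sophisticated epidemic models), the most recent weeks where both the nowcast and the delayed official figure are known can be used to estimate and strip out the nowcast's bias:

```python
# Toy sketch: correct a biased nowcast using weeks where both the nowcast
# and the official (delayed) figure are already known. Numbers are invented.

past_nowcasts = [2.0, 2.6, 3.4]   # what the big-data model estimated
past_official = [1.4, 1.9, 2.5]   # what was later officially reported

# Average multiplicative bias over the overlap period.
bias = sum(n / o for n, o in zip(past_nowcasts, past_official)) / len(past_nowcasts)

raw_nowcast_this_week = 4.1
corrected = raw_nowcast_this_week / bias
print(f"bias factor: {bias:.2f}, corrected nowcast: {corrected:.2f}%")
```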

    Lest I sound like a Google-hater, I hasten to add that the company is far from the only culprit. My wife, an economist, used to work for a company that scraped the entire Internet for job postings and aggregated them into statistics for state labor agencies. The company’s managers boasted that they analyzed 80 percent of the jobs in the country, but once again, the quantity of data blinded them to the ways it could be misread. A local Walmart, for example, might post one sales associate job when it actually wants to fill ten, or it might leave a posting up for weeks after it was filled.

    So rather than succumb to “big data hubris,” the rest of us would do well to keep our skeptic hats on—even when someone points to billions of words.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     