Tagged: Applied Research & Technology

  • richardmitnick 7:15 am on February 21, 2017 Permalink
    Tags: Applied Research & Technology, Chemosynthesis, Strange Life Has Been Found Trapped Inside These Giant Cave Crystals

    From Science Alert: “Strange Life Has Been Found Trapped Inside These Giant Cave Crystals” 


    Science Alert

    20 FEB 2017

    Alexander Van Driessche/Wikipedia

    A NASA scientist just woke them up.

    Strange microbes have been found inside the massive, subterranean crystals of Mexico’s Naica Mine, and researchers suspect they’ve been living there for up to 50,000 years.

    The ancient creatures appear to have been dormant for thousands of years, surviving in tiny pockets of liquid within the crystal structures. Now, scientists have managed to extract them – and wake them up.

    “These organisms are so extraordinary,” astrobiologist Penelope Boston, director of the NASA Astrobiology Institute, said on Friday at the annual meeting of the American Association for the Advancement of Science (AAAS) in Boston.

    The Cave of Crystals in Mexico’s Naica Mine might look incredibly beautiful, but it’s one of the most inhospitable places on Earth, with temperatures ranging from 45 to 65°C (113 to 149°F), and humidity levels hitting more than 99 percent.

    Not only are temperatures hellishly high, but the environment is also oppressively acidic, and confined to pitch-black darkness some 300 metres (1,000 feet) below the surface.

    Peter Williams/Flickr

    Without any sunlight, microbes inside the cave can’t photosynthesise – instead, they perform chemosynthesis using minerals like iron and sulphur in the giant gypsum crystals, some of which stretch 11 metres (36 feet) long, and have been dated to half a million years old.

    Researchers have previously found life living inside the walls of the cavern and nearby the crystals – a 2013 expedition to Naica reported the discovery of creatures thriving in the hot, saline springs of the complex cave system.

    But when Boston and her team extracted liquid from the tiny gaps inside the crystals and sent the samples off to be analysed, they realised that not only was there life inside, but it was unlike anything they’d seen in the scientific record.

    They suspect the creatures had been living inside their crystal castles for somewhere between 10,000 and 50,000 years, and while their bodies had mostly shut down, they were still very much alive.

    “Other people have made longer-term claims for the antiquity of organisms that were still alive, but in this case these organisms are all very extraordinary – they are not very closely related to anything in the known genetic databases,” Boston told Jonathan Amos at BBC News.

    What’s perhaps most extraordinary about the find is that the researchers were able to ‘revive’ some of the microbes, and grow cultures from them in the lab.

    “Much to my surprise we got things to grow,” Boston told Sarah Knapton at The Telegraph. “It was laborious. We lost some of them – that’s just the game. They’ve got needs we can’t fulfil.”

    At this point, we should be clear that the discovery has yet to be published in a peer-reviewed journal, so until other scientists have had a chance to examine the methodology and findings, we can’t consider the discovery to be definitive just yet.

    The team will also need to convince the scientific community that the findings aren’t the result of contamination – these microbes are invisible to the naked eye, which means it’s possible that they attached themselves to the drilling equipment and made it look like they came from inside the crystals.

    “I think that the presence of microbes trapped within fluid inclusions in Naica crystals is in principle possible,” Purificación López-García from the French National Centre for Scientific Research, who was part of the 2013 study that found life in the cave springs, told National Geographic.

    “[But] contamination during drilling with microorganisms attached to the surface of these crystals or living in tiny fractures constitutes a very serious risk,” she says. “I am very skeptical about the veracity of this finding until I see the evidence.”

    That said, microbiologist Brent Christner from the University of Florida in Gainesville, who was also not involved in the research, thinks the claim isn’t as far-fetched as López-García is making it out to be, based on what previous studies have managed with similarly ancient microbes.

    “[R]eviving microbes from samples of 10,000 to 50,000 years is not that outlandish based on previous reports of microbial resuscitations in geological materials hundreds of thousands to millions of years old,” he told National Geographic.

    For their part, Boston and her team say they took every precaution to make sure their gear was sterilised, and cite the fact that the creatures they found inside the crystals were similar, but not identical to those living elsewhere in the cave as evidence to support their claims.

    “We have also done genetic work and cultured the cave organisms that are alive now and exposed, and we see that some of those microbes are similar but not identical to those in the fluid inclusions,” she said.

    Only time will tell if the results will bear out once they’re published for all to see, but if they are confirmed, it’s just further proof of the incredible hardiness of life on Earth, and points to what’s possible out there in the extreme conditions of space.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 10:45 am on February 20, 2017 Permalink
    Tags: Applied Research & Technology, GISS global climate models over the years

    From Goddard: “Forcings in GISS Climate Models” 

    NASA Goddard Banner
    NASA Goddard Space Flight Center

    Dr. Makiko Sato
    Dr. Gavin Schmidt.

    We summarize here forcing datasets used in GISS global climate models over the years. Note that the forcings are estimates that may be revised as new information or better understandings of the source data become available. We archive both our current best estimates of the forcings, along with complete sets of forcings used in specific studies. All radiative forcings are with respect to a specified baseline (often conditions in 1850 or 1750).

    Forcings can be specified in a number of different ways. Traditionally, forcings have been categorised based on specific components in the radiative transfer calculation (concentrations of greenhouse gases, aerosols, surface albedo changes, solar irradiance, etc.). More recently, attribution of forcings has been made via specific emissions (which may have impacts on multiple atmospheric components) or by processes (such as deforestation) that impact multiple terms at once (e.g., Shindell et al., 2009).
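    For the well-mixed greenhouse gases, concentration-based forcings are often approximated by simple analytic fits. As a minimal sketch (not GISS code), the widely used simplified expression for CO2 from Myhre et al. (1998) is F = 5.35 ln(C/C0) W/m², with C0 a preindustrial baseline concentration:

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Simplified radiative forcing of CO2 (Myhre et al., 1998), in W/m^2.

    c_ppm: CO2 concentration in ppm; c0_ppm: baseline concentration
    (278 ppm preindustrial is an assumption for this example).
    """
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling CO2 gives the canonical ~3.7 W/m^2:
print(round(co2_forcing(2 * 278.0), 2))
```

    Note that fits like this only approximate the full radiative transfer calculation used in the models.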

    Additionally, the definition of how to specify a forcing can also vary. A good description of these definitions and their differences can be found in Hansen et al. (2005). Earlier studies tend to use either the instantaneous radiative imbalance at the tropopause (Fi), or very similarly, the radiative imbalance at the Top-of-the-Atmosphere (TOA) after stratospheric adjustments — the adjusted forcing (Fa). More recently, the concept of an ‘Effective Radiative Forcing’ (Fs) has become more prevalent, a definition which includes a number of rapid adjustments to the imbalance, not just the stratospheric temperatures. For some constituents, these differences are slight, but for some others (particularly aerosols) they can be significant.

    In order to compare radiative forcings, one also needs to adjust for the efficacy of the forcing relative to some standard, usually the response to increasing CO2. This is designed to adjust for particular geographical features in the forcing that might cause one forcing to trigger larger or smaller feedbacks than another. Applying the efficacies allows the sum of the predicted impacts of the individual forcings to closely match the net impact of all of them combined. This is denoted Fe in the Hansen description. Efficacies can depend on the specific context (i.e., they might be different for a very long-term simulation compared to a short-term transient simulation) and don’t necessarily disappear with use of the different forcing definitions above.
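    As a toy illustration of how efficacies combine multiple forcings into an effective total (the numbers below are made up for the example, not actual GISS values):

```python
# Illustrative forcings (W/m^2) and efficacies relative to CO2.
# These are example numbers, not values from the GISS datasets.
forcings = {"CO2": 1.66, "CH4": 0.48, "black_carbon": 0.40}
efficacies = {"CO2": 1.00, "CH4": 1.05, "black_carbon": 0.78}

# The effective forcing Fe scales each forcing by its efficacy,
# so the sum better predicts the combined temperature response.
fe = sum(efficacies[k] * forcings[k] for k in forcings)
print(round(fe, 3))
```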

    Quantifying the actual forcing within a global climate model is quite complicated and can depend on the baseline climate state. This is therefore an additional source of uncertainty. Within a modern complex climate model, forcings other than solar are not imposed as energy flux perturbations. Rather, the flux perturbations are diagnosed after the specific physical change is made. Estimates of forcings for solar, volcanic and well-mixed GHGs derived from simpler models may be different from the effect in a GCM. More heterogeneous forcings (aerosols, ozone, land use, etc.) are most often diagnosed from the GCMs directly.
    Forcings in the CMIP5 Simulations

    Fig. Instantaneous radiative forcing at the tropopause (W/m2) in the E2-R NINT ensemble. (a) Individual forcings and (b) Total forcing, along with the separate sums of natural (solar, volcanic and orbital) and anthropogenic forcings. (Updated: 3/12/2016)

    Calculations and descriptions of the forcings in the GISS CMIP5 simulations (1850-2012) can be found in Miller et al. (2014). Data for these figures are available here and here. (Note: the iRF figure and values were corrected on 3/12/2016 to account for a missing forcing in the ‘all forcings’ case; Fig. 4 in Miller et al. (2014) was also updated.) Snapshots of the ERF (Fs) and adjusted forcings (Fa) from these simulations are also available. Note that the forcings from 2000 (or 2005 in some cases) are extrapolations taken from the RCP scenarios, and the real world has diverged slightly from them.

    Further estimates of the responses, including temperatures and the ocean heat content changes, and efficacies are available in the supplementary material associated with Marvel et al. (2016).

    Forcings in Hansen et al. (2011)

    The following chart of forcings from 1880-2011 is taken from Hansen et al. (2011):


    Data is updated from the CMIP3 studies below (e.g., Hansen et al. 2007a, b) and extended to 2011 using assumptions outlined in the paper. The separate radiative forcing data (Fe) are available here (Net forcing). The figures are also available as PDFs here and here.

    Forcings in the CMIP3 simulations

    The following chart of forcings from 1750-2000 is taken from Hansen et al. (2005):


    The figure is also available in PDF format (Source: Figure 28 of Hansen et al. (2005)). More details, including maps and timeseries of individual forcings, are available on the Efficacy web pages.

    See the full article here.


    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.

    NASA Goddard campus (NASA image)

  • richardmitnick 10:18 am on February 20, 2017 Permalink
    Tags: Applied Research & Technology, The Real Surprise Behind the 3rd Hottest January on Record

    From AGU: “The Real Surprise Behind the 3rd Hottest January on Record” 

    AGU bloc

    American Geophysical Union

    18 February 2017
    Dan Satterfield

    The planet’s temperature oscillates a little between El Nino events and La Nina events. El Ninos warm the planet a few tenths of a degree, while La Nina events cool it by about that much. The stronger the event, the bigger the effect, so a strong El Nino makes it more likely that we will see a new hottest month on record, while a strong La Nina makes that less likely.

    All of this is happening as the Earth steadily warms due to increasing greenhouse gases, and that is what makes the past few months’ global temperature reports so interesting. We’ve had a La Nina over the past few months and it has only just faded away. In spite of that, this January was the third hottest on record. We are now seeing hotter global temperatures during La Nina events than we saw during El Nino events in the past. This January was notably warmer than the January of the super El Nino of 1997-98!

    The graphic below (courtesy of Climate Central) shows the up and down of El Nino/La Nina years and the steady rise of global temperatures due to increasing greenhouse gases. Despite what the head of the EPA may think, there is no scientific doubt about this. The only other possible explanations are changes in the energy received from the sun or in the planet’s reflectivity. Research shows that air pollution, however, is blocking enough of the sun’s energy to slow down some of the greenhouse warming. You may hear skeptics talk about the Earth going through “cycles”, and it does. Orbital changes over thousands of years do indeed change our incoming radiation (that’s where ice ages come from), but we know enough to rule out everything but the greenhouse gases. We know where the warming is coming from.
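    The superposition the article describes, ENSO wiggles riding on a steady greenhouse-driven rise, can be sketched with synthetic numbers (entirely illustrative; the trend, amplitude and period below are assumptions, not real temperature data):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1970, 2017)
trend = 0.017 * (years - 1970)  # assumed ~0.17 C/decade warming
# ENSO-like oscillation: a few tenths of a degree, few-year period.
enso = 0.12 * np.sin(2 * np.pi * (years - 1970) / 4.5)
anomaly = trend + enso + rng.normal(0, 0.03, years.size)

# A linear fit recovers the underlying trend despite the oscillation,
# which is why record La Nina years can now beat old El Nino years.
slope = np.polyfit(years, anomaly, 1)[0]
print(f"fitted trend: {slope * 10:.2f} C/decade")
```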

    It’s not El Nino, and it’s not some unknown cycle.
    It’s us.

    The current radiation balance of the planet is shown below. I’ve posted this before but it’s really worth a hard look.

    From Hansen 2011. Click image for the paper. Also see here.

    See the full article here.


    The purpose of the American Geophysical Union is to promote discovery in Earth and space science for the benefit of humanity.

    To achieve this mission, AGU identified the following core values and behaviors.

    Core Principles

    As an organization, AGU holds a set of guiding core values:

    The scientific method
    The generation and dissemination of scientific knowledge
    Open exchange of ideas and information
    Diversity of backgrounds, scientific ideas and approaches
    Benefit of science for a sustainable future
    International and interdisciplinary cooperation
    Equality and inclusiveness
    An active role in educating and nurturing the next generation of scientists
    An engaged membership
    Unselfish cooperation in research
    Excellence and integrity in everything we do

    When we are at our best as an organization, we embody these values in our behavior as follows:

    We advance Earth and space science by catalyzing and supporting the efforts of individual scientists within and outside the membership.
    As a learned society, we serve the public good by fostering quality in the Earth and space science and by publishing the results of research.
    We welcome all in academic, government, industry and other venues who share our interests in understanding the Earth, planets and their space environment, or who seek to apply this knowledge to solving problems facing society.
    Our scientific mission transcends national boundaries.
    Individual scientists worldwide are equals in all AGU activities.
    Cooperative activities with partner societies of all sizes worldwide enhance the resources of all, increase the visibility of Earth and space science, and serve individual scientists, students, and the public.
    We are our members.
    Dedicated volunteers represent an essential ingredient of every program.
    AGU staff work flexibly and responsively in partnership with volunteers to achieve our goals and objectives.

  • richardmitnick 9:44 am on February 20, 2017 Permalink
    Tags: Applied Research & Technology, Carbon fiber in Australia

    From CSIRO: “Carbon fibre makes Australian debut” 

    CSIRO bloc

    Commonwealth Scientific and Industrial Research Organisation

    20 Feb 2017
    Chris Still

    Australia for the first time has the capacity to produce carbon fibre from scratch and at scale, thanks to CSIRO and Deakin University.

    Image: CSIRO

    The “missing link” in Australia’s carbon fibre capability, a wet spinning line (above), has been launched today in a ceremony at Waurn Ponds just outside Geelong.

    Carbon fibre combines high rigidity, tensile strength and chemical resistance with low weight and is used in aerospace, civil engineering, the military, cars, and also in competitive sports.

    Only a handful of companies around the world can create carbon fibre, each using their own secret recipe.

    To join this elite club CSIRO and Deakin researchers had to crack the code.

    In doing so, using patented CSIRO technology, they’ve created what could be the next generation of carbon fibre that is stronger and of a higher quality.

    Director of CSIRO Future Industries, Dr Anita Hill, said the development was an important milestone.

    “This facility means Australia can carry out research across the whole carbon fibre value chain: from molecules, to polymers, to fibre, to finished composite parts,” Dr Hill said.

    “Together with Deakin, we’ve created something that could disrupt the entire carbon fibre manufacturing industry.”

    Deakin University Vice-Chancellor, Professor Jane den Hollander AO said the development is a great example of what Deakin and CSIRO could achieve together, for the benefit of all of Australia.

    “Our two organisations share a long-standing and distinguished bond, one that our new Strategic Relationship Agreement (SRA) deepens even further,” Professor den Hollander said.

    “Together, we’re conducting industry focussed research with a profound and lasting impact, from the communities we serve, through to the world.”

    The wet spinning line machinery takes a sticky mix of precursor chemicals and turns it into five hundred individual strands of fibre, each thinner than a human hair.

    They’re then wound onto a spool to create a tape and taken next door to the massive carbonisation ovens to create the finished carbon fibre.

    The CSIRO/Deakin wet spinning line was custom built by an Italian company with input from the organisations’ own researchers.

    The company liked the design so much it made another for its own factory, and the CSIRO/Deakin machine has been described as “the Ferrari of wet spinning lines”.

    Assistant Minister for Industry, Innovation and Science the Honourable Craig Laundy MP officially launched the facility.

    “This is a great example of how collaboration in the Australian research sector can accelerate research, lead innovation and provide new job opportunities,” Mr Laundy said.

    “Geelong already has a global reputation for industrial innovation. Initiatives such as this enhance that standing.”

    See the full article here.


    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

  • richardmitnick 8:57 am on February 20, 2017 Permalink
    Tags: Applied Research & Technology, Man-made earthquakes

    From The Conversation: “Earthquakes triggered by humans pose growing risk” 

    The Conversation

    January 22, 2017
    No writer credit found

    Devastation in Sichuan province after the 2008 Wenchuan earthquake, thought to be induced by industrial activity at a nearby reservoir. dominiqueb/flickr

    People knew we could induce earthquakes before we knew what they were. As soon as people started to dig minerals out of the ground, rockfalls and tunnel collapses must have become recognized hazards.

    Today, earthquakes caused by humans occur on a much greater scale. Events over the last century have shown mining is just one of many industrial activities that can induce earthquakes large enough to cause significant damage and death. Filling of water reservoirs behind dams, extraction of oil and gas, and geothermal energy production are just a few of the modern industrial activities shown to induce earthquakes.

    As more and more types of industrial activity were recognized to be potentially seismogenic, the Nederlandse Aardolie Maatschappij BV, an oil and gas company based in the Netherlands, commissioned us to conduct a comprehensive global review of all human-induced earthquakes.

    Our work assembled a rich picture from the hundreds of jigsaw pieces scattered throughout the national and international scientific literature of many nations. The sheer breadth of industrial activity we found to be potentially seismogenic came as a surprise to many scientists. As the scale of industry grows, the problem of induced earthquakes is increasing also.

    In addition, we found that, because small earthquakes can trigger larger ones, industrial activity has the potential, on rare occasions, to induce extremely large, damaging events.

    How humans induce earthquakes

    As part of our review we assembled a database of cases that is, to our knowledge, the fullest drawn up to date. On Jan. 28, we will release this database publicly. We hope it will inform citizens about the subject and stimulate scientific research into how to manage this very new challenge to human ingenuity.

    Our survey showed mining-related activity accounts for the largest number of cases in our database.

    Earthquakes caused by humans

    Last year, the Nederlandse Aardolie Maatschappij BV commissioned a comprehensive global review of all human-induced earthquakes. The sheer breadth of industrial activity that is potentially seismogenic came as a surprise to many scientists. These examples are now catalogued at The Induced Earthquakes Database.

    Mining 37.4%
    Water reservoir impoundment 23.3%
    Conventional oil and gas 15%
    Geothermal 7.8%
    Waste fluid injection 5%
    Fracking 3.9%
    Nuclear explosion 3%
    Research experiments 1.8%
    Groundwater extraction 0.7%
    Construction 0.3%
    Carbon capture and storage 0.3%

    Source: Earth-Science Reviews

    Initially, mining technology was primitive. Mines were small and relatively shallow. Collapse events would have been minor – though this might have been little comfort to anyone caught in one.

    But modern mines exist on a totally different scale. Precious minerals are extracted from mines that may be over two miles deep or extend several miles offshore under the oceans. The total amount of rock removed by mining worldwide now amounts to several tens of billions of tons per year. That’s double what it was 15 years ago – and it’s set to double again over the next 15. Meanwhile, much of the coal that fuels the world’s industry has already been exhausted from shallow layers, and mines must become bigger and deeper to satisfy demand.
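    The doubling the article cites, twice the tonnage of 15 years ago and set to double again, implies an annual growth rate of roughly ln 2 / 15 per year. A quick arithmetic check (the doubling time itself is taken from the article; the rest is just the standard exponential-growth relation):

```python
import math

doubling_years = 15
# Growth rate r such that (1 + r)^doubling_years == 2.
annual_growth = math.exp(math.log(2) / doubling_years) - 1
print(f"implied mining growth: {annual_growth:.1%} per year")
```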

    As mines expand, mining-related earthquakes become bigger and more frequent. Damage and fatalities, too, scale up. Hundreds of deaths have occurred in coal and mineral mines over the last few decades as a result of earthquakes up to magnitude 6.1 that have been induced.

    Other activities that might induce earthquakes include the erection of heavy superstructures. The 700,000-ton Taipei 101 building, erected in Taiwan in the early 2000s, was blamed for the increasing frequency and size of nearby earthquakes.

    Since the early 20th century, it has been clear that filling large water reservoirs can induce potentially dangerous earthquakes. This came into tragic focus in 1967 when, just five years after the 32-mile-long Koyna reservoir in west India was filled, a magnitude 6.3 earthquake struck, killing at least 180 people and damaging the dam.

    Throughout the following decades, ongoing cyclic earthquake activity accompanied rises and falls in the annual reservoir-level cycle. An earthquake larger than magnitude 5 occurs there on average every four years. Our report found that, to date, some 170 reservoirs the world over have reportedly induced earthquake activity.

    Magnitude of human-induced earthquakes

    The magnitudes of the largest earthquakes postulated to be associated with projects of different types vary greatly. This graph shows the number of cases reported for projects of various types vs. maximum earthquake magnitude for the 577 cases for which data are available.

    *”Other” category includes carbon capture and storage, construction, groundwater extraction, nuclear explosion, research experiments, and unspecified oil, gas and waste water.

    Source: Earth-Science Reviews [links are above]

    The production of oil and gas was implicated in several destructive earthquakes in the magnitude 6 range in California. This industry is becoming increasingly seismogenic as oil and gas fields become depleted. In such fields, in addition to mass removal by production, fluids are also injected to flush out the last of the hydrocarbons and to dispose of the large quantities of salt water that accompany production in expiring fields.

    A relatively new technology in oil and gas is shale-gas hydraulic fracturing, or fracking, which by its very nature generates small earthquakes as the rock fractures. Occasionally, this can lead to a larger-magnitude earthquake if the injected fluids leak into a fault that is already stressed by geological processes.

    The largest fracking-related earthquake that has so far been reported occurred in Canada, with a magnitude of 4.6. In Oklahoma, multiple processes are underway simultaneously, including oil and gas production, wastewater disposal and fracking. There, earthquakes as large as magnitude 5.7 have rattled skyscrapers that were erected long before such seismicity was expected. If such an earthquake is induced in Europe in the future, it could be felt in the capital cities of several nations.

    Our research shows that production of geothermal steam and water has been associated with earthquakes up to magnitude 6.6 in the Cerro Prieto Field, Mexico. Geothermal energy is not renewable by natural processes on the timescale of a human lifetime, so water must be reinjected underground to ensure a continuous supply. This process appears to be even more seismogenic than production. There are numerous examples of earthquake swarms accompanying water injection into boreholes, such as at The Geysers, California.

    What this means for the future

    Nowadays, earthquakes induced by large industrial projects no longer meet with surprise or even denial. On the contrary, when an event occurs, the tendency may be to look for an industrial project to blame. In 2008, an earthquake in the magnitude 8 range struck Ngawa Prefecture, China, killing about 90,000 people, devastating over 100 towns, and collapsing houses, roads and bridges. Attention quickly turned to the nearby Zipingpu Dam, whose reservoir had been filled just a few months previously, although the link between the earthquake and the reservoir has yet to be proven.

    The minimum amount of stress loading scientists think is needed to induce earthquakes is creeping steadily downward. The great Three Gorges Dam in China, which now impounds 10 cubic miles of water, has already been associated with earthquakes as large as magnitude 4.6 and is under careful surveillance.

    Scientists are now presented with some exciting challenges. Earthquakes can produce a “butterfly effect”: Small changes can have a large impact. Thus, not only can a plethora of human activities load Earth’s crust with stress, but just tiny additions can become the last straw that breaks the camel’s back, precipitating great earthquakes that release the accumulated stress loaded onto geological faults by centuries of geological processes. Whether or when that stress would have been released naturally in an earthquake is a challenging question.

    An earthquake in the magnitude 5 range releases as much energy as the atomic bomb dropped on Hiroshima in 1945. An earthquake in the magnitude 7 range releases as much energy as the largest nuclear weapon ever tested, the Tsar Bomba detonated by the Soviet Union in 1961. The risk of inducing such earthquakes is extremely small, but the consequences if one were to happen are extremely large. This poses a health and safety issue that may be unique in industry for the maximum size of disaster that could, in theory, occur. However, rare and devastating earthquakes are a fact of life on our dynamic planet, regardless of whether or not there is human activity.
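    These energy comparisons can be checked against the standard Gutenberg-Richter relation for radiated seismic energy, log10 E = 1.5 M + 4.8 with E in joules. It is a rough rule of thumb, and radiated seismic energy is not the same as a bomb’s total yield, so the equivalences should be read as order-of-magnitude only:

```python
def radiated_energy_joules(magnitude):
    """Radiated seismic energy from the Gutenberg-Richter relation."""
    return 10 ** (1.5 * magnitude + 4.8)

# Bomb yields converted to joules (1 kt TNT ~ 4.184e12 J).
hiroshima_j = 15e3 * 4.184e9    # ~15 kt TNT
tsar_bomba_j = 50e6 * 4.184e9   # ~50 Mt TNT

for m in (5.0, 6.0, 7.0):
    print(f"M{m}: {radiated_energy_joules(m):.1e} J")
print(f"Hiroshima: {hiroshima_j:.1e} J, Tsar Bomba: {tsar_bomba_j:.1e} J")
```

    Each whole magnitude step multiplies the radiated energy by about 32, which is why a magnitude 7 event is so much more destructive than a magnitude 5.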

    Our work suggests that the only evidence-based way to limit the size of potential earthquakes may be to limit the scale of the projects themselves. In practice, this would mean smaller mines and reservoirs, fewer minerals and less oil and gas extracted from fields, shallower boreholes and smaller volumes injected. A balance must be struck between the growing need for energy and resources and the level of risk that is acceptable in every individual project.

    See the full article here.


    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

  • richardmitnick 1:35 pm on February 18, 2017 Permalink
    Tags: Applied Research & Technology, Breakthrough in understanding heat transport with a chain of gold atoms, Wiedemann-Franz law

    From phys.org: “Breakthrough in understanding heat transport with a chain of gold atoms” 


    February 17, 2017

    Artists’ view of the quantized thermal conductance of an atomically thin gold contact. Credit: Enrique Sahagun

    The precise control of electron transport in microelectronics makes possible the complex logic circuits in daily use in smartphones and laptops. Heat transport is of similar fundamental importance, and its control is necessary, for instance, to efficiently cool ever smaller chips. An international team including theoretical physicists from Konstanz, Junior Professor Fabian Pauly and Professor Peter Nielaba and their staff, has achieved a real breakthrough in better understanding heat transport at the nanoscale. The team used a system that experimentalists in nanoscience can nowadays realize quite routinely and that keeps serving as the “fruit fly” for breakthrough discoveries: a chain of gold atoms. They used it to demonstrate the quantization of the electronic part of the thermal conductance. The study also shows that the Wiedemann-Franz law, a relation from classical physics, remains valid down to the atomic level. The results were published in the scientific journal Science on 16 February 2017.

    To begin with, the test object is a microscopic gold wire. This wire is pulled until its cross section is only one atom wide and a chain of gold atoms forms, before it finally breaks. The physicists send electric current through this atomic chain, that is, through the thinnest wire conceivable. With the help of different theoretical models the researchers can predict the conductance value of the electric transport, and also confirm it by experiment. This electric conductance value indicates how much charge current flows when an electrical voltage is applied. The thermal conductance, which indicates the amount of heat flow for a given difference in temperature, had not previously been measured for such atomic wires.

    Now the question was whether the Wiedemann-Franz law, which states that the electrical conductance and the thermal conductance are proportional to each other, also remains valid at the atomic scale. Generally, electrons as well as atomic oscillations (also called vibrations or phonons) contribute to heat transport. At the atomic level, quantum mechanics has to be used to describe both the electron and the phonon transport. The Wiedemann-Franz law, however, only describes the relation between macroscopic electronic properties. Therefore, the researchers first had to find out how large the phonon contribution to the thermal conductance is.

    The doctoral researchers Jan Klöckner and Manuel Matt performed complementary theoretical calculations, which showed that the contribution of phonons to heat transport in atomically thin gold wires is usually less than ten percent, and thus not decisive. At the same time, the simulations confirmed the applicability of the Wiedemann-Franz law. Manuel Matt used an efficient, albeit less accurate, method that provided statistical results for many gold-wire stretching events to calculate the electronic part of the thermal conductance, while Jan Klöckner applied density functional theory to estimate the electronic and phononic contributions in individual contact geometries. The quantization of the thermal conductance in gold chains, as proven by experiment, ultimately results from the combination of three factors: the quantization of the electrical conductance in units of the so-called conductance quantum (twice the inverse Klitzing constant, 2e²/h), the negligible role of phonons in heat transport, and the validity of the Wiedemann-Franz law.
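    The three ingredients can be combined in a back-of-the-envelope calculation. The sketch below is not from the paper; it simply evaluates the textbook formulas with fundamental constants to show how the Wiedemann-Franz law turns the conductance quantum into a quantized thermal conductance:

```python
import math

# Back-of-the-envelope check of the quantities named above (not from the
# paper; just the textbook formulas evaluated with CODATA constants).
h = 6.62607015e-34    # Planck constant, J*s
e = 1.602176634e-19   # elementary charge, C
kB = 1.380649e-23     # Boltzmann constant, J/K

# Electrical conductance quantum: G0 = 2e^2/h, twice the inverse Klitzing constant
G0 = 2 * e**2 / h                      # ~7.75e-5 S

# Lorenz number from the Wiedemann-Franz law: kappa / (G * T) = L0
L0 = (math.pi**2 / 3) * (kB / e)**2    # ~2.44e-8 W*Ohm/K^2

# Electronic thermal conductance of one fully open conduction channel
# (a single-atom gold chain) at room temperature
T = 300.0                              # K
kappa = L0 * T * G0                    # ~0.57 nW/K
```

    Because the phonon contribution is under ten percent, the measured thermal conductance of a single-atom gold contact should sit close to this electronic value of roughly 0.57 nW/K per fully open channel at 300 K.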

    For quite some time it has been possible to theoretically calculate, with the help of computer models such as those developed in the teams of Fabian Pauly and Peter Nielaba, how charges and heat flow through nanostructures. A highly precise experimental setup, as created by the experimental colleagues Professor Edgar Meyhofer and Professor Pramod Reddy from the University of Michigan (USA), was required to compare the theoretical predictions with measurements. In previous experiments, the signals from the heat flow through single-atom contacts were too small. The Michigan group succeeded in improving the experiment so that the actual signal can now be filtered out and measured.

    The results of the research team make it possible to study heat transport not only in atomic gold contacts but also in many other nanosystems. They offer opportunities to explore, both experimentally and theoretically, numerous fundamental quantum heat transport phenomena that might help to use energy more efficiently, for example by exploiting thermoelectricity.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

  • richardmitnick 2:23 pm on February 17, 2017 Permalink | Reply
    Tags: 10 global challenges, Applied Research & Technology, , , Stanford Catalyst for Collaborative Solutions   

    From Stanford: “Stanford Catalyst for Collaborative Solutions focuses on 10 global challenges” 

    Stanford University

    February 16, 2017
    Michael Freedman

    The Stanford Catalyst for Collaborative Solutions plans to award $12 million to four interdisciplinary teams, each committed to working in collaboration on projects that will make headway on one of 10 global challenges.

    Senior research engineer Jennifer Hicks, right, in discussion with Ilenia Battiato, assistant professor of energy resources engineering, at a workshop in the d.school designed to help faculty meet one another and start to identify common research interests as part of the Stanford Catalyst for Collaborative Solutions. (Image credit: L.A. Cicero)

    When Stanford Engineering conducted its school-wide strategic planning program two years ago, one of the main outcomes was the identification of 10 major global challenges on which it would like to have a significant impact.

    10 Grand Challenges

    How can we ensure that humanity flourishes in the cities of the future?

    The world’s urban population is projected to increase from 3.9 billion to 6.3 billion by 2050, making up 66 percent of the entire global population. Today’s urban areas provide a disparate quality of life and quality of services to their populations, and they inflict a mostly adverse impact on our natural environment. Our challenge is to design and re-engineer our urban environments for the future to provide modern services in ways that allow humans and nature to flourish.

    How can we engineer matter from atomic to macro scales?

    The history of human civilization has always been associated with new materials. However, materials are necessary but not sufficient: They need to be affordably and safely manufactured at scale and integrated into engineered devices and systems to create value for society. We seek to engineer matter – at all scales – for affordable and sustainable energy conversion, storage and use; new ways to improve human health and quality of life; and new approaches to creating affordable, clean and drinkable water.

    How can we use autonomy to enable future technologies?

    In an era of continued industrialization, urbanization and globalization, much higher levels of autonomy in a variety of engineered systems are emerging. But the scientific, technological, legal and ethical knowledge required is not yet available to infuse higher levels of autonomy into many of these systems. Moreover, the societal implications of much higher levels of systems autonomy in our daily lives – such as the potential for significant loss in employment – are not well understood. To address such challenges and achieve effective solutions, it will be necessary to integrate engineering disciplines with expertise throughout the university.

    How can we use our strength in computation and data analysis to drive innovation?

    In recent decades, computation and data analysis (CDA) have become critically important in nearly every field of science and engineering. CDA is also increasingly widespread in medicine, the social sciences, the humanities and beyond. Our challenge is to harness domain expertise throughout the university, especially unique access to large data sets and high-performance computing, to provide opportunities for CDA-based innovation that cross traditional boundaries.

    How do we achieve effective yet affordable healthcare everywhere?

    Health care concerns pose tremendous challenges to humanity, but evolving technological trends present tremendous opportunities to address these challenges. New products and processes are emerging that will change how we deliver health care, and remote monitoring and telemedicine are creating a sea change in the role of the physician. Leveraging ongoing transformations in healthcare data, personalized medicine, and preventative care to provide low-cost, high-quality health care globally will require a new level of interdisciplinary collaboration.

    How do we create synergy between humans and engineered systems?

    Engineering exists to serve humanity, and as advances in information, communication and sensor technologies permeate our lives, the interface between us and our technology is becoming both richer and more complex. But how well do these technologies understand what we want? Our challenge is to manage the complex interface needed for technology to discover, understand and adapt to individual, social and cultural values over time.

    How do we secure everything?

    For all the good the digital revolution is producing, it also is bringing new threats and increasingly sophisticated attacks on everything from personal finances to national elections. We currently lack a deep enough understanding of how to engineer such systems securely, and yet many physical systems, once deployed, will remain in place for decades or longer. We must therefore figure out today how to ensure security into the future and how to rapidly deploy those solutions once they are developed.

    How do we sustain the exponential increase in information technology performance?

    Exponential advances in the performance, integration density and energy efficiency of computing systems fueled the information technology (IT) revolution of the 20th century. However, predicting the fate of IT systems from our current trajectory raises more questions than answers. For example, there is no clear roadmap for how we will manage the exponential growth of such data without consuming excessive amounts of power. Solving challenges such as this will require coordinated breakthroughs from materials to the underlying mathematics of computing.

    How do we provide humanity with the affordable energy it needs and stabilize the climate?

    One of the greatest challenges humanity will face this century is providing the world’s growing population and economy with the clean and affordable energy it needs. In a business-as-usual scenario, there are no solutions to provide this energy while reducing greenhouse emissions so that the global climate can be stabilized. Our challenge is to combine technology, financing, market structure, business models, policies and studies of consumer behavior to accelerate deployment of carbon-free energy generation while dramatically reducing consumption of electricity and transportation fuels.

    How good can we get at engineering living matter?

    A global research community has formed with the goal of making biology easy to engineer. We can now foresee achieving exponential improvements in our capacity to engineer living systems and more powerfully harness life’s intrinsic capacity for organizing atoms. Such capacities could be used to remake our civilization’s supply chains; open new frontiers in medicine; and enable the otherwise impossible, such as exploration on Mars. However, positive outcomes will require that ethical, political, and cultural implications of these new technologies are henceforth considered as an essential research activity alongside the science and engineering.

    But how? The issues highlighted were exceedingly complex, focusing on things like how to ensure humanity flourishes in the cities of the future and how to achieve effective yet affordable healthcare everywhere. Solving them would require not just ingenuity but the ability to bring together expertise from multiple disciplines and perspectives from Stanford, industry and the public sector.

    To help achieve this audacious goal, the school has now launched the Stanford Catalyst for Collaborative Solutions. To start, the initiative will provide up to $12 million to four teams, each of them committed to working in collaboration on projects that will make significant headway on one of the 10 grand challenges.

    This spring, Catalyst director John Dabiri and his team of advisors will identify and fund the first two projects, each receiving up to $3 million over three years, with two more teams to be identified for funding in early 2018.

    Leveraging expertise

    “This is an exciting opportunity to leverage Stanford’s expertise across all seven schools in collaborative pursuit of solutions to big challenges that are normally addressed piecemeal if at all,” said Dabiri, a professor of mechanical engineering and of civil and environmental engineering. “The Catalyst represents a bold investment by Stanford, and it has already proven to be a powerful convening force, bringing together faculty from nearly every discipline, most of whom are meeting for the first time.”

    The proposals, due March 17, will be evaluated in part on teams’ willingness to take risks and explore ideas beyond the bounds of traditional research, according to Dabiri, and in equal measure on their plan to “initiate and sustain meaningful interdisciplinary collaborations within the School of Engineering, across the university and beyond.”

    Teams must include at least one Stanford Engineering faculty member on the project leadership team. Beyond that, Dabiri said, teams should be composed of those best suited to working together to solve the problem, regardless of background. Proposals must address how the team members will interact with one another and show how each individual is integral to the success of the project.

    Jenny Suckale, assistant professor of geophysics, listens as law Professor Amalia Kessler shares during an exercise in the Catalyst seminar at the d.school. (Image credit: L.A. Cicero)

    To kickstart the initiative, Dabiri and his team recently held a series of workshops in which several dozen faculty members from throughout the university took part in collaborative exercises designed to help faculty from different schools and departments meet one another and start to identify common research interests.

    “Communicating across disciplines is not always straightforward,” said d.school Executive Director Sarah Stein Greenberg, who led the workshops and serves on the Catalyst advisory board. “So we are engaging participants with a variety of tools to help foster unexpected connections and encourage new ideas to start to flow.”

    In one warm-up exercise, Stein Greenberg asked participants to share in small groups how they’re known in their fields and how they’d like to be known. The goal, she said, was to help participants get to know one another faster than they otherwise would, give them a forum to explore their own aspirations and motivations, and introduce them to potential collaborators.

    In another exercise, she asked participants to sketch out their disciplines to show the various intersections with other areas of expertise. In small-group discussions, participants explored how their individual disciplines combined, overlapped or stood apart from other fields. Participants began to see how the complex challenges of the world are most often solved in collaboration rather than in isolation.

    “The workshop was a great opportunity to meet fascinating people from other schools and departments across the university whom I would never encounter through the normal course of my research and teaching, given how specialized we all tend to be,” said law Professor Amalia Kessler.

    Meaningful collaboration, bold risks

    At the workshops, organizers asked participants to consider in small groups and bring their own expertise to bear on the issue of how humanity can flourish in the cities of the future. “It was very interesting to see that despite our disciplinary differences,” Kessler said, “colleagues specializing in civil engineering and risk management and I ended up agreeing that a core hindrance to meaningful change of any kind is deep-rooted, institutionalized forms of structural inequality.”

    Doctoral candidate Chris Ford, Professor Larry Leifer and Assistant Professor Sindy Tang, all of the Department of Mechanical Engineering, work on one of the exercises in the Catalyst workshop. (Image credit: L.A. Cicero)

    Economist Garth Saloner, former dean of the Graduate School of Business, said that one important feature of the Catalyst initiative is that it will provide workshops, forums, conferences and dinners to facilitate the formation of cross-disciplinary teams that don’t already exist.

    “Many of the societal challenges that social scientists are interested in require a deep understanding of technology or have solutions that are in part implemented through technology. Enabling faculty in different schools who have different pieces of the puzzle to find one another will be a unique and core feature of the Catalyst model,” said Saloner, who serves on the Catalyst advisory board.

    Dabiri notes that the outcome of the projects will vary depending on the team. For some it may include development of a new technology; for others it may be the implementation of a policy mechanism. The key, he said, is that by providing funding that explicitly requires meaningful collaboration and encourages the kinds of bold risks that do not normally get funded, the Catalyst can be a model for high-impact and interdisciplinary research.

    An equally important outcome, he said, is that it will create a new network of faculty, staff and students working collaboratively to solve the world’s most urgent challenges.

    “Our primary goal at this stage is to encourage participation from all parts of the Stanford community through the program funding and other activities to come,” he said.

    See the full article here.

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

  • richardmitnick 1:31 pm on February 16, 2017 Permalink | Reply
    Tags: Applied Research & Technology, , , , Triangulene   

    From Futurism: “Scientists Have Finally Created a Molecule That Was 70 Years in the Making” 



    Neil C. Bhavsar

    Creating the Impossible

    Move over graphene, it’s 2017 and we have a new carbon structure to rave about: Triangulene. It’s one atom thick, six carbon hexagons in size, and in the shape of – you guessed it – a triangle.


    Development of the molecule eluded chemists for nearly seventy years. It was first predicted mathematically in the 1950s by the Czech-born chemist Erich Clar, who noted that the molecule would be electronically unstable due to two unpaired electrons in its structure of six fused benzene rings. Since then, the mysterious molecule has drawn generations of scientists into pursuing it, with every attempt failing because those two unpaired electrons make the molecule quick to oxidize.

    Now, IBM researchers in Zurich, Switzerland, seem to have done the impossible: they created the molecule. While most chemists build molecules from the ground up, Leo Gross and his team took the opposite approach. They started with a larger precursor molecule and removed two hydrogen atoms from it to conjure up the elusive triangulene.

    On top of this, they successfully imaged the structure with a scanning probe microscope and noted the molecule’s unexpected stability in the presence of copper. Their work is published in Nature Nanotechnology.

    This new material is already proving impressive. The two unpaired electrons of the triangulene molecule were found to have aligned spins, granting the molecule magnetic properties. That means triangulene has a lot of potential in electronics, specifically by allowing us to encode and process information by manipulating electron spin, a field known as spintronics.

    The IBM researchers still have a lot to learn about triangulene. Moving forward, other teams will attempt to verify whether the researchers actually created the triangle-shaped molecule. Until then, the technique the team developed could be used to make other elusive structures. It still isn’t ideal, though, as the process is slow and expensive. Even so, it could push us closer to the age of quantum computers.

    References: ScienceAlert – Latest, Nature

    See the full article here.

    Futurism covers the breakthrough technologies and scientific discoveries that will shape humanity’s future. Our mission is to empower our readers and drive the development of these transformative technologies towards maximizing human potential.

  • richardmitnick 10:56 am on February 16, 2017 Permalink | Reply
    Tags: Applied Research & Technology, , , , Using (MRI) to study the brains of infants who have older siblings with autism   

    From U Washington: “Predicting autism: Researchers find autism biomarkers in infancy” 

    University of Washington

    February 15, 2017
    No writer credit

    By using magnetic resonance imaging (MRI) to study the brains of infants who have older siblings with autism, scientists were able to correctly identify 80 percent of the babies who would be subsequently diagnosed with autism at 2 years of age.

    Researchers from the University of Washington were part of a North American effort led by the University of North Carolina to use MRI to measure the brains of “low-risk” infants, with no family history of autism, and “high-risk” infants who had at least one autistic older sibling. A computer algorithm was then used to predict autism before clinically diagnosable behaviors set in. The study was published Feb. 15 in the journal Nature.

    This is the first study to show that it is possible to use brain biomarkers to identify which infants in a high-risk pool — that is, those having an older sibling with autism — will be diagnosed with autism spectrum disorder, or ASD, at 24 months of age.

    Annette Estes, left, plays with a child at the UW Autism Center. Kathryn Sauber

    “Typically, the earliest we can reliably diagnose autism in a child is age 2, when there are consistent behavioral symptoms, and due to health access disparities the average age of diagnosis in the U.S. is actually age 4,” said co-author and UW professor of speech and hearing sciences Annette Estes, who is also director of the UW Autism Center and a research affiliate at the UW Center on Human Development and Disability, or CHDD. “But in our study, brain imaging biomarkers at 6 and 12 months were able to identify babies who would be later diagnosed with ASD.”

    The predictive power of the team’s findings may inform the development of a diagnostic tool for ASD that could be used in the first year of life, before behavioral symptoms have emerged.

    “We don’t have such a tool yet,” said Estes. “But if we did, parents of high-risk infants wouldn’t need to wait for a diagnosis of ASD at 2, 3 or even 4 years and researchers could start developing interventions to prevent these children from falling behind in social and communication skills.”

    People with ASD — which includes 3 million people in the United States — have characteristic social communication deficits and demonstrate a range of ritualistic, repetitive and stereotyped behaviors. In the United States, it is estimated that up to one out of 68 babies develops autism. But for infants with an autistic older sibling, the risk may be as high as one out of every five births.

    This research project included hundreds of children from across the country and was led by researchers at four clinical sites across the United States: the University of North Carolina-Chapel Hill, UW, Washington University in St. Louis and The Children’s Hospital of Philadelphia. Other key collaborators are at the Montreal Neurological Institute, the University of Alberta and New York University.

    Stephen Dager. Marie-Anne Domsalla

    “We have wonderful, dedicated families involved in this study,” said Stephen Dager, a UW professor of radiology and associate director of the CHDD, who led the study at the UW. “They have been willing to travel long distances to our research site and then stay up until late at night so we can collect brain imaging data on their sleeping children. The families also return for follow-up visits so we can measure how their child’s brain grows over time. We could not have made these discoveries without their wholehearted participation.”

    Researchers obtained MRI scans of children while they were sleeping at 6, 12 and 24 months of age. The study also assessed behavior and intellectual ability at each visit, using criteria developed by Estes and her team. They found that the babies who developed autism experienced a hyper-expansion of brain surface area from 6 to 12 months, as compared to babies who had an older sibling with autism but did not themselves show evidence of autism at 24 months of age. Increased surface area growth rate in the first year of life was linked to increased growth rate of brain volume in the second year of life. Brain overgrowth was tied to the emergence of autistic social deficits in the second year.

    MRI technician Mindy Dixon and Stephen Dager review a magnetic resonance spectroscopic image of a child’s brain chemistry. University of Washington

    The researchers input these data — MRI calculations of brain volume, surface area, and cortical thickness at 6 and 12 months of age, as well as sex of the infants — into a computer program, asking it to classify babies most likely to meet ASD criteria at 24 months of age. The program developed the best algorithm to accomplish this, and the researchers applied the algorithm to a separate set of study participants.

    Researchers found that, among infants with an older ASD sibling, the brain differences at 6 and 12 months of age successfully identified 80 percent of those infants who would be clinically diagnosed with autism at 24 months of age.
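    The train-then-validate workflow described here can be illustrated with a toy classifier. Everything below is invented for illustration; the features, numbers and the simple nearest-centroid model merely stand in for the study's actual, more sophisticated machine-learning algorithm:

```python
import random

def fit_nearest_centroid(rows, labels):
    """Average the feature vectors of each class: a minimal classifier."""
    sums, counts = {}, {}
    for x, y in zip(rows, labels):
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(centroids, x):
    sq_dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: sq_dist(centroids[y], x))

# Fabricated features per infant: [surface-area growth from 6 to 12 months,
# brain volume, cortical thickness]; label 1 = later ASD diagnosis.
random.seed(0)
def make_infant(asd):
    growth = random.gauss(1.3 if asd else 1.0, 0.05)  # hyper-expansion if ASD
    return [growth, random.gauss(1.0, 0.05), random.gauss(1.0, 0.05)]

train_X = [make_infant(i % 2) for i in range(100)]
train_y = [i % 2 for i in range(100)]
test_X = [make_infant(i % 2) for i in range(40)]   # separate held-out set
test_y = [i % 2 for i in range(40)]

model = fit_nearest_centroid(train_X, train_y)
accuracy = sum(predict(model, x) == y for x, y in zip(test_X, test_y)) / len(test_y)
```

    The point is the protocol, not the model: the algorithm is fitted on one cohort, and the 80 percent figure comes from applying it, unchanged, to participants it has never seen.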

    If these findings could form the basis for a “pre-symptomatic” diagnosis of ASD, health care professionals could intervene even earlier.

    “By the time ASD is diagnosed at 2 to 4 years, often children have already fallen behind their peers in terms of social skills, communication and language,” said Estes, who directs behavioral evaluations for the network. “Once you’ve missed those developmental milestones, catching up is a struggle for many and nearly impossible for some.”

    Research could then begin to examine interventions on children during a period before the syndrome is present and when the brain is most malleable. Such interventions may have a greater chance of improving outcomes than treatments started after diagnosis.

    “Our hope is that early intervention — before age 2 — can change the clinical course of those children whose brain development has gone awry and help them acquire skills that they would otherwise struggle to achieve,” said Dager.

    The research team has gathered additional behavioral and brain imaging data on these infants and children — such as changes in blood flow in the brain and the movement of water along white matter networks — to understand how brain connectivity and neural activity may differ between high-risk children who do and don’t develop autism. In a separate study published Jan. 6 in Cerebral Cortex, the researchers identified specific brain regions that may be important for acquiring an early social behavior called joint attention, which is orienting attention toward an object after another person points to it.

    “These longitudinal imaging studies, which follow the same infants as they grow older, are really starting to home in on critical brain developmental processes that can distinguish children who go on to develop ASD from those who do not,” said Dager. “We hope these ongoing efforts will lead to additional biomarkers, which could provide the basis for early, pre-symptomatic diagnosis and also serve to guide individualized interventions to keep these kids from falling behind their peers.”

    The research was funded by the National Institutes of Health, Autism Speaks and the Simons Foundation.

    See the full article here.

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

  • richardmitnick 10:17 am on February 16, 2017 Permalink | Reply
    Tags: , Applied Research & Technology, , , , , , SCOAP³,   

    From The Conversation: “How the insights of the Large Hadron Collider are being made open to everyone” 

    The Conversation

    January 12, 2017 [Just appeared in social media.]
    Virginia Barbour

    CERN CMS Higgs Event

    If you visit the Large Hadron Collider (LHC) exhibition, now at the Queensland Museum, you’ll see the recreation of a moment when the scientist who saw the first results indicating discovery of the Higgs boson laments she can’t yet tell anyone.

    It’s a transitory problem for her, lasting as long as it takes for the result to be thoroughly cross-checked. But it illustrates a key concept in science: it’s not enough to do it; it must be communicated.

    That’s what is behind one of the lesser known initiatives of CERN (European Organization for Nuclear Research): an ambitious plan to make all its research in particle physics available to everyone, with a big global collaboration inspired by the way scientists came together to make discoveries at the LHC.

    This initiative is called SCOAP³, the Sponsoring Consortium for Open Access Publishing in Particle Physics, and is now about to enter its fourth year of operation. It’s a worldwide collaboration of more than 3,000 libraries (including six in Australia), key funding agencies and research centres in 44 countries, together with three intergovernmental organisations.

    It aims to make work previously only available to paying subscribers of academic journals freely and immediately available to everyone. In its first three years it has made more than 13,000 articles available.

    Not only are these articles free for anyone to read, but because they are published under a Creative Commons attribution license (CC BY), they are also available for anyone to use in any way they wish, such as to illustrate a talk, pass on to a class of school children, or feed to an artificial intelligence program to extract information from. And these usage rights are enshrined forever.

    The concept of sharing research is not new in physics. Open access to research is now a growing worldwide initiative, including in Australasia. CERN, which runs the LHC, was also where the world wide web was invented in 1989 by Tim Berners-Lee, a British computer scientist.

    The main purpose of the web was to enable researchers contributing to CERN from all over the world to share documents, including scientific drafts, no matter what computer systems they were using.

    Before the web, physicists had been sharing paper drafts by post for decades, so they were one of the first groups to really embrace the new online opportunities for sharing early research. Today, the preprint site arxiv.org hosts more than a million free article drafts covering physics, mathematics, astronomy and more.
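    Those arXiv preprints are retrievable programmatically through arXiv's public Atom-based query API. As a minimal sketch (the search string and result count below are arbitrary examples), a query URL can be built like this:

```python
# Minimal sketch: build a query URL for arXiv's public API,
# which returns preprint metadata as an Atom XML feed.
from urllib.parse import urlencode

ARXIV_API = "http://export.arxiv.org/api/query"

def build_query_url(search_query: str, max_results: int = 5) -> str:
    """Construct an arXiv API URL for a free-text search."""
    params = {
        "search_query": search_query,  # e.g. "all:higgs"
        "start": 0,                    # pagination offset
        "max_results": max_results,    # cap on returned entries
    }
    return f"{ARXIV_API}?{urlencode(params)}"

if __name__ == "__main__":
    url = build_query_url("all:higgs")
    print(url)
    # Actually fetching the feed requires network access, e.g.:
    # from urllib.request import urlopen
    # feed_xml = urlopen(url).read()
```

    The returned Atom feed can then be parsed with any standard XML library to extract titles, authors and abstract links.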

    But, with such a specialised field, do these “open access” papers really matter? The short answer is “yes”: downloads of articles in journals participating in SCOAP³ have doubled.

    With millions of open access articles now being downloaded across all specialities, there is enormous opportunity for new ideas and collaborations to spring from chance readership. This is an important trend: the concept of serendipity enabled by open access was explored in 2015 in an episode of ABC RN’s Future Tense program.

    Greater than the sum of the parts

    There’s also a bigger picture to SCOAP³’s open access model. Not long ago, the research literature was fragmented. Individual papers and the connections between them were only as good as the physical library, with its paper journals, that academics had access to.

    Now we can do searches in much less time than we spend thinking of the search question, and the results we are presented with are crucially dependent on how easily available the findings themselves are. And availability is not just a function of whether an article is free or not but whether it is truly open, i.e. connected and reusable.

    One concept is whether research is “FAIR”, or Findable, Accessible, Interoperable and Reusable. In short, can anyone find, read, use and reuse the work?
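    The four FAIR criteria can be sketched as a simple checklist. The record fields and the checks below are hypothetical illustrations of the idea, not an official FAIR assessment tool:

```python
# Illustrative only: a toy checklist mapping an article record onto the
# FAIR principles (Findable, Accessible, Interoperable, Reusable).
# The fields and checks are hypothetical, not an official FAIR metric.
from dataclasses import dataclass

@dataclass
class ArticleRecord:
    doi: str              # persistent identifier        -> Findable
    open_url: str         # resolvable free location     -> Accessible
    metadata_format: str  # standard format (e.g. Atom)  -> Interoperable
    licence: str          # explicit reuse terms         -> Reusable

def is_fair(rec: ArticleRecord) -> bool:
    """Return True only if the record passes all four toy checks."""
    return all([
        bool(rec.doi),                                        # Findable
        rec.open_url.startswith(("http://", "https://")),     # Accessible
        bool(rec.metadata_format),                            # Interoperable
        rec.licence.upper().startswith("CC"),                 # Reusable
    ])

paper = ArticleRecord("10.1000/example", "https://example.org/paper",
                      "atom+xml", "CC BY")
```

    A record failing any one check, say a paywalled URL or no stated licence, fails the whole test, which mirrors the point that availability alone is not enough.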

    The principle is most advanced for data, but in Australia work is ongoing to apply it to all research outputs. This approach was also proposed at the November 2016 G20 Science, Technology and Innovation Ministers Meeting. Research findings that are not FAIR can, effectively, be invisible. It’s a huge waste of millions of taxpayer dollars to fund research that won’t be seen.

    There is an even bigger picture that research and research publications have to fit into: that of science in society.

    Across the world we see politicians challenging accepted scientific norms. Is the fact that most academic research remains available only to those who can pay to see it contributing to an acceptance of such misinformed views?

    If one role for science is to inform public debate, then restricting access to that science will necessarily hinder any informed public debate. Although no one suggests that most readers of news sites will regularly want to delve into the details of papers in high energy physics, open access papers are 47% more likely to end up being cited in Wikipedia, which is a source that many non-scientists do turn to.

    Even worse, work that is not available openly now may not even be available in perpetuity, something that is being discussed by scientists in the USA.

    So in the same way that CERN itself is an example of the power of international collaboration to ask some of the fundamental scientific questions of our time, SCOAP³ provides a way to ensure that the answers, whatever they are, are available to everyone, forever.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.
