Tagged: Applied Research & Technology

  • richardmitnick 5:50 am on December 3, 2016 Permalink | Reply
    Tags: 7 Among World’s Leading Scientific Minds, Applied Research & Technology

    From UMass Amherst: “Seven UMass Amherst Researchers Named among ‘World’s Leading Scientific Minds,’ Survey Says” 

    University of Massachusetts

    December 1, 2016
    Janet Lathrop


    Once again, seven University of Massachusetts Amherst faculty members are among “the world’s leading scientific minds,” whose publications are among the most influential in their fields, according to a survey by leading multinational media and information firm Thomson Reuters.

    Thomson Reuters compilers who set out to identify “some of the best and brightest scientific minds of our time” recently recognized UMass Amherst food scientists Eric Decker and David Julian McClements, polymer scientist Thomas Russell, soil chemist Baoshan Xing of the Stockbridge School of Agriculture, biostatistician and epidemiologist Susan Hankinson of the School of Public Health and Health Sciences, microbiologist Derek Lovley and astronomer Mauro Giavalisco in its recent Highly Cited Researchers 2016 list.

    Thomson Reuters says, “The 2016 list focuses on contemporary research achievement: only Highly Cited Papers in science and social sciences journals indexed in the Web of Science Core Collection during the 11-year period 2004-14 were surveyed.” These papers are defined as those that rank in the top 1 percent by citations for field and publication year in the Web of Science.

    Michael Malone, vice chancellor for research and engagement for the campus, says, “The results of this citation study demonstrate the terrific impact of the research done by this distinguished group of UMass Amherst faculty and their students.”

    The UMass Amherst researchers are among more than 3,000 researchers in 21 fields who earned this distinction “by writing the greatest numbers of reports officially designated by Essential Science Indicators as highly cited papers, ranking among the top 1 percent most cited for their subject field and year of publication, earning them the mark of exceptional impact,” the compilers explain.

    To focus on “more contemporary research achievement” and “recognize early and mid-career as well as senior researchers,” for this year’s list they survey only articles and reviews in science and social sciences journals indexed in the Web of Science Core Collection during the period 2004-14. Next, as an impact measure they consider only Highly Cited Papers, those ranked in the top 1 percent by citations for field and year, instead of total citations.
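    The selection rule described above — top 1 percent by citations within each field and publication year — amounts to a simple grouping computation. The sketch below uses invented toy data purely for illustration; the real analysis runs over Web of Science records.

```python
from collections import defaultdict

# Toy records: (field, year, citations) -- invented data for illustration.
papers = [("food science", 2010, c) for c in range(1, 201)] + \
         [("polymer science", 2012, c) for c in range(1, 101)]

def highly_cited(papers, top_fraction=0.01):
    """Flag papers in the top `top_fraction` by citations
    within each (field, publication year) group."""
    groups = defaultdict(list)
    for field, year, cites in papers:
        groups[(field, year)].append(cites)
    flagged = []
    for field, year, cites in papers:
        ranked = sorted(groups[(field, year)], reverse=True)
        cutoff_rank = max(1, int(len(ranked) * top_fraction))
        threshold = ranked[cutoff_rank - 1]
        if cites >= threshold:
            flagged.append((field, year, cites))
    return flagged

hits = highly_cited(papers)
# With 200 food-science papers the top 1% is the 2 most-cited;
# with 100 polymer-science papers it is the single most-cited.
```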

    “Relatively younger researchers are more apt to emerge in such an analysis than in one dependent on total citations over many years,” compilers note. Data used in the analysis and selection came from Essential Science Indicators, 2002-12, which included 113,092 Highly Cited Papers.

    The Thomson Reuters group determined how many researchers to include in the list for each field based on the population of each field. The analysis does not include letters to the editor, correction notices and other marginalia.

    The ranking team notes that “there are many highly accomplished and influential researchers who are not recognized by the method described above and whose names do not appear in the new list,” and “the only reasonable approach to interpreting a list of top researchers such as ours is to fully understand the method behind the data and results, and why the method was used.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    U Mass Amherst campus

    UMass Amherst, the Commonwealth’s flagship campus, is a nationally ranked public research university offering a full range of undergraduate, graduate and professional degrees.

    As the flagship campus of America’s education state, the University of Massachusetts Amherst is the leader of the public higher education system of the Commonwealth, making a profound, transformative impact on the common good. Founded in 1863, we are the largest public research university in New England, distinguished by the excellence and breadth of our academic, research and community outreach programs. We rank 29th among the nation’s top public universities, moving up 11 spots in the past two years in the U.S. News & World Report’s annual college guide.

  • richardmitnick 8:52 am on December 2, 2016 Permalink | Reply
    Tags: Applied Research & Technology, Why can’t we predict when a volcano will erupt?

    From COSMOS: “Why can’t we predict when a volcano will erupt?” 


    02 December 2016
    David Pyle

    Volcanoes spew out rocks that could hold the key to understanding them. Shutterstock

    We started 2016 with a bang. Both Chile and Indonesia saw a clutch of volcanoes erupting after lying dormant for a decade or more. This followed an eruption in April 2015, when Calbuco volcano in Chile burst back to life after more than 40 years of silence, with experts giving less than two hours of warning. In an era of global satellite monitoring with proliferating networks of instruments on the ground, why can we still not accurately predict volcanic eruptions?

    Volcano scientists have an unprecedented array of tools with which to keep an eye on the world’s many restless and active volcanoes. In many cases, we can watch emerging events from the safe distance of a volcano observatory. Or, once an eruption has begun, we can observe it in near-real time using satellite feeds and social media. But this isn’t matched by our ability to anticipate what might happen next at a restless but dormant volcano. New research, however, is providing clues about the best way to look for signals of future volcanic behaviour.

    As in medicine, volcanologists can get a clearer sense of the state of a volcano by drawing on observations from many other examples around the world. But if we don’t know the prior history of a particular volcano, and have no way of taking the equivalent of a biopsy from it, our capacity to work out what is going on will always be limited. For example, some volcanoes stay completely quiet and then erupt violently without warning, while others are noisy but fall calm just before they erupt. Without prior knowledge, how would we know which is which?

    Sampling eruptions

    While we can’t yet safely drill into a rumbling volcano, the deposits from past eruptions may contain the information we need about what happened in the build-up to that eruption. Explosive eruptions typically throw out large quantities of ejecta, the frozen and disrupted remnants of the emptied magma reservoir.

    This often includes pumice, a light and frothy rock made of a network of glassy tubes, sheets and strands enclosing void spaces that fill with volcanic gas, mainly steam, just before eruption and are later replaced with air. Other components include crystals of different minerals that grew at depth as the magma cooled and started to solidify, perhaps over decades or centuries.

    Explosive eruptions are thought to be caused by bubbles of gas escaping from the molten rock deep below the ground. When fresh magma first arrives beneath the volcano, it usually contains quantities of dissolved gases, like water and carbon dioxide.

    As the magma cools and freezes into solid rock, the gases remain dissolved in a smaller and smaller amount of melt, until eventually the melt becomes saturated and bubbles of gas start to form. From this point, the pressure inside the volcano begins to build and eventually, the rocks around the magma chamber crack. Then the bubbly magma rises through the crack to the surface, starting an eruption.
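    The saturation process just described can be illustrated with a simple mass balance: as crystals grow, the same dissolved gas is concentrated into a shrinking volume of melt until it exceeds solubility and bubbles nucleate. The concentrations and solubility below are arbitrary illustrative values, not measured magma properties.

```python
def dissolved_concentration(initial_conc, melt_fraction):
    """Gas concentration in the remaining melt, assuming all
    dissolved gas stays in the liquid as crystals grow."""
    return initial_conc / melt_fraction

solubility = 6.0  # wt% gas the melt can hold (illustrative)
c0 = 2.0          # wt% dissolved gas in the fresh magma (illustrative)

# Crystallise the magma step by step and find when bubbles first form.
melt_fraction = 1.0
while dissolved_concentration(c0, melt_fraction) < solubility:
    melt_fraction -= 0.01

# Saturation is reached when melt_fraction <= c0 / solubility (about 1/3),
# i.e. once roughly two-thirds of this toy magma has solidified.
```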

    Could eruptions like Calbuco one day be predicted? Reuters

    Bubbles point the way

    But how can we find out the point at which the magma starts to grow bubbles? This is where forensic volcanology comes in. As magmas freeze, the crystals formed at different times will capture snapshots of the state of the reservoir. With some good fortune, it is sometimes possible to go and find these crystals after an eruption, and piece together the sequence of events.

    In our new research, my colleagues and I have shown how this approach works at Campi Flegrei, a steaming volcanic field that lies west of Naples and is the supposed location of the entrance to the underworld in Roman mythology. By analysing the composition of one particular mineral, apatite, which grew throughout the long cooling history of the magma, we found that the gas bubbles could only have formed shortly – perhaps a few days to months – before the eruption itself. So at this volcano, the best signals of an impending eruption might be a combination of swelling of the ground (reflecting the changing pressure) and changes in the gases escaping from the volcano.

    This still doesn’t provide us with a simple way to predict the eruptions of any volcano. But it does show how taking a forensic look at the deposits of past eruptions at a specific site offers a way to help identify the monitoring signals that will give us clues to future behaviour. And this moves us a step closer to being able to predict when an eruption is likely.


  • richardmitnick 5:31 pm on December 1, 2016 Permalink | Reply
    Tags: Applied Research & Technology, MoS2

    From NC State: “New Findings Boost Promise of Molybdenum Sulfide for Hydrogen Catalysis” 

    North Carolina State University

    December 1, 2016
    Linyou Cao

    Matt Shipman


    Researchers from North Carolina State University, Duke University and Brookhaven National Laboratory have found that molybdenum sulfide (MoS2) holds more promise than previously thought as a catalyst for producing hydrogen to use as a clean energy source. Specifically, the researchers found that the entire surface of MoS2 can be used as a catalyst, not just the edges of the material.

    “The key finding here is that the intrinsic catalytic performance of MoS2 is much better than the research community thought,” says Linyou Cao, an associate professor of materials science and engineering at NC State and senior author of a paper describing the work. “We’re optimistic that this can be a step toward making hydrogen a larger part of our energy portfolio.”

    Hydrogen promises clean energy, producing only water as a byproduct. But to create hydrogen for use as a clean energy source, ideally you’d be able to isolate the hydrogen gas from water – with the only byproduct being oxygen.

    However, the key to creating hydrogen from water – a process called hydrogen evolution – is an efficient catalyst. Currently, the best catalyst is platinum, which is too expensive for widespread use.

    Another candidate for a hydrogen evolution catalyst is MoS2, which is both inexpensive and abundant. But it has long been thought that MoS2 is of limited utility, based on the conventional wisdom that only the edges of MoS2 act as catalysts – leaving the bulk of the material inactive.

    But the new findings from NC State, Duke and Brookhaven show that the surface of MoS2 can be engineered to maximize the catalytic efficiency of the material. And the key to this efficiency is the number of sulfur vacancies in the MoS2.

    If you think of the crystalline structure of MoS2 as a grid of regularly spaced molybdenum and sulfur atoms, a sulfur vacancy is what happens when one of those sulfur atoms is missing.
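    The notion of engineering a target density of sulfur vacancies can be pictured with a toy lattice model. The sketch below simply removes a chosen fraction of sulfur sites at random from a grid; it illustrates the uniform-distribution idea and the 7-10 percent target discussed later in the article, not the actual materials chemistry.

```python
import random

def make_vacancies(n_sites, vacancy_fraction, seed=0):
    """Return the set of sulfur-site indices marked as vacant,
    chosen uniformly at random across the lattice."""
    rng = random.Random(seed)
    n_vacant = round(n_sites * vacancy_fraction)
    return set(rng.sample(range(n_sites), n_vacant))

# A 100 x 100 grid of sulfur sites with ~8% vacancies,
# in the middle of the reported 7-10% sweet spot.
n_sites = 100 * 100
vacancies = make_vacancies(n_sites, 0.08)
fraction = len(vacancies) / n_sites
```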

    “We found that these sulfur vacancies attract the hydrogen atoms in water at just the right strength: the attraction is strong enough to pull the hydrogen out of the water molecule, but is then weak enough to let the hydrogen go,” says Cao.

    The researchers also found that the grain boundaries of MoS2, which have been speculated by the research community to be catalytically active for hydrogen evolution, may only provide trivial activity. Grain boundaries are the boundaries between crystalline domains.

    The findings point to a new direction for improving the catalytic performance of MoS2. Currently, the most common way is to increase the number of edge sites, because of the conventional wisdom that only the edge sites are catalytically active.

    “Our result indicates that grain boundaries should not be the factor to consider when thinking about improving catalytic activity,” Cao says. “The best way to improve the catalytic activities is to engineer sulfur vacancies. The edges of MoS2 are still twice as efficient at removing hydrogen atoms compared to the sulfur vacancies. But it’s difficult to create a high density of edges in MoS2 – a lot of the material’s area is wasted – whereas a large number of sulfur vacancies can be engineered uniformly across the material.”

    The researchers have also found that there is a “sweet spot” for maximizing the catalytic efficiency of MoS2.

    “We get the best results when between 7 and 10 percent of the sulfur sites in MoS2 are vacant,” Cao says. “If you go higher or lower than that range, catalytic efficiency drops off significantly.”

    Additionally, the researchers found that the crystalline quality of MoS2 is important to optimize the catalytic activity of the sulfur vacancies. The sulfur vacancies in high crystalline quality MoS2 showed better efficiency than those in low crystalline quality MoS2, even when the densities of the vacancies are the same.

    “In order to get the best output from sulfur vacancies, the crystalline quality of MoS2 needs to be very high,” says Guoqing Li, a Ph.D. student at NC State and lead author of the paper. “The ideal scenario would be 7 to 10 percent sulfur vacancies uniformly distributed in a single crystalline MoS2 film.”

    The work was done using MoS2 thin films that are only three atoms thick. Using these engineered thin films, the researchers were able to achieve catalytic efficiency comparable to previous MoS2 technologies that relied on having two or three orders of magnitude more surface area.

    “We now know that MoS2 is a more promising catalyst than we anticipated, and are fine-tuning additional techniques to further improve its efficiency,” Cao says. “Hopefully, this moves us closer to making a low-cost catalyst that is at least as good as platinum.”

    The paper, “All the Catalytic Active Sites of MoS2 for Hydrogen Evolution,” is published in the Journal of the American Chemical Society. The paper was co-authored by Yifei Yu, David Peterson, Abdullah Zafar, Raj Kumar, Frank Hunte and Steve Shannon of NC State; Du Zhang, Stefano Curtarolo and Weitao Yang of Duke; and Qiao Qiao and Yimei Zhu of Brookhaven National Lab.

    The work was done with support from the Department of Energy’s Office of Science, under grants DE-SC0012575 and DE-SC0012704, as well as by the National Science Foundation under grant PHY1338917.


    NC State campus

    NC State was founded with a purpose: to create economic, societal and intellectual prosperity for the people of North Carolina and the country. We began as a land-grant institution teaching the agricultural and mechanical arts. Today, we’re a pre-eminent research enterprise that excels in science, technology, engineering, math, design, the humanities and social sciences, textiles and veterinary medicine.

    NC State students, faculty and staff take problems in hand and work with industry, government and nonprofit partners to solve them. Our 34,000-plus high-performing students apply what they learn in the real world by conducting research, working in internships and co-ops, and performing acts of world-changing service. That experiential education ensures they leave here ready to lead the workforce, confident in the knowledge that NC State consistently rates as one of the best values in higher education.

  • richardmitnick 4:23 pm on December 1, 2016 Permalink | Reply
    Tags: Applied Research & Technology, Dealing with chemical weapons

    From Rutgers: “Rutgers Receives $19 Million to Develop Drugs to Treat Chemical Weapons Attacks” 

    Rutgers University

    December 1, 2016
    Robin Lally

    With an additional $19 million, the National Institutes of Health has continued funding Rutgers research to develop drugs to treat the toxicity of chemical weapons in case of a terrorist attack.

    The National Institutes of Health (NIH) has awarded Rutgers University a five-year grant for more than $19 million for research that would lead to the development of drugs to treat toxicity from chemical agents used in a terrorist attack.

    The grant – which first received funding in 2006 and again in 2011 – provides scientists at Rutgers, New York Medical College and Lehigh University the funds they need to continue a decades-long collaboration, aimed at devising drug therapies to use if deadly chemical poisons were released into the general population. Over the course of this project, NIH has provided more than $60 million to these investigators for this research.

    “Our preparedness in case of an attack in the United States and how you treat it is still of the utmost importance,” said Jeffrey Laskin, director of the Rutgers University CounterACT Research Center of Excellence, a federal program pursuing medical countermeasures. “Another important issue is for our military, the warfighters who may be exposed to chemicals on the battlefield.”

    The U.S. government wants researchers to develop drug products that would work as an antidote for individuals exposed to mustard gas, a chemical weapon banned under the 1925 Geneva Protocol. First used by the German military against Allied troops in World War I, and in subsequent conflicts including the Iran-Iraq war of the 1980s, mustard gas causes symptoms ranging from skin irritation and conjunctivitis to severe ulceration, blistering of the skin, blindness and irreversible damage to the respiratory tract.

    More recently, the Islamic State has used chemical weapons, including sulfur mustard agents, at least 52 times since 2014 on the battlefield in Syria and Iraq, according to a London-based intelligence collection and analysis service. News reports have indicated that ISIS militants have also loaded the gas into artillery shells and fired on people living in small villages miles away.

    “Many people don’t think of mustard gas anymore,” said Laskin, professor of environmental and occupational health at Rutgers University School of Public Health and the Rutgers Environmental and Occupational Health Sciences Institute (EOHSI). “But more than 100 years after it was used in World War I, it is still being used in Syria. It remains a great concern to both public health officials and the military.”

    Laskin said the Rutgers CounterACT team has met with the U.S. Food and Drug Administration to discuss its research and promising new drugs, and how long it will take to get drug approvals once the drug products get into the pipeline.

    The principal investigators besides Jeffrey Laskin include co-director, Donald Gerecke at Rutgers Ernest Mario School of Pharmacy; Marion Gordon, Debra Laskin and Patrick Sinko, also at the Rutgers School of Pharmacy; Diane Heck at New York Medical College; and Ned Heindel at Lehigh University. They work closely with MRIGlobal in Kansas City, where mustard gas experiments are being carried out.



    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

  • richardmitnick 11:32 am on November 30, 2016 Permalink | Reply
    Tags: Applied Research & Technology

    From popsci.com: “IBM Creates A Molecule That Could Destroy All Viruses” 


    Popular Science

    May 13, 2016 [Just found this in social media.]
    Claire Maldarelli

    Finding a cure for viruses like Ebola, Zika, or even the flu is a challenging task. Viruses are vastly different from one another, and even the same strain of a virus can mutate and change–that’s why doctors give out a different flu vaccine each year. But a group of researchers at IBM and the Institute of Bioengineering and Nanotechnology in Singapore sought to understand what makes all viruses alike. Using that knowledge, they’ve come up with a macromolecule that may have the potential to treat multiple types of viruses and prevent them from infecting us. The work was published recently in the journal Macromolecules.

    For their study, the researchers set aside the viruses’ RNA and DNA. Although these could be key areas to target, they change from virus to virus and also mutate, making them very difficult to target successfully.

    Instead, the researchers focused on glycoproteins, which sit on the outside of all viruses and attach to cells in the body, allowing the viruses to do their dirty work by infecting cells and making us sick. Using that knowledge, the researchers created a macromolecule, which is basically one giant molecule made of smaller subunits. This macromolecule has key factors that are crucial in fighting viruses. First, it’s able to attract viruses towards itself using electrostatic charges. Once the virus is close, the macromolecule attaches to the virus and makes the virus unable to attach to healthy cells. Then it neutralizes the virus’ acidity levels, which makes it less able to replicate.

    As an alternative way to fight, the macromolecule also contains a sugar called mannose. This sugar attaches to healthy immune cells and forces them closer to the virus so that the viral infection can be eradicated more easily.

    The researchers tested out this treatment in the lab on a few viruses, including Ebola and dengue, and they found that the molecule did work as they thought it would: According to the paper, the molecules bound to the glycoproteins on the viruses’ surfaces and reduced the number of viruses. Further, the mannose successfully prevented the virus from infecting immune cells.

    This all sounds promising, but the treatment still has a ways to go before it could be used as a disinfectant or even as a potential pill that we could take to prevent and treat viral infections. But it does represent a step in the right direction for treating viruses: figuring out what is similar about all viruses to create a broad spectrum antiviral treatment.




  • richardmitnick 10:56 am on November 30, 2016 Permalink | Reply
    Tags: Applied Research & Technology

    From Vox: “The Map of Physics” Video 

    Access the mp4 video here.
    Watch, enjoy, learn.


  • richardmitnick 10:42 am on November 30, 2016 Permalink | Reply
    Tags: Applied Research & Technology

    From The Conversation: “Fusion energy: A time of transition and potential” 

    The Conversation

    November 29, 2016
    Stewart Prager
    Professor of Astrophysical Science, former director of the Princeton Plasma Physics Laboratory, Princeton University

    Michael C. Zarnstorff
    Deputy Director for Research, Princeton Plasma Physics Laboratory, Princeton University

    Fusion energy. murrayashmole

    For centuries, humans have dreamed of harnessing the power of the sun to energize our lives here on Earth. But we want to go beyond collecting solar energy, and one day generate our own from a mini-sun. If we’re able to solve an extremely complex set of scientific and engineering problems, fusion energy promises a green, safe, unlimited source of energy. From just one kilogram of deuterium extracted from water per day could come enough electricity to power hundreds of thousands of homes.

    Since the 1950s, scientific and engineering research has generated enormous progress toward forcing hydrogen atoms to fuse together in a self-sustaining reaction – as well as a small but demonstrable amount of fusion energy. Skeptics and proponents alike note the two most important remaining challenges: maintaining the reactions over long periods of time and devising a material structure to harness the fusion power for electricity.

    As fusion researchers at the Princeton Plasma Physics Lab, we know that realistically, the first commercial fusion power plant is still at least 25 years away.


    But the potential for its outsize benefits to arrive in the second half of this century means we must keep working. Major demonstrations of fusion’s feasibility can be accomplished earlier – and must, so that fusion power can be incorporated into planning for our energy future.

    Unlike other forms of electrical generation, such as solar, natural gas and nuclear fission, fusion cannot be developed in miniature and then be simply scaled up. The experimental steps are large and take time to build. But the problem of abundant, clean energy will be a major calling for humankind for the next century and beyond. It would be foolhardy not to exploit fully this most promising of energy sources.

    Why fusion power?

    Adding heat to two isotopes of hydrogen can result in fusion. American Security Project, CC BY-ND

    In fusion, two nuclei of the hydrogen atom (deuterium and tritium isotopes) fuse together. This is relatively difficult to do: Both nuclei are positively charged, and therefore repel each other. Only if they are moving extremely fast when they collide will they smash together, fuse and thereby release the energy we’re after.

    This happens naturally in the sun. Here on Earth, we use powerful magnets to contain an extremely hot gas of electrically charged deuterium and tritium nuclei and electrons. This hot, charged gas is called a plasma.

    The plasma is so hot – more than 100 million degrees Celsius – that the positively charged nuclei move fast enough to overcome their electrical repulsion and fuse. When the nuclei fuse, they form two energetic particles – an alpha particle (the nucleus of the helium atom) and a neutron.

    Heating the plasma to such a high temperature takes a large amount of energy – which must be put into the reactor before fusion can begin. But once it gets going, fusion has the potential to generate enough energy to maintain its own heat, allowing us to draw off excess heat to turn into usable electricity.
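    The opening claim — one kilogram of deuterium per day powering hundreds of thousands of homes — can be checked with back-of-the-envelope numbers. The D-T yield of about 17.6 MeV per reaction, the 40 percent conversion efficiency, and the per-home load below are standard textbook figures assumed for illustration; they are not quoted in the article.

```python
AVOGADRO = 6.022e23          # atoms per mole
MEV_TO_J = 1.602e-13         # joules per MeV
D_MOLAR_MASS = 2.014         # g/mol for deuterium
MEV_PER_REACTION = 17.6      # MeV released per D-T fusion (textbook value)

# One D nucleus is consumed per reaction, so 1 kg of deuterium supplies:
reactions = 1000.0 / D_MOLAR_MASS * AVOGADRO        # ~3.0e26 reactions
energy_j = reactions * MEV_PER_REACTION * MEV_TO_J  # ~8.4e14 J thermal

seconds_per_day = 86400
thermal_watts = energy_j / seconds_per_day          # ~9.8 GW thermal
electric_watts = thermal_watts * 0.4                # assume ~40% conversion

homes = electric_watts / 1200                       # assume ~1.2 kW per home
# On these rough numbers a kilogram of deuterium per day could supply
# millions of homes, so "hundreds of thousands" is a conservative claim.
```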

    Fuel for fusion power is abundant in nature. Deuterium is plentiful in water, and the reactor itself can make tritium from lithium. And it is available to all nations, mostly independent of local natural resources.

    Fusion power is clean. It emits no greenhouse gases, and produces only helium and a neutron.

    It is safe. There is no possibility for a runaway reaction, like a nuclear-fission “meltdown.” Rather, if there is any malfunction, the plasma cools, and the fusion reactions cease.

    All these attributes have motivated research for decades, and have become even more attractive over time. But the positives are matched by the significant scientific challenge of fusion.

    Progress to date

    The progress in fusion can be measured in two ways. The first is the tremendous advance in basic understanding of high-temperature plasmas. Scientists had to develop a new field of physics – plasma physics – to conceive of methods to confine the plasma in strong magnetic fields, and then evolve the abilities to heat, stabilize, control turbulence in and measure the properties of the superhot plasma.

    Related technology has also progressed enormously. We have pushed the frontiers in magnets, and electromagnetic wave sources and particle beams to contain and heat the plasma. We have also developed techniques so that materials can withstand the intense heat of the plasma in current experiments.

    It is easy to convey the practical metrics that track fusion’s march to commercialization. Chief among them is the fusion power that has been generated in the laboratory: Fusion power generation escalated from milliwatts for microseconds in the 1970s to 10 megawatts of fusion power (at the Princeton Plasma Physics Laboratory) and 16 megawatts for one second (at the Joint European Torus in England) in the 1990s.


    A new chapter in research

    Under construction: the ITER research tokamak in France. ITER

    ITER Tokamak

    Now the international scientific community is working in unity to construct a massive fusion research facility in France. Called ITER (Latin for “the way”), this plant will generate about 500 megawatts of thermal fusion power for about eight minutes at a time. If this power were converted to electricity, it could power about 150,000 homes. As an experiment, it will allow us to test key science and engineering issues in preparation for fusion power plants that will function continuously.
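    A quick check shows the figures in this paragraph are internally consistent. The one-third thermal-to-electric conversion efficiency below is a typical steam-cycle value assumed for illustration; the article itself states only the 500 megawatt output, the 50 megawatt heating input, and the 150,000-home comparison.

```python
thermal_power_w = 500e6          # ITER design fusion power, from the article
conversion_efficiency = 1 / 3    # assumed typical steam-cycle efficiency
electric_power_w = thermal_power_w * conversion_efficiency  # ~167 MW

homes = 150_000                  # figure quoted in the article
watts_per_home = electric_power_w / homes  # ~1.1 kW per home: plausible

# The "burning plasma" gain quoted below: 500 MW out for 50 MW of heating in.
q_gain = 500 / 50  # Q = 10
```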

    ITER employs the design known as the “tokamak,” originally a Russian acronym. It involves a doughnut-shaped plasma, confined in a very strong magnetic field, which is partly created by electrical current that flows in the plasma itself.

    Though it is designed as a research project, and not intended to be a net producer of electric energy, ITER will produce 10 times more fusion energy than the 50 megawatts needed to heat the plasma. This is a huge scientific step, creating the first “burning plasma,” in which most of the energy used to heat the plasma comes from the fusion reaction itself.

    ITER is supported by governments representing half the world’s population: China, the European Union, India, Japan, Russia, South Korea and the U.S. It is a strong international statement about the need for, and promise of, fusion energy.

    The road forward

    From here, the remaining path toward fusion power has two components. First, we must continue research on the tokamak. This means advancing physics and engineering so that we can sustain the plasma in a steady state for months at a time. We will need to develop materials that can withstand an amount of heat equal to one-fifth the heat flux on the surface of the sun for long periods. And we must develop materials that will blanket the reactor core to absorb the neutrons and breed tritium.

    The second component on the path to fusion is to develop ideas that enhance fusion’s attractiveness. Four such ideas are:

    The W7-X stellarator configuration. Max Planck Institute for Plasma Physics, CC BY

    Wendelstein 7-X stellarator

    1) Using computers to optimize fusion reactor designs within the constraints of physics and engineering. Beyond what humans can calculate, these optimized designs produce twisted doughnut shapes that are highly stable and can operate automatically for months on end. They are called “stellarators” in the fusion business.

    2) Developing new high-temperature superconducting magnets that can be stronger and smaller than today’s best. That will allow us to build smaller, and likely cheaper, fusion reactors.

    3) Using liquid metal, rather than a solid, as the material surrounding the plasma. Liquid metals do not break, offering a possible solution to the immense challenge of how a surrounding material might behave when it contacts the plasma.

    4) Building systems that contain doughnut-shaped plasmas with no hole in the center, forming a plasma shaped almost like a sphere. Some of these approaches could also function with a weaker magnetic field. These “compact tori” and “low-field” approaches also offer the possibility of reduced size and cost.

    Government-sponsored research programs around the world are at work on the elements of both components – and will result in findings that benefit all approaches to fusion energy (as well as our understanding of plasmas in the cosmos and industry). In the past 10 to 15 years, privately funded companies have also joined the effort, particularly in search of compact tori and low-field breakthroughs. Progress is coming and it will bring abundant, clean, safe energy with it.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

  • richardmitnick 9:18 am on November 30, 2016 Permalink | Reply
    Tags: Applied Research & Technology, , , The Great Barrier Reef has suffered the worst die-off ever recorded   

    From Science Alert: “The Great Barrier Reef has suffered the worst die-off ever recorded” 


    Science Alert

    28 NOV 2016
    Terry Hughes
    Distinguished Professor, James Cook University

    Britta Schaffelke
    Research Program Leader – A Healthy and Sustainable Great Barrier Reef, Australian Institute of Marine Science

    James Kerry
    Senior Research Officer, ARC Centre of Excellence for Coral Reef Studies, James Cook University

    This article was co-authored by David Wachenfeld, Director for Reef Recovery at the Great Barrier Reef Marine Park Authority.

    Two-thirds of the corals in the northern part of the Great Barrier Reef have died in the reef’s worst-ever bleaching event, according to our latest underwater surveys.

    On some reefs in the north, nearly all the corals have died. However, the impact of bleaching eases as we move south: reefs in the central and southern regions (around Cairns and Townsville and southwards) were much less affected, and are now recovering.

    In 2015 and 2016, the hottest years on record, we have witnessed at first hand the threat posed by human-caused climate change to the world’s coral reefs.

    Heat stress from record high summer temperatures damages the microscopic algae (zooxanthellae) that live in the tissues of corals, turning them white.

    After they bleach, these stressed corals either slowly regain their zooxanthellae and colour as temperatures cool off, or else they die.

    The Great Barrier Reef bleached severely for the first time in 1998, then in 2002, and now again in 2016. This year’s event was more extreme than the two previous mass bleachings.

    Surveying the damage
    A scientist assesses the damage in a bleached coral reef (Pic: XL Catlin Seaview Survey)

    We undertook extensive underwater surveys at the peak of bleaching in March and April, and again at the same sites in October and November. In the northern third of the Great Barrier Reef, we recorded an average (median) loss of 67 percent of coral cover on a large sample of 60 reefs.

    The dieback of corals due to bleaching in just eight to nine months is the largest loss ever recorded for the Great Barrier Reef.

    To put these losses in context, over the 27 years from 1985 to 2012, scientists from the Australian Institute of Marine Science measured the gradual loss of 51 percent of corals on the central and southern regions of the Great Barrier Reef.

    They reported no change over this extended period in the amount of corals in the remote, northern region. Unfortunately, most of the losses in 2016 have occurred in this northern, most pristine part of the Great Barrier Reef.
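    To make the contrast between these two figures concrete, here is a rough annualization of the loss rates quoted above. The simple linear framing is my own, not the researchers' method:

```python
# Rough, illustrative comparison of the two coral-loss rates above.
# Simple annualization is my own framing, not the researchers' analysis.

gradual_loss = 0.51      # fraction of coral lost, 1985-2012 (central/south)
gradual_years = 27

bleaching_loss = 0.67    # median fraction lost in the 2016 event (north)
bleaching_months = 8.5   # "eight to nine months"

gradual_rate = gradual_loss / gradual_years              # per year
bleaching_rate = bleaching_loss / (bleaching_months / 12)

print(round(gradual_rate, 3))    # ~0.019 -> roughly 2% of cover per year
print(round(bleaching_rate, 2))  # ~0.95 -> roughly 95% of cover per year
print(round(bleaching_rate / gradual_rate))  # ~50x faster
```

    On this crude reckoning the 2016 die-off proceeded around fifty times faster than the gradual decline measured over the previous 27 years, which is why the event stands out so starkly in the record.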

    ARC Centre of Excellence for Coral Reef Studies

    Bright spots

    Healthy coral in the southern Great Barrier Reef in November 2016. Tory Chase, ARC Centre of Excellence for Coral Reef Studies., Author provided

    The bleaching, and subsequent loss of corals, is very patchy. Our map shows clearly that coral death varies enormously from north to south along the 2,300km length of the Reef.

    The southern third of the Reef did not experience severe heat stress in February and March. Consequently, only minor bleaching occurred, and we found no significant mortality in the south since then.

    In the central section of the Reef, we measured widespread but moderate bleaching, comparable in severity to the 1998 and 2002 events. On average, only 6 percent of coral cover was lost in the central region in 2016.

    The remaining corals have now regained their vibrant colour. Many central reefs are in good condition, and they continue to recover from Severe Tropical Cyclones Hamish (in 2009) and Yasi (2011).

    In the eastern Torres Strait and outermost ribbon reefs in the northernmost part of the Great Barrier Reef Marine Park, we found a large swathe of reefs that escaped the most severe bleaching and mortality, compared to elsewhere in the north. Nonetheless, 26 percent of the shallow-water corals died.

    We suspect that these reefs were partially protected from heat stress by strong currents and upwelling of cooler water across the edge of the continental shelf that slopes steeply into the Coral Sea.

    For visitors, these surveys show there are still many reefs throughout the Marine Park that have abundant living coral, particularly in popular tourism locations in the central and southern regions, such as the Whitsundays and Cairns.


    Dead table corals killed by bleaching in the north, November 2016. Greg Torda, ARC Centre of Excellence for Coral Reef Studies., Author provided

    The northern third of the Great Barrier Reef, extending 700 km from Port Douglas to Papua New Guinea, experienced the most severe bleaching and subsequent loss of corals.

    On 25 percent of the worst affected reefs (the top quartile), losses of corals ranged from 83-99 percent. When mortality is this high, it affects even tougher species that normally survive bleaching.

    However, even in this region, there are some silver linings. Bleaching and mortality decline with depth, and some sites and reefs had much better than average survival. A few corals are still bleached or mottled, particularly in the north, but the vast majority of survivors have regained their colour.

    What will happen next?

    The reef science and management community will continue to gather data on the bleaching event as it slowly unfolds. The initial stage focused on mapping the footprint of the event, and now we are analysing how many bleached corals died or recovered over the past 8-9 months.

    Over the coming months and for the next year or two we expect to see longer-term impacts on northern corals, including higher levels of disease, slower growth rates and lower rates of reproduction.

    The process of recovery in the north – the replacement of dead corals by new ones – will be slow, at least 10-15 years, as long as local conditions such as water quality remain conducive to recovery.

    As global temperatures continue to climb, time will tell how much recovery in the north is possible before a fourth mass bleaching event occurs.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 8:55 am on November 30, 2016 Permalink | Reply
    Tags: Applied Research & Technology, , , Ring of Fire, Scientists have found the largest exposed fault on Earth   

    From Science Alert: “Scientists have found the largest exposed fault on Earth” 


    Science Alert

    29 NOV 2016

    The Banda Islands in the Banda Sea. Credit: Jialiang Gao/Wikimedia

    For the first time, researchers have confirmed the existence of the largest exposed fault on Earth, and it could explain how a 7.2-km-deep (4.5-mile) abyss formed in the Pacific Ocean.

    Discovered beneath the Banda Sea in eastern Indonesia, the massive fault plane runs right through the notorious Ring of Fire – an explosive region where roughly 90 percent of the world’s earthquakes and 75 percent of all active volcanoes occur.

    Map of the Pacific Ring of Fire. Image: Gringer/Wikimedia

    For almost a century, scientists have known about the Weber Deep – a massive chasm lurking near the Maluku Islands of Indonesia that forms the deepest point of Earth’s oceans not within a trench.

    But until now, no one could figure out how it formed.

    To investigate, geologists from the Australian National University (ANU) in Canberra and Royal Holloway University of London analysed maps of the sea floor taken from the Banda Sea region in the Pacific Ocean.

    They discovered that rocks sitting at the bottom of the sea were cut by hundreds of straight parallel scars.

    Simulations of the sea floor suggested that a piece of crust bigger than Belgium was at some point ripped apart by a massive crack – or fault – in the oceanic plates to form a deep depression in the ocean floor.

    The activity appeared to have left behind the biggest exposed fault plane ever detected on Earth, which the researchers have tentatively called the Banda Detachment.

    When a fault forms in Earth’s crust, it forms two main features: a fault plane, which is the flat surface of a fault; and the fault line, which is the intersection of a fault plane with the ground surface.

    The team’s simulations showed that the Banda Detachment fault plane was exposed over an area of 60,000 square kilometres (23,166 square miles) when the sea floor cracked.
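    The article's unit conversion can be verified directly; the kilometre-to-mile factor used below is the standard 0.621371:

```python
# Verify the quoted area conversion for the Banda Detachment fault plane.
KM_TO_MI = 0.621371           # standard kilometres-to-miles factor

area_km2 = 60_000             # exposed fault-plane area in square km
area_mi2 = area_km2 * KM_TO_MI ** 2  # square the factor for areas

print(round(area_mi2))  # 23166 -> matches the figure in the text
```

    Note that converting areas requires squaring the linear factor, a step that is easy to miss when checking figures like this by hand.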

    “We had made a good argument for the existence of this fault we named the Banda Detachment, based on the bathymetry [underwater topography] data and on knowledge of the regional geology,” said one of the researchers, Gordon Lister from ANU.

    Diagram showing the Banda Detachment fault beneath the Weber Deep basin. Credit: ANU

    But as far as the researchers were concerned, this massive fault didn’t exist until they saw evidence of it with their own eyes.

    When they sailed out in the Pacific Ocean in eastern Indonesia, they identified prominent landforms in the water that were formed by the Banda Detachment fault plane.

    “I was stunned to see the hypothesised fault plane, this time not on a computer screen, but poking above the waves,” says one of the team, Jonathan Pownall from ANU. “The discovery will help explain how one of Earth’s deepest sea areas became so deep.”

    The team says the fact that the Weber Deep abyss formed right where the Banda Detachment was exposed could help researchers figure out how it formed.

    “Our research found that a 7 km-deep abyss beneath the Banda Sea off eastern Indonesia was formed by extension along what might be Earth’s largest-identified exposed fault plane,” says Pownall.

    The discovery could also help geologists predict the movements of one of the most tectonically active regions in the world – the Pacific Ring of Fire, a 40,000-km (25,000-mile) stretch of ocean dotted with no fewer than 452 volcanoes, around 75 percent of the world’s total.

    “In a region of extreme tsunami risk, knowledge of major faults such as the Banda Detachment, which could make big earthquakes when they slip, is fundamental to being able to properly assess tectonic hazards,” says Pownall.

    The research has been published in Geology.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 11:59 am on November 29, 2016 Permalink | Reply
    Tags: Applied Research & Technology, , EU and Brexit   

    From NatureIndex: “Brexit uncertainty disrupting EU-UK research” 



    29 November 2016
    Mark Peplow

    Thousands of people took to the streets for a series of protests against the referendum vote for Britain to leave the European Union. Kate Green/Anadolu/Getty

    Uncertainty surrounding Britain’s future in EU research could be as damaging to science as the prospect of funding cuts once it leaves the union.

    Helga Nowotny joined a vociferous chorus when she denounced the United Kingdom’s momentous decision to leave the European Union: “It’s a real loss for everyone — for the UK, and for European science,” says the former president of the European Research Council (ERC).

    Five months after the Brexit referendum, the country is still unsure exactly what it has voted for. The terms of the UK’s departure are far from set in stone, but it is already clear that the outcome of that process will have huge significance for science, both in the UK and across Europe.

    A wide range of factors underlies the strength of UK-EU partnerships. But for many researchers, they boil down to money, free movement, and broad collaboration: all of which would be weakened if the UK isolates itself from the EU funding schemes, agencies and rules that have fostered these partnerships.

    “Without a big, bold plan to guarantee UK science’s attractiveness, leadership and support of foreign researchers, the UK will haemorrhage talent and collaborations,” claims Mike Galsworthy, the programme director of Scientists for EU, a group that campaigned for the UK to remain part of the EU. UK universities receive about 16% of their research funding from the EU.

    Much comes directly from the vast €75 billion Horizon 2020 programme, which funds multinational collaborations to tackle big scientific questions and boost innovation. About €13 billion of that pot is funnelled through large ERC grants, awarded for the highest-rated research. UK researchers have been particularly successful in winning these funds: from 2007 to 2013, they won €1.67 billion in ERC grants, some 22.4% of the total available and more than any other country.
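    The ERC figures above imply a total grant pool that is easy to back out. This is just a check of the article's arithmetic, not data taken from the ERC itself:

```python
# UK winnings and share of ERC grants, 2007-2013, as quoted above.
uk_erc_grants_eur = 1.67e9   # €1.67 billion won by UK researchers
uk_share = 0.224             # 22.4% of the total available

# Implied total ERC pool over that period.
implied_total_eur = uk_erc_grants_eur / uk_share
print(round(implied_total_eur / 1e9, 2))  # ~7.46 (billion euros)
```

    The implied pool of roughly €7.5 billion refers to the 2007-2013 period, which predates the €13 billion Horizon 2020 ERC figure mentioned earlier, so the two numbers are consistent rather than contradictory.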

    Show me the money

    Immediately after the referendum, research collaborations across the EU were thrown into doubt. Fearing that UK researchers would no longer be eligible for these funding streams, some EU partners started to get nervous about having them on board. Scientists for EU have logged 40 examples where Horizon 2020 projects have been disrupted, for example because UK researchers have stepped down from coordination roles.

    “I initially received condolences,” says Aldo Sorniotti of the University of Surrey, who is involved in several EU consortia developing control systems for electric vehicles. His collaborators said that they were reluctant to invite him to join future consortia, and that it was no longer appropriate for him to lead any of the consortia. As his group receives most of its funding from the EU, this was a serious blow.

    It soon became clear, however, that UK researchers would not immediately be denied EU funding. The European Commission encouraged UK researchers to keep participating in Horizon 2020, warned collaborations not to drop their UK partners, and emphasized that there would be no discrimination against UK researchers in funding decisions. “For the time being, nothing changes,” says Lucía Caudet, spokesperson for research, science and innovation at the commission. The commission is also monitoring grant awards, and has not seen any anomalies suggesting that UK researchers are suffering. “After the initial shock, the situation is getting back to normality,” says Sorniotti.

    Some concerns may be assuaged by a UK government commitment in August that it will ‘underwrite’ any Horizon 2020 research funds applied for successfully before the country leaves the EU, offering a guarantee that British researchers will not lose that money.

    But the government has not pledged to replace lost funding once the UK has left the EU. “It doesn’t reassure me that we’ll have access to EU funding after Brexit,” says Paul Crowther, head of the physics and astronomy department at the University of Sheffield. “In the long term, after 2020, it will be very difficult to replace the funding I get from the European Commission,” agrees Sorniotti.

    Whether or not UK researchers will have access to those funds after the UK leaves is one of the many unknowns that must be resolved in the coming negotiations. And any loss of EU funds could expose a weakness in the UK’s domestic science support, says Crowther: “UK funding has eroded against inflation over the past decade, and it’s the flagship EU grants that have helped to mitigate against that.”

    In May, Digital Science (a London-based consultancy, whose owner also has a stake in Nature’s publisher) estimated that Brexit could create a £1 billion funding gap for UK researchers. The UK’s leading universities could be amongst the hardest hit. Nature Index data show that the universities of Cambridge and Oxford, for example, were the UK’s most enthusiastic collaborators with other EU countries, making up 48 of the 100 most productive partnerships between two institutions (one from the UK, one from the EU) publishing papers together in top journals. According to Digital Science, Cambridge receives 20% of its academic research funding from the EU, and Oxford 23%.

    Dream team

    Money is only part of the story. The UK has benefitted enormously from its ability to attract top talent from other EU countries, according to Nowotny. “If you lose that, you’d be nowhere,” she says.

    The ability to work anywhere within the EU has also made it easy for project collaborators to spend time in the UK, or vice versa. “Freedom of movement has proven extraordinarily fruitful for science,” says one post-doctoral researcher in theoretical physics, who spoke to Nature Index on condition of anonymity.

    Brexit may limit that mobility in the future. “I’m at the stage in my career where that uncertainty is extremely damaging,” the postdoc says. Shortly before the referendum, he had been offered jobs in the UK and in North America. “I was on a knife’s edge,” he says. “But when the referendum result came out and it became clear to me that science would suffer, it was the straw that broke the camel’s back”. He started his new job across the Atlantic in September.

    Crowther is concerned that the UK will lose access to one of the funding mechanisms that has boosted mobility, known as the Marie Skłodowska-Curie actions. The programme provides grants for early-career researchers to spend a year in collaborators’ labs, or to support meetings of large groups. “They’re very good for training researchers, and giving them exposure to different groups and expertise,” he says. Yet researchers in his department have already been cut from one of these schemes because there was concern that “if the walls went up”, students would not easily be able to travel to the UK. “The uncertainty is already affecting UK participation in these projects,” he says. “It’s a shame.” The European Commission declined to comment on this specific case.

    Large multinational collaborations involving the UK and other EU members have also led to significant advances in some fields, says Galsworthy, because they have allowed EU researchers to tackle complex multidisciplinary questions, sharing resources, ideas and expertise that a single country could never tackle on its own. “You can put together a dream team,” he says.

    For example, the EU has overseen a series of ground-breaking projects to understand the health impact of low doses of radiation, and put €1 billion into the Graphene Flagship, a 10-year programme to develop applications for atom-thin materials. Common EU rules on data standards, clinical trials and animal testing help to enable these sorts of projects, he adds.

    Indeed, the UK’s strength in science has made it a popular partner for collaborations, creating a virtuous feedback loop that has helped to boost the UK’s share of grants from the EU science budget. This in turn gave the UK a big say in setting the agenda for the EU’s massive science programme. Now, it is set to lose that influence.

    Discussions on the successor to Horizon 2020 will begin next year, and the UK will probably not be part of those, says Nowotny. But the European Commission says that depends on whether the UK has formally declared its intention to leave the EU by invoking Article 50 of the Lisbon Treaty. Until that happens, the country will participate as a full member of the EU. Above all, it is this kind of uncertainty over the UK’s future relationship with the EU that is likely to stifle research in the coming years. “We are going to be in limbo for some time,” says Crowther.

    Some researchers hope that the UK could eventually strike a deal that allows it to pay a contribution in return for its researchers having access to programmes like Horizon 2020. Norway, not a member of the EU, has this kind of arrangement, but it must also accept freedom of movement for EU citizens to the country, something that committed Brexit supporters in the UK are dead-set against. “We’re in for a few years of mess,” says Galsworthy. “We were in a very privileged position before, and people are just waking up to that”.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
