Tagged: Energy Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 3:53 am on August 24, 2018 Permalink | Reply
    Tags: Energy

    From CSIROscope: “How hydrogen power can help us cut emissions, boost exports, and even drive further between refills” 


    24 August 2018
    Sam Bruce

    Could this be the way to fill up in future?

    Hydrogen could become a significant part of Australia’s energy landscape within the coming decade, competing with both natural gas and batteries, according to our new roadmap for the industry.

    Hydrogen gas is a versatile energy carrier with a wide range of potential uses. However, hydrogen is not freely available in the atmosphere as a gas. It therefore requires an energy input and a series of technologies to produce, store and then use it.

    Why would we bother? Because hydrogen has several advantages over other energy carriers, such as batteries. It is a single product that can service multiple markets and, if produced using low- or zero-emissions energy sources, it can help us significantly cut greenhouse emissions.

    Potential uses for hydrogen. No image credit.

    Compared with batteries, hydrogen can release more energy per unit of mass. This means that in contrast to electric battery-powered cars, it can allow passenger vehicles to cover longer distances without refuelling. Refuelling is quicker too and is likely to stay that way.
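    To put the mass advantage in rough numbers, here is a minimal sketch using generic textbook figures (hydrogen's lower heating value of about 120 MJ/kg and a lithium-ion pack of about 0.7 MJ/kg; neither number comes from the roadmap):

```python
# Illustrative comparison of energy per unit mass (specific energy).
# All figures are typical textbook values, not data from the CSIRO roadmap.
H2_MJ_PER_KG = 120.0      # hydrogen lower heating value
LI_ION_MJ_PER_KG = 0.7    # lithium-ion pack, ~200 Wh/kg

def usable_energy_mj(mass_kg: float, specific_energy_mj_per_kg: float,
                     conversion_efficiency: float) -> float:
    """Energy delivered to the wheels from a given mass of energy carrier."""
    return mass_kg * specific_energy_mj_per_kg * conversion_efficiency

# 5 kg of hydrogen through a ~50%-efficient fuel-cell drivetrain
h2 = usable_energy_mj(5, H2_MJ_PER_KG, 0.5)           # 300 MJ
# 5 kg of battery cells through a ~90%-efficient electric drivetrain
battery = usable_energy_mj(5, LI_ION_MJ_PER_KG, 0.9)  # ~3.2 MJ

print(f"hydrogen: {h2:.0f} MJ, battery: {battery:.2f} MJ")
```

    Even with the fuel cell's lower conversion efficiency, the hydrogen carries two orders of magnitude more deliverable energy per kilogram, which is why the gap matters most for long-range and heavy vehicles.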

    The benefits are potentially even greater for heavy vehicles such as buses and trucks which already carry heavy payloads, and where lengthy battery recharge times can affect the business model.

    Hydrogen can also play an important role in energy storage, which will be increasingly necessary both in remote operations such as mine sites, and as part of the electricity grid to help smooth out the contribution of renewables such as wind and solar. This could work by using the excess renewable energy (when generation is high and/or demand is low) to drive hydrogen production via electrolysis of water. The hydrogen can then be stored as compressed gas and put into a fuel cell to generate electricity when needed.
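    The storage loop described above can be sketched numerically; the efficiencies below are commonly quoted round figures, not values from the roadmap:

```python
# Sketch of the storage loop: surplus renewable electricity -> electrolysis
# -> stored hydrogen -> fuel cell -> electricity. Efficiencies are rough,
# commonly quoted assumptions, not roadmap data.
ELECTROLYSER_EFF = 0.70    # electricity -> hydrogen (LHV basis)
FUEL_CELL_EFF = 0.55       # hydrogen -> electricity
H2_LHV_KWH_PER_KG = 33.3   # lower heating value of hydrogen

def store(surplus_kwh: float) -> float:
    """Kilograms of hydrogen produced from surplus electricity."""
    return surplus_kwh * ELECTROLYSER_EFF / H2_LHV_KWH_PER_KG

def recover(h2_kg: float) -> float:
    """Electricity recovered by running stored hydrogen through a fuel cell."""
    return h2_kg * H2_LHV_KWH_PER_KG * FUEL_CELL_EFF

h2 = store(1000)                   # ~21 kg from 1 MWh of surplus
round_trip = recover(h2) / 1000.0  # fraction of original energy recovered
print(f"{h2:.1f} kg stored, round-trip efficiency {round_trip:.0%}")
```

    The round trip returns well under half the input energy under these assumptions, which is why hydrogen storage competes best where batteries struggle: long durations, large capacities, or sites where the gas itself is the product.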

    Australia is heavily reliant on imported liquid fuels and does not currently have enough liquid fuel held in reserve. Moving towards hydrogen fuel could potentially alleviate this problem. Hydrogen can also be used to produce industrial chemicals such as ammonia and methanol, and is an important ingredient in petroleum refining.

    Further, as hydrogen burns without greenhouse emissions, it is one of the few viable green alternatives to natural gas for generating heat.

    Our roadmap predicts that the global market for hydrogen will grow in the coming decades. Among the prospective buyers of Australian hydrogen would be Japan, which is comparatively constrained in its ability to generate energy locally. Australia’s extensive natural resources, namely solar, wind, fossil fuels and available land, lend themselves favourably to the establishment of hydrogen export supply chains.

    Why embrace hydrogen now?

    Given its widespread use and benefit, interest in the “hydrogen economy” has peaked and troughed for the past few decades. Why might it be different this time around? While the main motivation is hydrogen’s ability to deliver low-carbon energy, there are a couple of other factors that distinguish today’s situation from previous years.

    Our analysis shows that the hydrogen value chain is now underpinned by a series of mature technologies that are technically ready but not yet commercially viable. This means that the narrative around hydrogen has now shifted from one of technology development to “market activation”.

    The solar panel industry provides a recent precedent for this kind of burgeoning energy industry. Large-scale solar farms are now generating attractive returns on investment, without any assistance from government. One of the main factors that enabled solar power to reach this tipping point was the increase in production economies of scale, particularly in China. Notably, China has recently emerged as a proponent for hydrogen, earmarking its use in both transport and distributed electricity generation.

    But whereas solar power could feed into a market with ready-made infrastructure (the electricity grid), the case is less straightforward for hydrogen. The technologies to help produce and distribute hydrogen will need to develop in concert with the applications themselves.

    A roadmap for hydrogen

    In light of this, the primary objective of our National Hydrogen Roadmap is to provide a blueprint for the development of a hydrogen industry in Australia. With several activities already underway, it is designed to help industry, government and researchers decide where exactly to focus their attention and investment.

    Our first step was to calculate the price points at which hydrogen can compete commercially with other technologies. We then worked backwards along the value chain to understand the key areas of investment needed for hydrogen to achieve competitiveness in each of the identified potential markets. Following this, we modelled the cumulative impact of the investment priorities that would be feasible in or around 2025.
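    The roadmap's modelling is far more detailed, but the levelised-cost arithmetic behind such price points can be illustrated with a toy calculation (every input below is an invented assumption, not a roadmap figure):

```python
# Toy levelised cost of hydrogen (LCOH) from electrolysis. All inputs are
# invented for illustration; the roadmap's own modelling is far richer.
def lcoh_per_kg(capex, lifetime_yr, discount, opex_frac,
                elec_price_kwh, kwh_per_kg, kg_per_yr):
    """Levelised cost: annualised capital + fixed O&M + electricity, per kg."""
    # Capital recovery factor converts upfront capex to an equivalent annuity.
    crf = (discount * (1 + discount) ** lifetime_yr
           / ((1 + discount) ** lifetime_yr - 1))
    annual_cost = (capex * (crf + opex_frac)
                   + elec_price_kwh * kwh_per_kg * kg_per_yr)
    return annual_cost / kg_per_yr

cost = lcoh_per_kg(capex=1_500_000, lifetime_yr=20, discount=0.07,
                   opex_frac=0.02, elec_price_kwh=0.04,
                   kwh_per_kg=50, kg_per_yr=150_000)
print(f"LCOH ~ ${cost:.2f}/kg")
```

    Varying the electricity price and the annual utilisation (kg_per_yr) in a sketch like this shows why the report flags cheap low-emissions electricity and asset utilisation as the dominant cost drivers.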


    What became evident from the report is that clean hydrogen is within reach of competing on cost with existing industrial feedstocks and energy carriers in local applications such as transport and remote area power systems. On the upstream side, the most significant drivers of cost reduction include the availability of cheap low-emissions electricity and the utilisation and size of the asset.

    The development of an export industry, meanwhile, is a potential game-changer for hydrogen and the broader energy sector. While this industry is not expected to scale up until closer to 2030, it would enable the localisation of supply chains and the industrialisation, even automation, of technology manufacture, contributing to significant reductions in asset capital costs. It would also enable the development of fossil-fuel-derived hydrogen with carbon capture and storage, and place downward pressure on the cost of renewable energy dedicated to large-scale hydrogen production via electrolysis.

    In light of global trends in industry, energy and transport, development of a hydrogen industry in Australia represents a real opportunity to create new growth areas in our economy. Blessed with unparalleled resources, a skilled workforce and established manufacturing base, Australia is extremely well placed to capitalise on this opportunity. But it won’t eventuate on its own.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 4:53 pm on August 22, 2018 Permalink | Reply
    Tags: Energy, PPPL's QUEST journal

    From PPPL: Two Items 


    From PPPL

    Advances in plasma and fusion science are described in Quest, PPPL’s research magazine.


    July 9, 2018
    Larry Bernard

    From analyzing solar flares to pursuing “a star in a jar” to produce virtually limitless electric power, scientists at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have developed insights and discoveries over the past year that advance understanding of the universe and the prospect for safe, clean, and abundant energy for all humankind.

    “Our research sheds new light on the function of plasma, the state of matter that comprises 99 percent of the visible universe,” writes Steve Cowley, the new director of PPPL, in the 2018 edition of Quest, PPPL’s annual research magazine. Quest, published in July 2018, summarizes, in a short and easy-to-digest format, much of the research that occurred at PPPL over the last year.

    Among the stories are descriptions of how scientists are finding ways to calm instabilities that can lead to the disruption of fusion reactions. Such research is critical to the next steps in advancing fusion energy to enable fusion devices to produce and sustain reactions that require temperatures many times hotter than the core of the sun.

    Fusion, the power that drives the sun and stars, fuses light elements and releases enormous energy. If scientists can capture and control fusion on Earth, the process could provide clean energy to produce electricity for millions of years.

    Plasma, the state of matter composed of free electrons and atomic nuclei that fuels fusion reactions and makes up 99 percent of the visible universe, unites PPPL research from astrophysics to nanotechnology to the science of fusion energy. Could planets beyond our solar system be habitable, for example? PPPL and Princeton scientists say that stellar winds — the outpouring of charged plasma particles from the sun into space — could deplete a planet’s atmosphere and dry up life-giving water over hundreds of millions of years, dealing a blow to the theory that these planets could host life as we know it.

    Quest details efforts to understand the scientific basis of fusion and plasma behavior. For example, in the section on Advancing Fusion Theory, physicists describe how bubble-like “blobs” that arise at the edge of the plasma can carry off heat needed for fusion reactions. Improved understanding of such behavior could lead to better control of the troublesome blobs.

    Another story outlines how researchers are using a form of artificial intelligence called “machine learning” to predict when disruptions that can halt fusion reactions and damage fusion devices will occur. The innovative technique has so far yielded outstanding results.
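    Quest does not describe the model itself; purely as an illustration of the idea, a toy "learning machine" that flags windows of a diagnostic signal trending toward disruption might look like this (data and method are invented):

```python
# Invented sketch of disruption prediction, NOT PPPL's actual method:
# summarise recent sensor readings as features, then classify whether the
# plasma is trending toward a disruption with a nearest-centroid model.
def features(window):
    """Summarise a window of a diagnostic signal: mean level and drift."""
    mean = sum(window) / len(window)
    drift = window[-1] - window[0]
    return mean, drift

def train_centroids(labelled_windows):
    """Average the features of each class (0 = stable, 1 = pre-disruption)."""
    sums = {0: [0.0, 0.0, 0], 1: [0.0, 0.0, 0]}
    for window, label in labelled_windows:
        m, d = features(window)
        sums[label][0] += m; sums[label][1] += d; sums[label][2] += 1
    return {k: (v[0] / v[2], v[1] / v[2]) for k, v in sums.items()}

def predict(centroids, window):
    """Return the label of the nearest centroid in feature space."""
    m, d = features(window)
    dist = {k: (m - c[0]) ** 2 + (d - c[1]) ** 2 for k, c in centroids.items()}
    return min(dist, key=dist.get)   # 1 = disruption likely

# Synthetic training data: stable signals are flat, pre-disruption ones ramp.
stable = [([1.0, 1.0, 1.1, 1.0, 0.9], 0), ([0.9, 1.0, 1.0, 1.1, 1.0], 0)]
ramping = [([1.0, 1.3, 1.7, 2.2, 2.8], 1), ([0.9, 1.2, 1.6, 2.1, 2.9], 1)]
model = train_centroids(stable + ramping)
print(predict(model, [1.0, 1.4, 1.8, 2.3, 3.0]))  # -> 1 (disruption likely)
```

    Real disruption predictors work on far richer multi-channel data, but the workflow is the same: label historical shots, extract features, and classify incoming windows early enough to act.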

    Included in Quest are descriptions of the collaborations in which PPPL scientists and engineers work on fusion devices around the world. These collaborations include ITER, the large multinational fusion device under construction in France, as well as research on devices in China, South Korea, and at the National Ignition Facility in the United States.

    Read also about PPPL’s long-standing efforts to educate students, teachers, and the public around STEM (science, technology, engineering, and math), as well as some of the award-winning work by scientists and inventors at PPPL.

    Quest can be accessed here, or at this web address: https://www.pppl.gov/quest

    See the full article here .

    PPPL diagnostic is key to world record of German fusion experiment
    July 9, 2018
    John Greenwald

    PPPL physicist Novimir Pablant, right, and Andreas Langenberg of the Max Planck Institute in front of the housing for the x-ray crystal spectrometer prior to its installation in the W7-X. (Photo by Scott Massida )

    When Germany’s Wendelstein 7-X (W7-X) fusion facility set a world record for stellarators recently, a finely tuned instrument built and delivered by the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) proved the achievement.

    Wendelstein 7-X stellarator, built in Greifswald, Germany

    The record strongly suggests that the design of the stellarator can be developed to capture on Earth the fusion that drives the sun and stars, creating “a star in a jar” to generate a virtually unlimited supply of electric energy.

    The record achieved by the W7-X, the world’s largest and most advanced stellarator, was the highest “triple product” that a stellarator has ever created. The product combines the temperature, density and confinement time of a fusion facility’s plasma — the state of matter composed of free electrons and atomic nuclei that fuels fusion reactions — to measure how close the device can come to producing self-sustaining fusion power. (The triple product was 6 x 10^26 degrees x seconds per cubic meter, the new stellarator record.)

    Spectrometer maps the temperature

    The achievement produced temperatures of 40 million degrees for the ions and an energy confinement time of 0.22 seconds; confinement time measures how long it takes energy to leak out across the confining magnetic fields. (The density was 0.8 x 10^20 particles per cubic meter.) The temperature was measured by an x-ray imaging crystal spectrometer (XICS) built by PPPL physicist Novimir Pablant, now stationed at W7-X, and engineer Michael Mardenfeld at PPPL. “The spectrometer provided the primary measurement,” said PPPL physicist Sam Lazerson, who also collaborates on W7-X experiments.
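    Multiplying the three quoted figures reproduces the record's order of magnitude (the product of these independently rounded values comes out slightly above the reported ~6 x 10^26):

```python
# Sanity check: recompute the triple product from the article's quoted values.
ion_temperature = 40e6     # degrees
density = 0.8e20           # particles per cubic metre
confinement_time = 0.22    # seconds

triple_product = ion_temperature * density * confinement_time
print(f"{triple_product:.2e} degrees x seconds per cubic metre")
# ~7.0e26: same order of magnitude as the reported ~6 x 10^26 record,
# with the small excess expected from multiplying rounded figures.
```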

    Pablant implemented the device with scientists and engineers of the Max Planck Institute of Plasma Physics (IPP), which operates the stellarator in the Baltic Sea town of Greifswald, Germany. “It has been a great experience to work closely with my colleagues here on W7-X,” Pablant said. “Installing the XICS system was a major undertaking and it has been a pleasure to work with this world-class research team. The initial results from these high-performance plasmas are very exciting, and we look forward to using the measurements from our instrument to further understanding of the confinement properties of W7-X, which is a truly unique magnetic fusion experiment.”

    Researchers at IPP welcomed the findings. “Without XICS we could not have confirmed the record,” said Thomas Sunn Pedersen, director of stellarator edge and divertor physics at IPP. Physicist Andreas Dinklage, lead author of a Nature Physics paper confirming a key feature of the W7-X physical design, concurred: “The XICS data set was one of the very valuable inputs that confirmed the physics predictions.”

    PPPL physicist David Gates, technical coordinator of the U.S. collaboration on W7-X, oversaw construction of the instrument. “The XICS is an incredibly precise device capable of measuring very small shifts in wavelength,” said Gates. “It is a crucial part of our collaboration and we are very grateful to have the opportunity to participate in these important experiments on the groundbreaking W7-X device.”

    PPPL provides added components

    PPPL has designed and delivered additional components installed on the W7-X. These include a set of large trim coils that correct errors in the magnetic field that confines W7-X plasma, and a scraper unit that will lessen the heat reaching the divertor that exhausts waste heat from the fusion facility.

    The recent world record was a result of upgrades that IPP made to the stellarator following the initial phase of experiments, which began in December 2015. Improvements included new graphite tiles that enabled the higher temperatures and longer duration plasmas that produced the results. A new round of experiments is to begin this July using the new scraper unit that PPPL delivered.

    Stellarators, first constructed in the 1950s under PPPL founder Lyman Spitzer, can operate in a steady state, or continuous manner, with little risk of the plasma disruptions that doughnut-shaped tokamak fusion facilities face. But tokamaks are simpler to design and build, and historically have confined plasma better, which accounts for their much wider use in fusion laboratories around the world.

    An overall goal of the W7-X is to show that the twisty stellarator design can confine plasma just as well as tokamaks. When combined with the ability to operate virtually free of disruptions, such improvement could make stellarators excellent models for future fusion power plants.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition


    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 11:55 am on August 3, 2018 Permalink | Reply
    Tags: Energy, New approaches to chemical and electrical energy conversions

    From Pacific Northwest National Lab: “New approaches to chemical and electrical energy conversions” 

    From Pacific Northwest National Lab

    July 16, 2018
    Susan Bauer
    susan.bauer@pnnl.gov
    (509) 372-6083

    For the second time, the U.S. Department of Energy renewed funding for a center designed to explore fundamental scientific principles that underpin technologies such as solar energy and fuel cells. Researchers at Pacific Northwest National Laboratory, together with partners at Yale University, the University of Wisconsin, Massachusetts Institute of Technology, the University of Washington, and Purdue University, earned the renewal through significant achievements in developing catalysts that can convert energy between electrical and chemical forms. Building on their success, and expanding their team, researchers are now poised to take on new challenges.

    The Center for Molecular Electrocatalysis was established in 2009 as a DOE Energy Frontier Research Center. DOE recently announced awards of $100 million for 42 new or continuing EFRCs, including this one led by PNNL. The centers are charged with pursuing the scientific underpinnings of various aspects of energy production, storage and use.

    Since 2009, CME researchers have been studying molecules called catalysts that convert electrical energy into chemical bonds and back again. Chemical bonds can store a huge amount of energy in a small amount of physical space. Of interest are catalysts that pack energy into bonds involving hydrogen, oxygen or nitrogen. Among the reactions studied are production of hydrogen, which can be used in fuel cells, and the reduction of oxygen, the reaction that balances the oxidation reaction of fuel cells.

    In the past four years, the Center for Molecular Electrocatalysis has reported:

    the fastest electrocatalysts for production of hydrogen,
    the fastest electrocatalysts for reduction of oxygen,
    and the most energy-efficient molecular electrocatalyst for reduction of oxygen.

    These fundamental scientific discoveries are important for our energy future. For example, a catalyst breaks chemical bonds to produce electricity in a fuel cell. An energy-efficient catalyst produces more power from fuel than an inefficient one — and fuel cells for vehicles need to release energy as fast as the explosions in a gasoline engine do.
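    The conversion between electrical current and hydrogen output follows standard electrochemistry (Faraday's law), not a CME-specific result; as a quick worked example:

```python
# Standard electrochemistry, not a CME result: Faraday's law links the
# current through an electrocatalyst to the hydrogen it can produce.
# 2 H+ + 2 e- -> H2, so each molecule of H2 requires two electrons.
FARADAY = 96485.0   # coulombs per mole of electrons

def hydrogen_moles(current_a: float, seconds: float,
                   faradaic_efficiency: float = 1.0) -> float:
    """Moles of H2 produced by a current over time (2 electrons per H2)."""
    return faradaic_efficiency * current_a * seconds / (2 * FARADAY)

# 10 A sustained for one hour at 100% faradaic efficiency
mol = hydrogen_moles(10, 3600)
print(f"{mol:.3f} mol H2, about {mol * 2.016:.2f} g")
```

    The catalyst's speed sets how much current a given amount of material can sustain, and its energy efficiency sets how much voltage (and hence power) that current costs, which is exactly the trade-off the center is working to break.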

    These efforts have sharpened scientists’ understanding of the central challenges in the field and laid the foundation for the ambitious goals for future studies.

    Directed by PNNL chemist Morris Bullock, the Center for Molecular Electrocatalysis expects to receive $3.2 million per year for the next four years and involve researchers from several complementary disciplines.

    “We are excited to be able to further our scientific mission by developing new approaches to circumventing traditional relationships found between rates and energy efficiency,” said Bullock. “These parameters are often correlated, such that improvements in one are obtained at the expense of the others. Typically, the faster catalysts are less energy efficient, and the more energy efficient catalysts are slower. To make breakthrough progress, we seek to remarkably improve catalyst performance through system-level design.”

    PNNL leads another Energy Frontier Research Center, Interfacial Dynamics in Radioactive Environments and Materials (IDREAM), which is focused on solving the chemistry challenges found in tanks holding a wide array of radioactive chemical waste generated from weapons production.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

     
  • richardmitnick 11:04 am on June 27, 2018 Permalink | Reply
    Tags: Energy

    From Max Planck Institute for Plasma Physics: “Wendelstein 7-X achieves world record” 

    From Max Planck Institute for Plasma Physics

    June 25, 2018
    Isabella Milch

    Wendelstein 7-X stellarator, built in Greifswald, Germany

    Stellarator record for fusion product / First confirmation for optimisation

    In the past experimentation round, Wendelstein 7-X achieved higher plasma temperatures and densities, longer pulses and the stellarator world record for the fusion product. Moreover, the first confirmation of the optimisation concept on which Wendelstein 7-X is based was obtained. Wendelstein 7-X at the Max Planck Institute for Plasma Physics (IPP) in Greifswald, the world’s largest fusion device of the stellarator type, is investigating the suitability of this concept for application in power plants.

    View inside the plasma vessel with graphite tile cladding. Photo: IPP, Jan Michael Hosan

    Unlike in the first experimentation phase in 2015/16, the plasma vessel of Wendelstein 7-X has been fitted with interior cladding since September last year (see PI 8/2017). The vessel walls are now covered with graphite tiles, allowing higher temperatures and longer plasma discharges. The so-called divertor also makes it possible to control the purity and density of the plasma: the divertor tiles follow the twisted contour of the plasma edge in the form of ten broad strips along the wall of the plasma vessel. In this way they protect, in particular, the wall areas onto which particles escaping from the edge of the plasma ring are directed. There, the impinging particles are neutralised, along with impurities, and pumped off.

    “First experience with the new wall elements is highly positive,” states Professor Dr. Thomas Sunn Pedersen. While pulse lengths of six seconds were being attained by the end of the first campaign, plasmas lasting up to 26 seconds are now being produced. Up to 75 megajoules of heating energy could be fed into the plasma, 18 times as much as in the first operation phase without the divertor. The heating power could also be increased, a prerequisite for high plasma density.

    Wendelstein 7-X attained the Stellarator world record for the fusion product. This product of the ion temperature, plasma density and energy confinement time specifies how close one is getting to the reactor values needed to ignite a plasma. Graphic: IPP

    In this way a record value for the fusion product was attained. This product of the ion temperature, plasma density and energy confinement time specifies how close one is getting to the reactor values needed to ignite a plasma. At an ion temperature of about 40 million degrees and a density of 0.8 x 10^20 particles per cubic metre, Wendelstein 7-X has attained a fusion product of a good 6 x 10^26 degrees x seconds per cubic metre, the world’s stellarator record. “This is an excellent value for a device of this size, achieved, moreover, under realistic conditions, i.e. at a high temperature of the plasma ions”, says Professor Sunn Pedersen. The energy confinement time attained, a measure of the quality of the thermal insulation of the magnetically confined plasma, is an imposing 200 milliseconds, indicating that the numerical optimisation on which Wendelstein 7-X is based might work: “This makes us optimistic for our further work.”

    That the optimisation is taking effect not only in the thermal insulation is confirmed by the now-completed evaluation of experimental data from the first experimentation phase (December 2015 to March 2016), just reported in Nature Physics (see below). It shows that the bootstrap current also behaves as expected. This electric current is induced by pressure differences in the plasma and could distort the tailored magnetic field; particles from the plasma edge would then no longer impinge on the right area of the divertor. The bootstrap current in stellarators should therefore be kept as low as possible. The analysis has now confirmed that this has actually been accomplished in the optimised field geometry. “Thus, important aspects of the optimisation could already be verified during the first experimentation phase”, states first author Dr. Andreas Dinklage. “More exact and systematic evaluation will ensue in further experiments at much higher heating power and higher plasma pressure.”

    Since the end of 2017 Wendelstein 7-X has undergone further extensions: These include new measuring equipment and heating systems. Plasma experiments are to be resumed in July. Major extension is planned as of autumn 2018: The present graphite tiles of the divertor are to be replaced by carbon-reinforced carbon components that are additionally water-cooled. They are to make discharges lasting up to 30 minutes possible, during which it can be checked whether Wendelstein 7-X permanently meets its optimisation objectives as well.

    Background

    The objective of fusion research is to develop a power plant favourable to the climate and environment. Like the sun, it is to derive energy from fusion of atomic nuclei. Because the fusion fire needs temperatures exceeding 100 million degrees to ignite, the fuel, viz. a low-density hydrogen plasma, ought not to come into contact with cold vessel walls. Confined by magnetic fields, it is suspended inside a vacuum chamber with almost no contact.

    The magnetic cage of Wendelstein 7-X is produced by a ring of 50 superconducting magnet coils about 3.5 metres high. Their special shapes are the result of elaborate optimisation calculations. Although Wendelstein 7-X will not itself produce energy, it is intended to prove that stellarators are suitable for application in power plants.

    Its aim is to achieve for the first time in a stellarator the quality of confinement afforded by competing devices of the tokamak type. In particular, the device is to demonstrate the essential advantage of stellarators, viz. their capability to operate in continuous mode.

    Science paper:
    Magnetic configuration effects on the Wendelstein 7-X stellarator. Nature Physics

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MPIPP campus

    The Max Planck Institute of Plasma Physics (Max-Planck-Institut für Plasmaphysik, IPP) is a physics institute for the investigation of plasma physics, with the aim of working towards fusion power. The institute also works on surface physics, likewise with a focus on problems relevant to fusion power.

    The IPP is an institute of the Max Planck Society, part of the European Atomic Energy Community, and an associated member of the Helmholtz Association.

    The IPP has two sites: Garching near Munich (founded 1960) and Greifswald (founded 1994), both in Germany.

    It owns several large devices, namely

    the experimental tokamak ASDEX Upgrade (in operation since 1991)
    the experimental stellarator Wendelstein 7-AS (in operation until 2002)
    the experimental stellarator Wendelstein 7-X (in operation since 2015)
    a tandem accelerator

    It also cooperates with the ITER and JET projects.

     
  • richardmitnick 10:44 am on June 12, 2018 Permalink | Reply
    Tags: Energy, Revolutionizing geothermal energy research

    From Sanford Underground Research Facility: “Revolutionizing geothermal energy research” 

    From Sanford Underground Research Facility

    June 11, 2018
    Constance Walter

    The SIMFIP tool is changing the way researchers measure and design hydro fractures.

    Deep underground on the 4850 Level at Sanford Lab, engineer Paul Cook explains how the SIMFIP tool will be used to measure openings in hard rock. Matthew Kapust

    On May 22, researchers with the SIGMA-V experiment worked in near silence in the West Drift on the 4850 Level. The locomotives sat quietly on the tracks, jack-leg drills rested against drift walls and operations ceased for several minutes at a time as the team began pumping pressurized water into the injection well, one of eight boreholes drilled for this experiment.

    “We requested quiet because we use sensitive seismic monitoring equipment,” said Tim Kneafsey, earth scientist at Lawrence Berkeley National Laboratory (LBNL). “The signals we measure are very small and we don’t want vibrations from other sources overwhelming those signals.”

    Kneafsey is the principal investigator for the Enhanced Geothermal Systems (EGS) Collab Project, a collaboration of eight national laboratories and six universities working to improve geothermal technologies. The test featured the SIMFIP (Step-Rate Injection Method for Fracture In-Situ Properties), a tool that revolutionizes the way scientists can study geothermal energy, a process that pulls heat from the earth by extracting steam or hot water, which is then converted to electricity.

    Developed at LBNL, the SIMFIP allows precise measurements of displacements in the rock and, most importantly, the aperture, or opening, of a hydro fracture.

    The extreme quiet paid off, Kneafsey said.

    “Our goal was to create a fracture from a specific zone in our injection well that would connect to our production well—about 10 meters away. And we were successful in doing that,” Kneafsey said.

    “People were excited when the connection between the boreholes was made and measured. But it took a while for the team to realize how far we had come and how much research, logistics, planning and collaboration went into that moment. It was gratifying to say the least, and there was certainly a sense of accomplishment.” —Hunter Knox

    The experiment

    Before the introduction of the SIMFIP, separate tools were used to create and measure hydro fractures. The process works like this: “straddle packers”—pipes with a deflated balloon at either end—are placed inside boreholes. Once in position, the balloons are inflated and water is injected down the pipes into the sealed-off section. The crew keeps pumping water until the rock fractures, then removes the packers and inserts the measuring tool. In the time it takes to do all that, much of the pertinent data is lost, leaving traces, but little else.

    “Even if you did get the aperture, when you released the pressure, the hydro fracture was already closing,” said Yves Guglielmi, a geologist at LBNL who designed the tool. “You don’t have the ‘true’ aperture and you also don’t know how the aperture might vary during the test.”

    With the introduction of the SIMFIP, a small device that sits between the two packers, researchers can measure the aperture in real time.

    “This is really a new way to do the work,” Guglielmi said. “It will help us understand the whole process of initiating and growing hydro fractures in hard rock, which is kind of new. This is fundamental science. If we understand how hydro fractures will behave in this kind of rock, we can begin to make intelligent, complex fractures that can capture more heat from the earth.”

    The device is “bristling with sensors and other instrumentation that give us a close-up view of what happens when the rock is stimulated—all in real-time,” said Paul Cook, LBNL engineer.

    The SIMFIP measures fracture openings in hard rock at the EGS Collab test site. The team had drilled eight slightly downward-sloping boreholes in the rib (side) of the West Drift: the injection hole, used to stimulate the rock, and the production well, which collects the fluid, run parallel to each other through the rock. Six other boreholes contain equipment to monitor microseismic activity (rock displacement); electrical resistivity tomography (subsurface imaging); temperature; and strain (how rocks move when stimulated).

    Nestled between the straddle packers in the injection hole, the SIMFIP measured the rock opening as the team looked on.

    The SIMFIP difference

    The SIGMA-V team hoped to detect displacements in the rock as small as a few microns. As they watched data accumulate in real time over a two-day period, the excitement in the West Drift was palpable.

    “People were excited when the connection between the boreholes was made and measured,” said Hunter Knox, the field coordinator with Sandia National Laboratory, “But it took a while for the team to realize how far we had come and how much research, logistics, planning and collaboration went into that moment. It was gratifying to say the least, and there was certainly a sense of accomplishment.”

    Measurements from the SIMFIP could remove barriers that stand in the way of commercializing geothermal systems, which have the potential to provide enough energy to power 100 million American homes.

    “We know fracturing rock can be done. But can it be effective for geothermal purposes? We need good, well-monitored field tests of fracturing, particularly in crystalline rock, to better understand that,” Kneafsey said.

    With the first test under its belt, the EGS Collab just moved a step closer to that goal.


    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX/Dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE)—a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab—is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE

     
  • richardmitnick 1:12 pm on March 11, 2018 Permalink | Reply
    Tags: , , Energy, Eni, ,   

    From MIT: “A new era in fusion research at MIT” 

    MIT News

    MIT Widget


    March 9, 2018
    Francesca McCaffrey | MIT Energy Initiative

    MIT Energy Initiative founding member Eni announces support for key research through MIT Laboratory for Innovation in Fusion Technologies.

    1

    A new chapter is beginning for fusion energy research at MIT.

    This week the Italian energy company Eni, a founding member of the MIT Energy Initiative (MITEI), announced it has reached an agreement with MIT to fund fusion research projects run out of the MIT Plasma Science and Fusion Center (PSFC)’s newly created Laboratory for Innovation in Fusion Technologies (LIFT). The expected investment in these research projects will amount to about $2 million over the coming years.

    This is part of a broader engagement with fusion research and the Institute as a whole: Eni also announced a commitment of $50 million to a new private company with roots at MIT, Commonwealth Fusion Systems (CFS), which aims to make affordable, scalable fusion power a reality.

    “This support of LIFT is a continuation of Eni’s commitment to meeting growing global energy demand while tackling the challenge of climate change through its research portfolio at MIT,” says Robert C. Armstrong, MITEI’s director and the Chevron Professor of Chemical Engineering at MIT. “Fusion is unique in that it is a zero-carbon, dispatchable, baseload technology, with a limitless supply of fuel, no risk of runaway reaction, and no generation of long-term waste. It also produces thermal energy, so it can be used for heat as well as power.”

    Still, there is much more to do along the way to perfecting the design and economics of compact fusion power plants. Eni will fund research projects at LIFT that are a continuation of this research and focus on fusion-specific solutions. “We are thrilled at PSFC to have these projects funded by Eni, who has made a clear commitment to developing fusion energy,” says Dennis Whyte, the director of PSFC and the Hitachi America Professor of Engineering at MIT. “LIFT will focus on cutting-edge technology advancements for fusion, and will significantly engage our MIT students who are so adept at innovation.”

    Tackling fusion’s challenges

    The inside of a fusion device is an extreme environment. The creation of fusion energy requires the smashing together of light elements, such as hydrogen, to form heavier elements such as helium, a process that releases immense amounts of energy. The process takes place at temperatures too high for any solid material to withstand, necessitating the use of magnets to hold the hot plasma in place.
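    The scale of "immense" can be made concrete with a back-of-the-envelope comparison. Using standard textbook values for the deuterium-tritium reaction (not figures from the MIT announcement), fusion fuel carries roughly ten million times the energy density of coal:

    ```python
    # Rough energy-density comparison: D-T fusion versus coal.
    # All values are standard physical constants, not figures from the article.
    MEV_TO_J = 1.602e-13           # joules per MeV
    AMU_TO_KG = 1.6605e-27         # kilograms per atomic mass unit

    energy_per_reaction = 17.6 * MEV_TO_J    # D + T -> He-4 + n releases ~17.6 MeV
    fuel_mass_per_reaction = 5 * AMU_TO_KG   # one deuteron (2 u) plus one triton (3 u)

    fusion_j_per_kg = energy_per_reaction / fuel_mass_per_reaction
    coal_j_per_kg = 29e6                     # ~29 MJ/kg, typical hard coal

    ratio = fusion_j_per_kg / coal_j_per_kg
    print(f"D-T fusion: {fusion_j_per_kg:.2e} J/kg, about {ratio:.1e} times coal")
    ```

    This is per kilogram of fuel only; it says nothing about the engineering cost of confining the plasma, which is the subject of the research described here.
    
    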

    One of the projects PSFC and Eni intend to carry out will study the effects of high magnetic fields on molten salt fluid dynamics. One of the key elements of the fusion pilot plant currently being studied at LIFT is the liquid immersion blanket, essentially a flowing pool of molten salt that completely surrounds the fusion energy core. The purpose of this blanket is threefold: to convert the kinetic energy of fusion neutrons to heat for eventual electricity production; to produce tritium — a main component of the fusion fuel; and to prevent the neutrons from reaching other parts of the machine and causing material damage.

    It’s critical for researchers to be able to predict how the molten salt in such an immersion blanket would move when subjected to high magnetic fields such as those found within a fusion plant. As such, the researchers and their respective teams plan to study the effects of these magnetohydrodynamic forces on the salt’s fluid dynamics.
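    The strength of magnetic effects on a conducting liquid like molten salt is commonly summarized by the dimensionless Hartmann number. The sketch below uses purely illustrative property values; the actual salt composition, field strength and channel dimensions in the LIFT study are not given in the article:

    ```python
    import math

    def hartmann_number(B, L, sigma, mu):
        """Hartmann number Ha = B * L * sqrt(sigma / mu).

        Ha >> 1 means electromagnetic (Lorentz) forces dominate viscous
        forces, flattening velocity profiles and reshaping the flow.
        """
        return B * L * math.sqrt(sigma / mu)

    # Illustrative values only: a 10 T field, a 0.5 m channel, and molten-salt
    # properties of ~150 S/m conductivity and ~0.006 Pa*s viscosity.
    ha = hartmann_number(B=10.0, L=0.5, sigma=150.0, mu=0.006)
    print(f"Ha ~ {ha:.0f}")
    ```

    Even with a salt's modest electrical conductivity, a fusion-scale magnetic field pushes the Hartmann number well above one, which is why the fluid dynamics must be re-examined under field.
    
    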

    A history of innovation

    During the 23 years MIT’s Alcator C-Mod tokamak fusion experiment was in operation, it repeatedly advanced records for plasma pressure in a magnetic confinement device. Its compact, high-magnetic-field fusion design confined superheated plasma in a small donut-shaped chamber.

    “The key to this success was the innovations pursued more than 20 years ago at PSFC in developing copper magnets that could access fields well in excess of other fusion experiments. The coupling between innovative technology development and advancing fusion science is in the DNA of the Plasma Science and Fusion Center,” says PSFC Deputy Director Martin Greenwald.

    In its final run in 2016, Alcator C-Mod set a new world record for plasma pressure, the key ingredient to producing net energy from fusion. Since then, PSFC researchers have used data from these decades of C-Mod experiments to continue to advance fusion research. Just last year, they used C-Mod data to create a new method of heating fusion plasmas in tokamaks which could result in the heating of ions to energies an order of magnitude greater than previously reached.

    A commitment to low-carbon energy

    MITEI’s mission is to advance low-carbon and no-carbon emissions solutions to efficiently meet growing global energy needs. Critical to this mission are collaborations between academia, industry, and government — connections MITEI helps to develop in its role as MIT’s hub for multidisciplinary energy research, education, and outreach.

    Eni is an inaugural, founding member of the MIT Energy Initiative, and it was through their engagement with MITEI that they became aware of the fusion technology commercialization being pursued by CFS and its immense potential for revolutionizing the energy system. It was through these discussions, as well, that Eni investors learned of the high-potential fusion research projects taking place through LIFT at MIT, spurring them to support the future of fusion at the Institute itself.

    Eni CEO Claudio Descalzi said, “Today is a very important day for us. Thanks to this agreement, Eni takes a significant step forward toward the development of alternative energy sources with an ever lower environmental impact. Fusion is the true energy source of the future, as it is completely sustainable, does not release emissions or waste, and is potentially inexhaustible. It is a goal that we are determined to reach quickly.” He added, “We are pleased and excited to pursue such a challenging goal with a collaborator like MIT, with unparalleled experience in the field and a long-standing and fruitful alliance with Eni.”

    These fusion projects are the latest in a line of MIT-Eni collaborations on low- and no-carbon energy projects. One of the earliest of these was the Eni-MIT Solar Frontiers Center, established in 2010 at MIT. Through its mission to develop competitive solar technologies, the center’s research has yielded the thinnest, lightest solar cells ever produced, effectively able to turn any surface, from fabric to paper, into a functioning solar cell. The researchers at the center have also developed new, luminescent materials that could allow windows to efficiently collect solar power.

    Other fruits of MIT-Eni collaborations include research into carbon capture systems to be installed in cars, wearable technologies to improve workplace safety, energy storage, and the conversion of carbon dioxide into fuel.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 12:59 pm on March 9, 2018 Permalink | Reply
    Tags: , , Energy, Gasification, , Turning landfill into energy   

    From Horizon: “Turning landfill into energy” 

    1

    Horizon

    07 March 2018
    Jon Cartwright

    1
    Advanced gasification methods can turn any waste except metal and rubble into fuel for electricity. Credit – Pixabay/ Prylarer

    Landfill is both ugly and polluting. But a new breed of technology promises to make it a thing of the past, transforming a huge portion of landfill material into clean gas.

    It’s thanks to a process called gasification, which involves turning carbon-based materials into gas by heating them to a high temperature but without burning them. The gas can be stored until it is needed for the generation of electricity.

    According to its developers, advanced gasification can be fed by plastic, biomass, textiles – just about anything except metal and rubble. Out of the other end comes syngas – a clean, easily combustible gas made up of carbon monoxide and hydrogen.

    The basics of the technology are old. Back in the 19th century, gasification plants existed in many of Europe’s major cities, turning coal into coal gas for heating and lighting.

    Gasification waned after the discovery of natural gas reserves early last century. Then, in the past 20 years or so, it enjoyed a small renaissance, as gasification plants sprang up to process waste wood.

    In a new, advanced implementation, however, a much broader range of materials can be processed, and the output gas is much cleaner. ‘Gasification is clearly gaining a lot of traction, but we’ve taken it further,’ said Jean-Eric Petit of French company CHO Power, based in Bordeaux.

    Gasification

    Gasification involves heating without combustion. At temperatures greater than 700°C, a lot of hydrocarbon-based materials break down into a gas of carbon monoxide and hydrogen – syngas – which can be used as a fuel.

    For materials such as wood, this is relatively straightforward. Try it with other hydrocarbon materials, and especially hard-to-recycle industrial waste, however, and the reaction tends to generate pollutants, such as tar.

    But tar itself is just a more complex hydrocarbon. That is why Petit and his colleagues have developed a higher temperature process, at some 1200°C, in which even tar is broken down.

    The result is syngas that, unlike the output of other thermal processes, is free of dangerous pollutants. In fact, it is high-quality enough to be fed directly into high-efficiency gas engines, generating electricity with twice the efficiency of the steam turbines used with conventional gasification, says Petit.

    CHO Power has already built an advanced gasification plant in Morcenx, France, which converts 55,000 tonnes of wood, biomass and industrial waste a year into 11 megawatts of electricity.
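    Those two figures imply a specific energy yield that is easy to check. Assuming roughly 8,000 operating hours per year (an assumed availability; the article does not state it), the Morcenx plant extracts about 1.6 MWh of electricity per tonne of waste:

    ```python
    tonnes_per_year = 55_000   # waste throughput, from the article
    power_mw = 11              # electrical output, from the article
    hours_per_year = 8_000     # assumed plant availability, not stated in the article

    annual_mwh = power_mw * hours_per_year
    mwh_per_tonne = annual_mwh / tonnes_per_year
    print(f"~{mwh_per_tonne:.1f} MWh of electricity per tonne of waste")
    ```
    
    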

    In December the EU announced that the company will receive a €30 million loan from the European Investment Bank to construct another plant in the Thouarsais area of France.

    The company is not the first to attempt advanced gasification on a commercial scale. But, said Petit: ‘We think we’re the first to crack it.’

    CHO Power’s gasification plants still need to have waste delivered to them. Hysytech, a company in Torino, Italy, however, plans to bring gasification to industry’s door.

    The idea is to build a small gasification plant, processing at least 100 kilos per hour of waste, next to any industrial plant that deals with hydrocarbon materials – a textiles or plastics manufacturer, for instance.

    Then, any waste the industrial plant generates can be turned straight into syngas for electricity generation on site, avoiding the emissions associated with transporting waste to a distant gasification plant.

    2
    The gas produced by CHO Power’s gasification process is refined at 1,200°C in their turboplasma facility (left) so that it can be used in a gas engine (right) to generate electricity. Credit – CHO Power

    Small-scale

    The problem is that, historically, gasification on this scale has cost too much to be in an industry’s interests. But Hysytech believes it has made small-scale gasification cost effective, by developing a novel reactor known as a fluidised bed.

    When waste materials are fed into this reactor, a fluid is passed through them to create an even temperature and to let the product gas escape easily. Materials that need a long time to turn to gas remain in the reactor until they are gasified, while the fluid flow can be sped up for materials that gasify quickly.

    The result, for smaller plants at least, is a more efficient and cost-effective process. ‘Our system is designed and built to operate year-round with a good efficiency, easy operation and little maintenance,’ said Andrés Saldivia, Hysytech’s head of business development.

    Hysytech has built a pilot plant that has about one-tenth the envisaged output, processing 10 kilos of waste an hour into syngas. Currently, its engineers are constructing a full-sized demo plant that will include an additional power-to-gas system, to link the gasification to surplus energy from wind turbines and solar panels so the energy is not wasted.

    With this additional system, the surplus energy is used to split water into hydrogen and oxygen. The hydrogen is then combined with a carbon source and converted into methane, which can be used like everyday natural gas.
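    The hydrogen-to-methane step in power-to-gas systems is usually the Sabatier reaction (an inference; the article does not name the process). Its stoichiometry fixes how much CO2 is consumed per kilogram of electrolytic hydrogen:

    ```python
    # Stoichiometry of the Sabatier methanation reaction:
    #   CO2 + 4 H2 -> CH4 + 2 H2O
    # Molar masses in g/mol.
    M_CO2, M_H2, M_CH4 = 44.01, 2.016, 16.04

    h2_kg = 1.0                             # per kilogram of hydrogen fed in
    co2_kg = h2_kg * (M_CO2 / (4 * M_H2))   # CO2 consumed
    ch4_kg = h2_kg * (M_CH4 / (4 * M_H2))   # methane produced
    print(f"1 kg H2 + {co2_kg:.1f} kg CO2 -> {ch4_kg:.1f} kg CH4")
    ```

    So each kilogram of surplus-energy hydrogen can bind roughly five and a half kilograms of CO2 into about two kilograms of grid-compatible methane.
    
    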

    ‘Our goal is to have it ready for the market (by) 2019,’ said Saldivia.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:03 am on March 4, 2018 Permalink | Reply
    Tags: , Chronobiology, , Energy, Light polution of the Earth, Most of the growth came from developing nations,   

    From popsci: “Light pollution is getting worse” 

    popsci-bloc

    Popular Science

    November 27, 2017 [Just now in social media.]
    Rachel Feltman

    1
    Do not go gently into that goodnight, night! Depositphotos.

    Goodbye darkness, my old friend.

    According to a study published last week in Science Advances, the world is getting brighter. And not in a ‘my future’s so bright I gotta wear shades’ kinda way. The future’s so bright that we should probably all be wearing eyeshades to bed, and turning some lights off while we’re at it.

    “We’re losing more and more of the night on a planetary scale,” Kip Hodges, a member of the Science Advances editorial board, said during a teleconference on the paper. “Earth’s night is getting brighter.”

    The data comes from satellite observations made each October from 2012 through 2016. Researchers scanned these sky-by-night shots to see how much artificial light shone through the darkness around the world, and how the brightness changed over time.

    3
    India in 2012. NASA/NOAA.

    They report an increase in artificially-lit areas of about 2.2 percent per year. The total radiance growth—the extent to which the brightness of those lights increased—was about the same.
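    A 2.2 percent annual increase compounds quickly. Treated as steady exponential growth, it implies the world's artificially lit area doubles in roughly 32 years:

    ```python
    import math

    growth_rate = 0.022  # 2.2 percent per year, from the study
    doubling_time = math.log(2) / math.log(1 + growth_rate)
    print(f"Lit area doubles in about {doubling_time:.0f} years")
    ```
    
    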

    Unsurprisingly, most of the growth came from developing nations. It makes sense that countries beginning to thrive in industry would require additional outdoor lighting as cities start to spring up. In fact, light pollution increase can be tied pretty reliably to a growth in Gross Domestic Product (GDP).

    According to previous research, the study notes, humans tend to use about as much artificial light as 0.07 percent of their country’s GDP will pay for. As GDP surged in countries within South America, Africa, and Asia, so did their use of artificial lighting.

    But while developed nations such as the U.S. appeared more stable in satellite images (sometimes even becoming slightly dimmer), there’s still reason to worry. The satellite used in the study can’t actually pick up all visible wavelengths of light. It can see the red, orange, and yellow light of older bulbs, but the blue light of light-emitting diodes (LEDs) doesn’t show up in the picture.

    LEDs are wildly more efficient than older sources of light, and last for much longer, so many cities and individuals have made the switch in recent years to cut costs and help the environment. The researchers worry that their results indicate a “rebound effect,” where the increased use of efficient LEDs is being offset by more widespread light pollution in general, often from older, less efficient bulbs. Photos taken from the International Space Station, which pick up all visible wavelengths, show cities shifting from yellow to blue in hue.

    4
    ISS Flies Over the Mediterranean. ESA flight engineer Tim Peake captured this image of Earth while flying over the Mediterranean Sea on January 25, 2016.
    ESA/NASA.

    Meanwhile, urban sprawl is pushing those bright borders out farther and farther.

    It might not be as immediately deadly as air pollution, but light pollution can harm many forms of life. For humans, the burgeoning field of chronobiology—the study of how our sleep and wake cycles affect our health—suggests that artificial light, especially of the blue variety, can trigger wakefulness when our bodies should be preparing for a good night’s sleep. Excessive exposure to nighttime light is now linked to everything from cancer to obesity.

    “Inside light is just terrible for you,” Susan Golden, director of the University of California at San Diego’s Center for Circadian Biology, told PopSci several months ago. “It is making us all sick.”

    To make matters worse, the increasing encroachment of artificial light on the outside world is hurting other organisms, too. Humans are fighting our entire evolutionary history by turning on lights and staring at screens after sunset, but at least most of us can choose to draw the blackout curtains and ban phones from the bedroom. The animals that live in and around our cities don’t have the same luxury, and it’s impossible to know just how badly light pollution might affect them.

    But while light pollution might be more insidious than smog, it’s also much easier to fix.

    “Usually when we think of how humanity messes with environment, it’s a costly thing to fix or reverse,” Kevin Gaston from the University of Exeter told the BBC. “For light, it’s just a case of directing it where we need it and not wasting it where we don’t.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 6:12 am on October 19, 2017 Permalink | Reply
    Tags: A sharp rise in the content of sediments, , , Energy, , Hydroelectric power plants, LMH-EPFL's Laboratory for Hydraulic Machines, Of all the electricity produced in Switzerland 56% comes from hydropower, One of the aims of Switzerland’s 2050 Energy Strategy is to increase hydroelectric production, SCCER-SoE-Swiss Competence Center for Energy Research - Supply of Electricity   

    From EPFL: “Hydroelectric power plants have to be adapted for climate change” 

    EPFL bloc

    École Polytechnique Fédérale de Lausanne EPFL

    19.10.17
    Clara Marc

    1
    © 2017 LMH – Grande Dixence dam. This hydroelectric power complex generates some 2 billion kWh of power per year
    Of all the electricity produced in Switzerland, 56% comes from hydropower. The life span of hydroelectric plants, which are massive and expensive to build and maintain, is measured in decades, yet the rivers and streams they depend on and the surrounding environment are ever-changing. These changes affect the machinery and thus the amount of electricity that can be produced. EPFL’s Laboratory for Hydraulic Machines (LMH) is working on an issue that will be very important in the coming years: the impact of sediment erosion on turbines, which are the main component of this machinery. The laboratory’s work could help prolong these plants’ ability to produce electricity for Switzerland’s more than eight million residents.

    One of the aims of Switzerland’s 2050 Energy Strategy is to increase hydroelectric production. The Swiss government therefore also needs to predict the environment in which these power plants will operate so that the underlying technology can keep pace with changing needs and future conditions. “In Switzerland, the glaciers and snow are melting more and more quickly. This affects the quality of the water, with a sharp rise in the content of sediments,” says François Avellan, who heads the LMH and is one of the study’s authors. “The sediments are very aggressive and erode the turbines.” This undermines the plants’ efficiency, leaves cavities in the equipment and leads to an increase in vibrations – and in the frequency and cost of repairs. To top things off, the turbines’ useful life is reduced. Under the umbrella of the Swiss Competence Center for Energy Research – Supply of Electricity (SCCER-SoE) and with the support of the Commission for Technology and Innovation (CTI), EPFL has teamed up with General Electric Renewable Energy in an effort to better understand and predict the process of sediment erosion. The aim is to lengthen the hydropower plants’ life span through improved turbines and more effective operating strategies.

    Tiny particles with an outsized impact

    One of the challenges facing researchers in the field of hydropower is that they cannot run experiments directly on power plants because of the impact and cost of a plant’s outage. They must therefore limit their investigations to simulations and reduced-scale physical model tests. In response to this challenge, the LMH has come up with a novel multiscale computer model that predicts sediment erosion in turbines with much greater accuracy than other approaches. The results have been published in the scientific journal Wear. “Sediment erosion, like many other problems in nature, is a multiscale phenomenon. It means that behavior observed at the macroscopic level is the result of a series of interactions at the microscopic level,” says Sebastián Leguizamón, an EPFL doctoral student and lead author of the study. “The sediments are extremely small and move very fast, and their impact lasts less than a microsecond. On the other hand, the erosion process we see is gradual, taking place over the course of many operating hours and affecting the whole turbine.”

    A multiscale solution

    The researchers therefore opted for a multiscale solution and modeled the two processes involved in erosion separately. At the microscopic level, they focused on the extremely brief impact of the minuscule sediments that strike the turbines, taking into account parameters such as the angle, speed, size, shape – and even composition – of the slurry. At the macroscopic level, they looked at how the sediments are transported by water flow, as this affects the flux, distribution and density of sediments reaching the walls of the turbine flow passages. The results were then combined in order to develop erosion predictions. “It’s not possible to study the entire process of erosion as a whole. The sediments are so small and the period of time over which the process takes place so long that replicating the process would take hundreds of years of calculations and require a computer that doesn’t exist yet,” says Leguizamón. “But the problem becomes manageable when you decouple the different phases.”
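    The decoupling the researchers describe can be caricatured in a few lines: a micro-scale model gives the mass removed per particle impact, a macro-scale model gives the flux of particles reaching the wall, and their product gives the erosion rate. This is a toy sketch, not the LMH model; the functional form and every constant are invented for illustration:

    ```python
    import math

    def erosion_per_impact(v, angle_deg, k=1e-12, n=2.4):
        # Micro-scale: empirical power-law mass loss per particle impact.
        # k and n are illustrative fitting constants, not values from the study.
        f = math.sin(2 * math.radians(angle_deg))  # crude shallow-angle peak
        return k * v**n * max(f, 0.0)

    def wall_erosion_rate(particle_flux, v, angle_deg):
        # Macro-scale: impacts per unit area per second, from the flow solution,
        # multiplied by the average mass removed per impact.
        return particle_flux * erosion_per_impact(v, angle_deg)

    # Illustrative numbers: 1e9 impacts/m^2/s at 40 m/s and a 30-degree angle.
    rate = wall_erosion_rate(1e9, 40.0, 30.0)
    ```

    The point of the split is that `erosion_per_impact` can be computed once from expensive microsecond-scale simulations, then reused across the hours-long macroscopic calculation.
    
    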

    Adapting to the future

    With conclusive results in hand, the LMH has now moved on to the next phase, which consists in characterizing the materials used in the turbines. Following this step, the researchers will be able to apply the new model to existing hydroelectric facilities. The stakes are global when it comes to retrofitting turbines in response to climate change, as hydropower accounts for 17% of the world’s electricity production. Turbines offer little leeway and have to operate in a wide range of environments – including monsoon regions and anything from tropical to alpine climates. If turbines are to last, changes will have to be made to both their underlying design and how they are operated. “While I was evaluating a hydro plant in the Himalayas, my contacts there told me that if a turbine made it through more than one monsoon season, that was a success!” says Avellan.

    This study is part of CTI project No. 17568.1 PFEN-IW GPUSpheros. It was conducted in conjunction with General Electric Renewable Energy under the umbrella of the Swiss Competence Center for Energy Research – Supply of Electricity (SCCER-SoE).

    A multiscale model for sediment impact erosion simulation using the finite volume particle method, Sebastián Leguizamón, Ebrahim Jahanbakhsh, Audrey Maertens, Siamak Alimirzazadeh and François Avellan. Wear, via ScienceDirect.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    EPFL campus

    EPFL is Europe’s most cosmopolitan technical university with students, professors and staff from over 120 nations. A dynamic environment, open to Switzerland and the world, EPFL is centered on its three missions: teaching, research and technology transfer. EPFL works together with an extensive network of partners including other universities and institutes of technology, developing and emerging countries, secondary schools and colleges, industry and economy, political circles and the general public, to bring about real impact for society.

     
  • richardmitnick 1:41 pm on October 8, 2017 Permalink | Reply
    Tags: , Energy, , Using University of Michigan buildings as batteries   

    From University of Michigan: “Using University of Michigan buildings as batteries” 

    U Michigan bloc

    University of Michigan

September 21, 2017
    Dan Newman

    How a building’s thermal energy can help the power grid accommodate more renewable energy sources.

Connor Flynn, an energy engineer with the Energy Management team, helps Aditya Keskar, a master’s student in electrical and computer engineering, retrieve data from a campus building’s HVAC system.

Michigan researchers and staff are testing how to use the immense thermal energy of large buildings as virtual battery packs. The goal is to help the nation’s grid better accommodate renewable energy sources, such as wind and solar.

    For power grids, supply must closely track demand to ensure smooth delivery of electric power. Incorporating renewable energy sources into the grid introduces a large degree of unpredictability to the system. For example, peak solar generation occurs during the day, while peak electricity demand occurs in the evening. Because of this, California, the leading solar producer in the U.S., has had to pay other states to take excess electricity off of its grid, and at other times simply wasted potential electricity by disconnecting solar panels.

As renewable sources become more prevalent, so do this unpredictability and the mismatch between supply and demand, creating a growing problem: how to keep both under control.

To address this, and to help electricity demand react to the variability of supply from renewable energy sources, an MCubed project is testing how buildings can store energy.

    The team consisted originally of project leader Johanna Mathieu, assistant professor of electrical engineering and computer science (EECS), Ian Hiskens, Vennema Professor of Engineering and professor of EECS, and Jeremiah Johnson, formerly an assistant professor at the School of Natural Resources and Environment and now an associate professor at North Carolina State University. Additionally, Dr. Sina Afshari, former postdoctoral researcher, helped set up the project on campus.

    “The goal is to utilize a building as a big battery: dump energy in and pull energy out in a way that the occupants don’t know is going on and the building managers aren’t incurring any extra costs. That’s the holy grail,” Hiskens said. “You wouldn’t have to buy chemical batteries and dispose of them a few years later.”

Commercial buildings, like those around campus, use massive heating, ventilation and air conditioning (HVAC) systems to keep occupants comfortable. Large buildings require a vast amount of energy to heat and cool, and their HVAC systems consume around 20% of the electricity generated in the United States.

However, a large building’s thermal mass also means that short-term thermostat changes are not felt by its occupants. A building can therefore cut or boost power to its HVAC system for a short time, helping the power grid match supply and demand while the indoor temperature remains effectively unchanged.

Aditya Keskar downloads data from another campus building’s HVAC system.

Aditya Keskar, who is pursuing his master’s degree in electrical engineering and computer science, has been working with staff to test these short-term changes in HVAC power consumption in three campus buildings.

    “We’ve had immense support from the Plant Operations team and building managers. They’ve helped us gather baseline data over months, and implement the tests,” Keskar said. “With their help, we were able to make short-term adjustments to their HVAC system with no change in the actual temperature, and no complaints from building occupants.”

    If there is a surplus of supply on the grid due to heavy wind production, for example, a building automation system (BAS), which controls an HVAC system, could automatically lower its thermostat settings in the summer and increase its energy use for fifteen minutes, and then raise the thermostat to balance the extra energy consumed. This action would soak up some of the excess electricity and help to maintain equilibrium on the grid.

    If darker skies reduce the usual solar production, a BAS could raise its thermostat setting in the summer and decrease its energy use immediately, then lower the thermostat to balance the extra energy consumed.
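The two scenarios above can be sketched as a toy control loop. This is an illustration, not the team's actual building automation code; the class, the grid-signal names, and the unit energy quantities are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class BuildingBattery:
    """Toy BAS demand-response logic; names and values are illustrative."""
    setpoint_c: float = 22.0     # nominal summer cooling setpoint (Celsius)
    nudge_c: float = 1.0         # size of the temporary setpoint change
    debt_kwh: float = 0.0        # energy borrowed/lent, to be repaid later

    def respond(self, grid_signal: str) -> float:
        """Return the setpoint for the next 15-minute interval.

        'surplus'   -> lower the setpoint, consuming extra power now.
        'shortage'  -> raise the setpoint, shedding load now.
        'rebalance' -> reverse the nudge to repay the energy debt.
        """
        if grid_signal == "surplus":
            self.debt_kwh += 1.0          # soaked up excess generation
            return self.setpoint_c - self.nudge_c
        if grid_signal == "shortage":
            self.debt_kwh -= 1.0          # deferred consumption
            return self.setpoint_c + self.nudge_c
        if self.debt_kwh > 0:             # rebalance: give energy back
            self.debt_kwh -= 1.0
            return self.setpoint_c + self.nudge_c
        if self.debt_kwh < 0:             # rebalance: catch up on cooling
            self.debt_kwh += 1.0
            return self.setpoint_c - self.nudge_c
        return self.setpoint_c

bas = BuildingBattery()
sp1 = bas.respond("surplus")     # wind surplus: cool harder for 15 minutes
sp2 = bas.respond("rebalance")   # then repay the borrowed energy
```

The key property, mirroring the article, is that every nudge is paid back: over a full cycle the building's net energy use and average temperature are unchanged, while the grid sees a flexible load.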

See the full article here.


U Michigan Campus

The University of Michigan (U-M, UM, UMich, or U of M), frequently referred to simply as Michigan, is a public research university located in Ann Arbor, Michigan, United States. Originally founded in 1817 in Detroit as the Catholepistemiad, or University of Michigania, 20 years before the Michigan Territory officially became a state, the University of Michigan is the state’s oldest university. The university moved to Ann Arbor in 1837 onto 40 acres (16 ha) of what is now known as Central Campus. Since its establishment in Ann Arbor, the university campus has expanded to include more than 584 major buildings with a combined area of more than 34 million gross square feet (781 acres or 3.16 km²), and has two satellite campuses located in Flint and Dearborn. The University was one of the founding members of the Association of American Universities.

    Considered one of the foremost research universities in the United States,[7] the university has very high research activity and its comprehensive graduate program offers doctoral degrees in the humanities, social sciences, and STEM fields (Science, Technology, Engineering and Mathematics) as well as professional degrees in business, medicine, law, pharmacy, nursing, social work and dentistry. Michigan’s body of living alumni (as of 2012) comprises more than 500,000. Besides academic life, Michigan’s athletic teams compete in Division I of the NCAA and are collectively known as the Wolverines. They are members of the Big Ten Conference.

     