Tagged: Clean Energy

  • richardmitnick 3:20 pm on October 29, 2014 Permalink | Reply
    Tags: Clean Energy

    From LBL: “New Lab Startup Afingen Uses Precision Method to Enhance Plants” 

    Berkeley Lab

    October 29, 2014
    Julie Chao (510) 486-6491

    Imagine being able to precisely control specific tissues of a plant to enhance desired traits without affecting the plant’s overall function. Thus a rubber tree could be manipulated to produce more natural latex. Trees grown for wood could be made with higher lignin content, making for stronger yet lighter-weight lumber. Crops could be altered so that only the leaves and certain other tissues had more wax, thus enhancing the plant’s drought tolerance, while its roots and other functions were unaffected.

    By manipulating a plant’s metabolic pathways, two scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), Henrik Scheller and Dominique Loqué, have figured out a way to genetically rewire plants to allow for an exceptionally high level of control over the spatial pattern of gene expression, while at the same time boosting expression to very high levels. Now they have launched a startup company called Afingen to apply this technology for developing low-cost biofuels that could be cost-competitive with gasoline and corn ethanol.

    Henrik Scheller (left) and Dominique Loqué hold a tray of Arabidopsis thaliana plants, which they used in their research. (Berkeley Lab photo)

    “With this tool we seem to have found a way to control very specifically what tissue or cell type expresses whatever we want to express,” said Scheller. “It’s a new way that people haven’t thought about to increase metabolic pathways. It could be for making more cell wall, for increasing the stress tolerance response in a specific tissue. We think there are many different applications.”

    Cost-competitive biofuels

    Afingen was awarded a Small Business Innovation Research (SBIR) grant earlier this year for $1.72 million to engineer switchgrass plants that will contain 20 percent more fermentable sugar and 40 percent less lignin in selected structures. The grant was provided under a new SBIR program at DOE that combines an SBIR grant with an option to license a specific technology produced at a national laboratory or university through DOE-supported research.

    “Techno-economic modeling done at (the Joint BioEnergy Institute, or JBEI) has shown that you would get a 23 percent reduction in the price of the biofuel with just a 20 percent reduction in lignin,” said Loqué. “If we could also increase the sugar content and make it easier to extract, that would reduce the price even further. But of course it also depends on the downstream efficiency.”

    Scheller and Loqué are plant biologists with the Department of Energy’s Joint BioEnergy Institute (JBEI), a Berkeley Lab-led research center established in 2007 to pursue breakthroughs in the production of cellulosic biofuels. Scheller heads the Feedstocks Division and Loqué leads the cell wall engineering group.

    The problem with too much lignin in biofuel feedstocks is that it is difficult and expensive to break down; reducing lignin content would allow the carbohydrates to be released and converted into fuels much more cost-effectively. Although low-lignin plants have been engineered, they grow poorly because important tissues lack the strength and structural integrity provided by the lignin. With Afingen’s technique, the plant can be manipulated to retain high lignin levels only in its water-carrying vascular cells, where cell-wall strength is needed for survival, but low levels throughout the rest of the plant.

    The centerpiece of Afingen’s technology is an “artificial positive feedback loop,” or APFL. The concept targets master transcription factors, which are molecules that regulate the expression of genes involved in certain biosynthetic processes, that is, whether certain genes are turned “on” or “off.” The APFL technology is a breakthrough in plant biotechnology, and Loqué and Scheller recently received an R&D 100 Award for the invention.

    An APFL is a segment of artificially produced DNA coded with instructions to make additional copies of a master transcription factor; when it is inserted at the start of a chosen biosynthetic pathway—such as the pathway that produces cellulose in fiber tissues—the plant cell will synthesize the cellulose and also make a copy of the master transcription factor that launched the cycle in the first place. Thus the cycle starts all over again, boosting cellulose production.
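
    For readers who like to see the feedback idea in the abstract, here is a minimal, purely illustrative sketch (not Afingen’s actual biology; the rate constants, saturation cap, and linear product term are invented assumptions) of how re-expressing a master transcription factor amplifies a pathway’s output compared with leaving its level at baseline:

```python
# Toy model of an artificial positive feedback loop (APFL) -- illustration only.
# All constants are invented; real transcription kinetics are far more complex.

def pathway_output(steps=100, boost=0.5, decay=0.2, cap=10.0, tf0=1.0, with_apfl=True):
    """Accumulate relative pathway output over `steps` rounds of expression.

    tf    -- relative level of the master transcription factor
    boost -- extra transcription factor re-expressed per round when the APFL is present
    decay -- relaxation of the transcription factor back toward its baseline tf0
    cap   -- saturation level (the expression machinery is finite)
    """
    tf, output = tf0, 0.0
    for _ in range(steps):
        if with_apfl:
            # Each round of pathway expression also re-expresses the factor itself...
            tf += boost * tf * (1.0 - tf / cap)
        # ...while normal turnover pulls it back toward its baseline level.
        tf -= decay * (tf - tf0)
        output += tf  # pathway product (e.g., cellulose) is assumed to track tf
    return output

print("relative output without APFL:", round(pathway_output(with_apfl=False), 1))
print("relative output with APFL:   ", round(pathway_output(with_apfl=True), 1))
```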

    The process differs from classical genetic engineering. “Some people distinguish between ‘transgenic’ and ‘cisgenic.’ We’re using only pieces of DNA that are already in that plant and just rearranging them in a new way,” said Scheller. “We’re not bringing in foreign DNA.”

    Other licensees and applications

    This breakthrough technique can also be used in fungi and for a wide variety of applications in plants, for example, to increase food crop yields or to boost production of highly specialized molecules used by the pharmaceutical and chemical industries. “It could also increase the quality of forage crops, such as hay fed to cows, by increasing the sugar content or improving the digestibility,” Loqué said.

    Another intriguing application is for biomanufacturing. By engineering plants to grow entirely new pharmaceuticals, specialty chemicals, or polymer materials, the plant essentially becomes a “factory.” “We’re interested in using the plant itself as a host for production,” Scheller said. “Just like you can upregulate pathways in plants that make cell walls or oil, you can also upregulate pathways that make other compounds or properties of interest.”

    Separately, two other companies are using the APFL technology. Tire manufacturer Bridgestone has a cooperative research and development agreement (CRADA) with JBEI to develop more productive rubber-producing plants. FuturaGene, a Brazilian paper and biomass company, has licensed the technology for exclusive use with eucalyptus trees and several other crops; APFL can enhance or develop traits to optimize wood quality for pulping and bioenergy applications.

    “The inventors/founders of Afingen made the decision to not compete for a license in fields of use that were of interest to other companies that had approached JBEI. This allowed JBEI to move the technology forward more quickly on several fronts,” said Robin Johnston, Berkeley Lab’s Acting Deputy Chief Technology Transfer Officer. “APFL is a very insightful platform technology, and I think only a fraction of the applications have even been considered yet.”

    Afingen currently has one employee—Ai Oikawa, a former postdoctoral researcher and now the director of plant engineering—and will be hiring three more in November. It is the third startup company to spin out of JBEI. The first two were Lygos, which uses synthetic biology tools to produce chemical compounds, and TeselaGen, which makes tools for DNA synthesis and cloning.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California


    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     
  • richardmitnick 3:35 pm on October 27, 2014 Permalink | Reply
    Tags: Clean Energy

    From AAAS: “After Election 2014: FUSION RESEARCH” 

    AAAS


    ScienceInsider

    24 October 2014
    Adrian Cho

    Should we stay or should we go? Once the voters have spoken, that’s the question Congress will have to answer regarding the United States’ participation in ITER, the hugely overbudget fusion experiment under construction in Cadarache, France. Some lawmakers say it may be time for the United States to bow out, especially as the growing ITER commitment threatens to starve U.S.-based fusion research programs. The next Congress may have to decide the issue—if the current one doesn’t pull the plug first when it returns to Washington, D.C., for a 6-week lame-duck session.

    ITER Tokamak

    For those tired of the partisan squabbling on Capitol Hill, the ITER debate may provide curious relief. ITER appears to enjoy bipartisan support in the House of Representatives—and bipartisan opposition among key senators.

    ITER aims to prove that nuclear fusion is a viable source of energy, and the United States has agreed to build 9% of the reactor’s hardware, regardless of the cost. Recent estimates suggest the U.S. price tag could be $3.9 billion or more—nearly quadrupling original estimates and raising alarm among some lawmakers. In response, this past June a Senate appropriations subcommittee proposed a budget bill that would end U.S. participation in the project next year. In contrast, the next month the House passed a bill that would increase U.S. spending on ITER.

    Some observers think the current Congress will kick the issue to the next one by passing a stop-gap budget for fiscal year 2015, which began 1 October, that will keep U.S. ITER going. “I don’t think in the end they can come out and kill ITER based on what the Senate subcommittee did,” says Stephen Dean, president of Fusion Power Associates, a research and educational foundation in Gaithersburg, Maryland. Others say a showdown could come by year’s end.

    Trouble over ITER has been brewing for years. ITER was originally proposed in 1985 as a joint U.S.-Soviet Union venture. The United States backed out of the project in 1998 because of cost and schedule concerns—only to rejoin in 2003. At the time, ITER construction costs were estimated at $5 billion. That number had jumped to $12 billion by 2006, when the European Union, China, India, Japan, Russia, South Korea, and the United States signed a formal agreement to build the device. At the time, ITER was supposed to start running in 2016. By 2011, U.S. costs for ITER had risen to more than $2 billion, and the date for first runs had slid to 2020. But even that date was uncertain; U.S. ITER researchers did not have a detailed cost projection and schedule—or performance baseline—to go by.

    Then in 2013, the Department of Energy (DOE) argued in its budget request for the following year that U.S. ITER was not a “capital asset” and therefore did not have to go through the usual DOE review process for large construction projects—which requires a performance baseline. Even though DOE promised to limit spending on ITER to $225 million a year so as not to starve domestic fusion research efforts, that statement irked Senators Dianne Feinstein (D–CA) and Lamar Alexander (R–TN), the chair and ranking member of the Senate Appropriations Subcommittee on Energy and Water Development, respectively. They and other senators asked the Government Accountability Office (GAO) to investigate the U.S. ITER project.

    This year, things appeared to come to a head. This past April, researchers working on U.S. ITER released their new $3.9 billion cost estimate and moved back the date for first runs to 2024 or later. Two months later, GAO reported that even that new estimate was not reliable and that the cost to the United States could reach $6.5 billion. Based on that report, the Senate energy and water subcommittee moved to kill U.S. ITER in its markup of the proposed 2015 budget, giving it only $75 million for the year, half of what the White House had requested and just enough to wind things down. Alexander supported the move, even though the U.S. ITER office is based in his home state of Tennessee, at Oak Ridge National Laboratory.

    ITER still has friends in the House, however. In their version of the DOE budget for 2015, House appropriators gave ITER $225 million, $75 million more than the White House request. Moreover, the project seems to have bipartisan support in the House, as shown by a hearing of the energy subcommittee of the House Committee on Science, Space, and Technology. Usually deeply divided along party lines, the subcommittee came together to lavish praise on ITER, with representative Lamar Smith (R–TX), chair of the full committee, and Representative Eric Swalwell (D–CA), the ranking member on the subcommittee, agreeing that ITER was, in Swalwell’s words, “absolutely essential to proving that magnetically confined fusion can be a viable clean energy source.” Swalwell called for spending more than $225 million per year on ITER.

    When and how this struggle over ITER plays out depends on the answers to several questions. First, how will Congress deal with the already late budget for next year? The Senate, controlled by the Democrats, has yet to pass any of its 13 budget bills, including the one that would fund energy research. And if the House and Senate decide to simply continue the 2014 budget past the end of the year, then the decision on ITER will pass to the next Congress. If, on the other hand, Congress passes a last-minute omnibus budget for fiscal year 2015, then the fight over ITER could play out by year’s end.

    Second, how sincere is the Senate move to kill ITER? The Senate subcommittee’s move may have been meant mainly to send a signal to the international ITER organization that it needs to shape up, says one Democratic staffer in the House. The international ITER organization received scathing criticism in an independent review in October 2013. That review called for 11 different measures to overhaul the project’s management, and the Senate’s markup may have been meant primarily to drive home the message that those measures had to be taken to ensure continued U.S. involvement, the staffer says.

    Third, how broad is the House’s support for ITER? Over the past decade or so, the House has been more supportive of fusion in general, the Democratic staffer says. But some observers credit that support mainly to one person, Representative Rodney Frelinghuysen (R-NJ), a longtime member of the House Appropriations Committee. “Over the years he’s become a champion of fusion,” Dean says. “He protects it in the House.” Dean and others say that’s likely because the DOE’s sole dedicated fusion laboratory, the Princeton Plasma Physics Laboratory (PPPL), is in his home state of New Jersey, though not in his district.

    Indeed, observers say that Frelinghuysen has been instrumental in preventing cuts to the domestic fusion program proposed by DOE itself. For example, for fiscal 2014, DOE requested $458 million for its fusion energy sciences program, including $225 million for ITER. That meant cutting the domestic fusion program by about 20% to $233 million and closing one of three tokamak reactors in the United States. The Senate went along with those numbers, but House appropriators bumped the budget up to $506 million, the number that held sway in the final 2014 spending plan. But some observers speculate that Frelinghuysen might be willing to let ITER go if he could secure a brighter future for PPPL.

    PPPL Tokamak

    PPPL National Spherical Torus Experiment (NSTX)

    Finally, the biggest question surrounding U.S. participation in ITER is: How will the international ITER organization respond to the calls for changes in its management structure? That should become clear within months. So far, officials with U.S. ITER have not been able to produce a baseline cost estimate and schedule, in large measure because the ITER project as a whole does not have a reliable schedule. The international ITER organization has said it will produce one by next July, the House staffer says. And if the international organization doesn’t produce a credible schedule, the staffer says, “the project will be very difficult to defend, even by its most ardent supporters.”

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.


     
  • richardmitnick 4:29 pm on September 29, 2014 Permalink | Reply
    Tags: Clean Energy

    From PPPL: “PPPL successfully tests system for mitigating instabilities called ‘ELMs’ “ 


    PPPL

    September 29, 2014
    John Greenwald

    PPPL has successfully tested a Laboratory-designed device to be used to diminish the size of instabilities known as “edge localized modes (ELMs)” on the DIII–D tokamak that General Atomics operates for the U.S. Department of Energy in San Diego. Such instabilities can damage the interior of fusion facilities.

    DIII-D

    The PPPL device injects granular lithium particles into tokamak plasmas to increase the frequency of the ELMs. The method aims to make the ELMs smaller and reduce the amount of heat that strikes the divertor, the component that exhausts heat in fusion facilities.

    The system could serve as a possible model for mitigating ELMs on ITER, the fusion facility under construction in France to demonstrate the feasibility of fusion energy.

    ITER Tokamak

    “ELMs are a big issue for ITER,” said Mickey Wade, director of the DIII-D national fusion program at General Atomics. Large-scale ELMs, he noted, could melt plasma-facing components inside the ITER tokamak.

    General Atomics plans to install the PPPL-designed device, developed by physicist Dennis Mansfield and engineer Lane Roquemore, on DIII-D this fall. Previous experiments using deuterium injection rather than lithium injection have demonstrated the ability to increase the ELM frequency on DIII-D, the ASDEX-Upgrade in Germany and the Joint European Torus in the United Kingdom.

    Joint European Torus

    Researchers at DIII-D now want to see how the results for lithium-injection compare with those obtained in the deuterium experiments on the San Diego facility. “We want to put them side-by-side,” Wade said.

    PPPL-designed systems have proven successful in mitigating ELMs on the EAST tokamak in Hefei, China, and have been used on a facility operated by the Italian National Agency for New Technologies in Frascati, Italy. A system also is planned for PPPL’s National Spherical Torus Experiment (NSTX), the Laboratory’s major fusion experiment, which is undergoing a $94 million upgrade.

    PPPL NSTX

    PPPL used salt grain-sized plastic pellets as proxies for lithium granules in testing the system for DIII-D. The pellets fell through a pinhole-sized opening inside a dropper to a rotating high-speed propeller that projected them onto a target precisely as planned.

    Joining Mansfield and Roquemore for the tests were physicists Erik Gilson and Alessandro Bortolon, a former University of Tennessee researcher now at PPPL who will begin an assignment to the DIII-D tokamak at General Atomics this fall. Also participating were Rajesh Maingi, the head of research on edge physics and plasma-facing components at PPPL, and engineer Alexander Nagy, who is on assignment to DIII-D.

    See the full article here.

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.


     
  • richardmitnick 1:59 pm on September 26, 2014 Permalink | Reply
    Tags: Clean Energy

    From PNNL: “Off-shore Power Potential Floating in the Wind” 


    PNNL Lab

    September 2014
    Web Publishing Services

    Results: Two bright yellow buoys – each worth $1.3 million – are being deployed by Pacific Northwest National Laboratory in Washington State’s Sequim Bay. The massive, 20,000-pound buoys are decked out with the latest in meteorological and oceanographic equipment to enable more accurate predictions of the power-producing potential of winds that blow off U.S. shores. Starting in November, they will be commissioned for up to a year at two offshore wind demonstration projects: one near Coos Bay, Oregon, and another near Virginia Beach, Virginia.

    PNNL staff conduct tests in Sequim Bay, Washington, while aboard one of two new research buoys being commissioned to more accurately predict offshore wind’s power-producing potential.

    “We know offshore winds are powerful, but these buoys will allow us to better understand exactly how strong they really are at the heights of wind turbines,” said PNNL atmospheric scientist Dr. William J. Shaw. “Data provided by the buoys will give us a much clearer picture of how much power can be generated at specific sites along the American coastline – and enable us to generate that clean, renewable power sooner.”

    Why It Matters: Offshore wind is a new frontier for U.S. renewable energy developers. There’s tremendous power-producing potential, but limited information is available about ocean-based wind resources. A recent report estimated the U.S. could power nearly 17 million homes by generating more than 54 gigawatts of offshore wind energy, but more information is needed.
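
    The “more than 54 gigawatts … nearly 17 million homes” figure is easy to sanity-check with back-of-the-envelope numbers. In the sketch below, the capacity factor and the average household consumption are assumptions of ours, not values from the report:

```python
# Rough sanity check of "54 GW of offshore wind -> nearly 17 million homes".
# Capacity factor and household consumption are assumed, not taken from the report.

capacity_gw = 54.0
capacity_factor = 0.40             # assumed typical for offshore wind
household_kwh_per_year = 11_000    # assumed average U.S. household consumption

annual_twh = capacity_gw * capacity_factor * 8760 / 1000   # GW -> GWh/yr -> TWh/yr
homes_millions = annual_twh * 1e9 / household_kwh_per_year / 1e6

print(f"annual generation: {annual_twh:.0f} TWh")
print(f"homes powered:     {homes_millions:.1f} million")
```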

    See the full article here.

    Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.



     
  • richardmitnick 8:14 pm on September 21, 2014 Permalink | Reply
    Tags: Clean Energy

    From M.I.T.: “Magnetic fields make the excitons go ’round” 


    MIT News

    September 21, 2014
    David L. Chandler | MIT News Office

    A major limitation in the performance of solar cells happens within the photovoltaic material itself: When photons strike the molecules of a solar cell, they transfer their energy, producing quasi-particles called excitons — an energized state of molecules. That energized state can hop from one molecule to the next until it’s transferred to electrons in a wire, which can light up a bulb or turn a motor.


    But as the excitons hop through the material, they are prone to getting stuck in minuscule defects, or traps — causing them to release their energy as wasted light.
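
    To see why trapping is so costly, a toy Monte Carlo helps. The sketch below is purely illustrative and is not the porphyrin model from the paper: excitons random-walk along a one-dimensional chain of molecules, and any that wander onto a trap site (placed with an assumed density) lose their energy before reaching the collecting electrode at one end:

```python
# Toy Monte Carlo: random-walking "excitons" on a 1-D chain dotted with traps.
# Illustration only; trap density and geometry are assumptions, not the paper's model.
import random

def harvested_fraction(n_sites=100, trap_density=0.02, n_excitons=2000,
                       max_hops=20_000, seed=1):
    rng = random.Random(seed)
    harvested = 0
    for _ in range(n_excitons):
        # A fresh random arrangement of defect sites for each exciton.
        traps = {i for i in range(1, n_sites) if rng.random() < trap_density}
        pos = n_sites // 2                       # exciton created in the middle of the film
        for _ in range(max_hops):
            pos += rng.choice((-1, 1))           # hop to a neighbouring molecule
            if pos <= 0:                         # reached the electrode: energy collected
                harvested += 1
                break
            if pos >= n_sites or pos in traps:   # lost at the far edge or stuck in a trap
                break
    return harvested / n_excitons

for density in (0.00, 0.01, 0.05):
    print(f"trap density {density:.2f}: {harvested_fraction(trap_density=density):.2f} harvested")
```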

    Now a team of researchers at MIT and Harvard University has found a way of rendering excitons immune to these traps, possibly improving photovoltaic devices’ efficiency. The work is described in a paper in the journal Nature Materials.

    Their approach is based on recent research on exotic electronic states known as topological insulators, in which the bulk of a material is an electrical insulator — that is, it does not allow electrons to move freely — while its surface is a good conductor.

    The MIT-Harvard team used this underlying principle, called topological protection, but applied it to excitons instead of electrons, explains lead author Joel Yuen, a postdoc in MIT’s Center for Excitonics, part of the Research Laboratory of Electronics. Topological protection, he says, “has been a very popular idea in the physics and materials communities in the last few years,” and has been successfully applied to both electronic and photonic materials.

    Moving on the surface

    Topological excitons would move only at the surface of a material, Yuen explains, with the direction of their motion determined by the direction of an applied magnetic field. In that respect, their behavior is similar to that of topological electrons or photons.

    In its theoretical analysis, the team studied the behavior of excitons in an organic material, a porphyrin thin film, and determined that their motion through the material would be immune to the kind of defects that tend to trap excitons in conventional solar cells.

    The choice of porphyrin for this analysis was based on the fact that it is a well-known and widely studied family of materials, says co-author Semion Saikin, a postdoc at Harvard and an affiliate of the Center for Excitonics. The next step, he says, will be to extend the analysis to other kinds of materials.

    Structure of porphine, the simplest porphyrin

    While the work so far has been theoretical, experimentalists are eager to pursue the concept. Ultimately, this approach could lead to novel circuits that are similar to electronic devices but based on controlling the flow of excitons rather than electrons, Yuen says. “If there are ever excitonic circuits,” he says, “this could be the mechanism” that governs their functioning. But the likely first application of the work would be in creating solar cells that are less vulnerable to the trapping of excitons.

    Eric Bittner, a professor of chemistry at the University of Houston who was not associated with this work, says, “The work is interesting on both the fundamental and practical levels. On the fundamental side, it is intriguing that one may be able to create excitonic materials with topological properties. This opens a new avenue for both theoretical and experimental work. … On the practical side, the interesting properties of these materials and the fact that we’re talking about pretty simple starting components — porphyrin thin films — makes them novel materials for new devices.”

    The work received support from the U.S. Department of Energy and the Defense Threat Reduction Agency. Norman Yao, a graduate student at Harvard, was also a co-author.

    See the full article here.


     
  • richardmitnick 7:53 pm on September 21, 2014 Permalink | Reply
    Tags: Clean Energy

    From M.I.T.: “New formulation leads to improved liquid battery” 


    MIT News

    September 21, 2014
    David L. Chandler | MIT News Office

    Cheaper, longer-lasting materials could enable batteries that make wind and solar energy more competitive.


    Researchers at MIT have improved a proposed liquid battery system that could enable renewable energy sources to compete with conventional power plants.

    Donald Sadoway and colleagues have already started a company to produce electrical-grid-scale liquid batteries, whose layers of molten material automatically separate due to their differing densities. But the new formula — published in the journal Nature by Sadoway, former postdocs Kangli Wang and Kai Jiang, and seven others — substitutes different metals for the molten layers used in a battery previously developed by the team.

    Sadoway, the John F. Elliott Professor of Materials Chemistry, says the new formula allows the battery to work at a temperature more than 200 degrees Celsius lower than the previous formulation. In addition to the lower operating temperature, which should simplify the battery’s design and extend its working life, the new formulation will be less expensive to make, he says.

    The battery uses two layers of molten metal, separated by a layer of molten salt that acts as the battery’s electrolyte (the layer that charged particles pass through as the battery is charged or discharged). Because each of the three materials has a different density, they naturally separate into layers, like oil floating on water.

    The original system, using magnesium for one of the battery’s electrodes and antimony for the other, required an operating temperature of 700 °C. But with the new formulation, with one electrode made of lithium and the other a mixture of lead and antimony, the battery can operate at temperatures of 450 to 500 °C.

    Extensive testing has shown that even after 10 years of daily charging and discharging, the system should retain about 85 percent of its initial efficiency — a key factor in making such a technology an attractive investment for electric utilities.
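
    As a rough reading of that figure (assuming one full cycle per day and a uniform fade rate, which the paper may not assume), 85 percent retention after ten years implies a remarkably small loss per cycle:

```python
# Back-of-the-envelope: what "85% retention after 10 years of daily cycling" implies.
# Assumes one full cycle per day and a uniform per-cycle fade, for illustration only.
years, cycles_per_day, retention = 10, 1, 0.85
cycles = years * 365 * cycles_per_day
per_cycle_retention = retention ** (1 / cycles)

print(f"total cycles:           {cycles}")
print(f"implied loss per cycle: {(1 - per_cycle_retention) * 100:.4f} %")
```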

    Currently, the only widely used system for utility-scale storage of electricity is pumped hydro, in which water is pumped uphill to a storage reservoir when excess power is available, and then flows back down through a turbine to generate power when it is needed. Such systems can be used to match the intermittent production of power from irregular sources, such as wind and solar power, with variations in demand. Because of inevitable losses from the friction in pumps and turbines, such systems return about 70 percent of the power that is put into them (which is called the “round-trip efficiency”).

    Sadoway says his team’s new liquid-battery system can already deliver the same 70 percent efficiency, and with further refinements may be able to do better. And unlike pumped hydro systems — which are only feasible in locations with sufficient water and an available hillside — the liquid batteries could be built virtually anywhere, and at virtually any size. “The fact that we don’t need a mountain, and we don’t need lots of water, could give us a decisive advantage,” Sadoway says.

    The biggest surprise for the researchers was that the antimony-lead electrode performed so well. They found that while antimony could produce a high operating voltage, and lead gave a low melting point, a mixture of the two combined both advantages, with a voltage as high as antimony alone, and a melting point between that of the two constituents — contrary to expectations that lowering the melting point would come at the expense of also reducing the voltage.

    “We hoped [the characteristics of the two metals] would be nonlinear,” Sadoway says — that is, that the operating voltage would not end up halfway between that of the two individual metals. “They proved to be [nonlinear], but beyond our imagination. There was no decline in the voltage. That was a stunner for us.”

    Not only did that provide significantly improved materials for the group’s battery system, but it opens up whole new avenues of research, Sadoway says. Going forward, the team will continue to search for other combinations of metals that might provide even lower-temperature, lower-cost, and higher-performance systems. “Now we understand that liquid metals bond in ways that we didn’t understand before,” he says.

    With this fortuitous finding, Sadoway says, “Nature tapped us on the shoulder and said, ‘You know, there’s a better way!’” And because there has been little commercial interest in exploring the properties and potential uses of liquid metals and alloys of the type that are most attractive as electrodes for liquid metal batteries, he says, “I think there’s still room for major discoveries in this field.”

    Robert Metcalfe, professor of innovation at the University of Texas at Austin, who was not involved in this work, says, “The Internet gave us cheap and clean connectivity using many kinds of digital storage. Similarly, we will solve cheap and clean energy with many kinds of storage. Energy storage will absorb the increasing randomness of energy supply and demand, shaving peaks, increasing availability, improving efficiency, lowering costs.”

    Metcalfe adds that Sadoway’s approach to storage using liquid metals “is very promising.”

    The research was supported by the U.S. Department of Energy’s Advanced Research Projects Agency-Energy and by French energy company Total.

    See the full article here.


     
  • richardmitnick 6:11 am on July 22, 2014 Permalink | Reply
    Tags: Clean Energy

    From ESO: “Solar Farm to be Installed at La Silla” 


    European Southern Observatory

    21 July 2014
    Roberto Tamai
    E-ELT Programme Manager
    Garching bei München, Germany
    Tel: +49 89 3200 6367
    Email: rtamai@eso.org

    Lars Lindberg Christensen
    Head of ESO ePOD
    ESO ePOD, Garching, Germany
    Tel: +49 89 3200 6761
    Cellular: +49 173 3872 621
    E-mail: lars@eso.org

    As part of its green initiatives, ESO has signed an agreement with the Chilean company Astronomy and Energy (a subsidiary of the Spanish LKS Group) to install a solar farm at the La Silla Observatory. ESO has been working on green solutions for supplying energy to its sites for several years, and these are now coming to fruition. Looking to the future, renewables are considered vital to satisfy energy needs in a sustainable manner.

    ESO at La Silla


    ESO’s ambitious programme is focused on achieving the highest quality of astronomical research. This requires the design, construction and operation of the most powerful ground-based observing facilities in the world. However, the operations at ESO’s observatories present significant challenges in terms of their energy usage.

    Despite the abundance of sunshine at the ESO sites, it has not been possible up to now to make efficient use of this natural source of power. Astronomy and Energy will supply a means of effectively exploiting solar energy using crystalline photovoltaic modules (solar panels), which will be installed at La Silla.

    The installation will cover an area of more than 100 000 square metres, with the aim of being ready to supply the site by the end of the year.

    The global landscape for energy has changed considerably over the last 20 years. As energy prices increase and vary unpredictably, ESO has been keen to look into ways to control its energy costs and also limit its ecological impact. The organisation has already managed to successfully reduce its power consumption at La Silla, and despite the additions of the VISTA and VST survey telescopes, power use has remained stable over the past few years at the Paranal Observatory, site of the VLT.

    ESO VISTA Telescope

    ESO VST Telescope

    The much-improved efficiency of solar cells has meant they have become a viable alternative to exploit solar energy. Solar cells of the latest generation are considered to be very reliable and almost maintenance-free, characteristics that contribute to a high availability of electric power, as required at astronomical observatories.

    As ESO looks to the future, it seeks further sustainable energy sources to be compatible across all its sites, including Cerro Armazones — close to Cerro Paranal and the site of the future European Extremely Large Telescope (E-ELT). This goal will be pursued not only by installing primary sources of renewable energy, as at La Silla, but also by realising connections to the Chilean interconnected power systems, where non-conventional renewable energy sources are going to constitute an ever-growing share of the power and energy mixes.

    The installation of a solar farm at La Silla is one of a series of initiatives ESO is taking to tackle the environmental impacts of its operations, as can be viewed here. Green energy is strongly supported by the Chilean government, which aims to increase the Chilean green energy share to 25% by 2020, with a possible target of 30% by 2030.

    See the full article, with note, here.

    Visit ESO in Social Media-

    Facebook

    Twitter

    YouTube

    ESO Main

    ESO, European Southern Observatory, builds and operates a suite of the world’s most advanced ground-based astronomical telescopes.



     
  • richardmitnick 12:19 pm on July 15, 2014 Permalink | Reply
    Tags: Clean Energy

    From PPPL: “Experts assemble at PPPL to discuss mitigation of tokamak disruptions” 


    PPPL

    July 15, 2014
    John Greenwald

    Some 35 physicists from around the world gathered at PPPL last week for the second annual Laboratory-led workshop on improving ways to predict and mitigate disruptions in tokamaks. Avoiding or mitigating such disruptions, which occur when heat or electric current is suddenly reduced during fusion experiments, will be crucial for ITER, the international experiment under construction in France to demonstrate the feasibility of fusion power.

    Amitava Bhattacharjee, left, and John Mandrekas, a program manager in the U.S. Department of Energy’s Office of Fusion Energy Sciences. (Photo by Elle Starkman/Princeton Office of Communications)

    Tokamak at PPPL

    Presentations at the three-day session, titled “Theory and Simulation of Disruptions Workshop,” focused on the development of models that can be validated by experiment. “This is a really urgent task for ITER,” said Amitava Bhattacharjee, who heads the PPPL Theory Department and organized the workshop. The United States is responsible for designing disruption-mitigation systems for ITER, he noted, and faces a deadline of 2017.

    Speakers at the workshop included theorists and experimentalists from the ITER Organization, PPPL, General Atomics and several U.S. universities, and from fusion facilities in the United Kingdom, China, Italy and India. Topics ranged from coping with the currents and forces that strike tokamak walls to suppressing runaway electrons that can be unleashed during experiments.

    Bringing together theorists and experimentalists is essential for developing solutions to disruptions, Bhattacharjee said. “I already see that major fusion facilities in the United States, as well as international tokamaks, are embarking on experiments that are ideal validation tools for theory and simulation,” he said. “And it is very important that theory and simulation ideas that can be validated with experimental results are presented and discussed in detail in focused workshops such as this one.”

    See the full article here.

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.



     
  • richardmitnick 4:47 am on July 14, 2014 Permalink | Reply
    Tags: Clean Energy

    From NASA/ESA Hubble: “The oldest cluster in its cloud”

    NASA Hubble Telescope

    14 July 2014
    No Writer Credit

    This image shows NGC 121, a globular cluster in the constellation of Tucana (The Toucan). Globular clusters are big balls of old stars that orbit the centres of their galaxies like satellites — the Milky Way, for example, has around 150.

    Credit: NASA/ESA Hubble. Acknowledgement: Stefano Campani

    NASA Hubble ACS

    NGC 121 belongs to one of our neighbouring galaxies, the Small Magellanic Cloud (SMC). It was discovered in 1835 by English astronomer John Herschel, and in recent years it has been studied in detail by astronomers wishing to learn more about how stars form and evolve.

    Stars do not live forever — they develop differently depending on their original mass. In many clusters, all the stars seem to have formed at the same time, although in others we see distinct populations of stars that are different ages. By studying old stellar populations in globular clusters, astronomers can effectively use them as tracers for the stellar population of their host galaxies. With an object like NGC 121, which lies close to the Milky Way, Hubble is able to resolve individual stars and get a very detailed insight.

    NGC 121 is around 10 billion years old, making it the oldest cluster in its galaxy; all of the SMC’s other globular clusters are 8 billion years old or younger. However, NGC 121 is still several billions of years younger than its counterparts in the Milky Way and in other nearby galaxies like the Large Magellanic Cloud. The reason for this age gap is not completely clear, but it could indicate that cluster formation was initially delayed for some reason in the SMC, or that NGC 121 is the sole survivor of an older group of star clusters.

    This image was taken using Hubble’s Advanced Camera for Surveys (ACS). A version of this image was submitted to the Hubble’s Hidden Treasures image processing competition by contestant Stefano Campani.

    See the full article here.

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.




     
  • richardmitnick 8:25 pm on July 13, 2014 Permalink | Reply
    Tags: Clean Energy

    From the Harvard Clean Energy Project: “Hello everybody, we’ve been overdue…”

    Clean Energy Project

    Hello everybody,

    We’ve been overdue for another progress report on the Clean Energy Project for some time, so here it finally comes. We hope you’ll enjoy this summary of the things that have happened since our last full report in April.

    Let’s start with the news on the CEP team again: Last time we reported that Alan was promoted to Full Professor and Sule got a job as an Assistant Professor in Ankara (Turkey). In the meantime, Johannes also landed an Assistant Professorship in the Department of Chemical and Biological Engineering at the University at Buffalo, The State University of New York, with an affiliation to the New York State Center of Excellence in Materials Informatics.
    http://www.cbe.buffalo.edu/people/full_time/j_hachmann.php

    Johannes will, however, stay involved in the Clean Energy Project and has already recruited students at Buffalo who will strengthen the CEP research efforts. Laszlo is gearing up to go out into the world as well and he will start graduate school next summer.
    http://aspuru.chem.harvard.edu/laszlo-seress/

    To compensate for these losses, Ed Pyzer-Knapp from the Day Group at the University of Cambridge (UK) will join the CEP team in January 2014.
    http://www-day.ch.cam.ac.uk/epk.html

    Prof. Carlos Amador-Bedolla from UNAM in Mexico, who was active in the project a few years ago, has also started to be more active again.
    http://www.quimica.unam.mx/ficha_investigador.php?ID=77&tipo=2

    Continuity is always a big concern in a large-scale project such as the CEP, but we hope that we’ll manage the transition without too much trouble. Having the additional project branch in Buffalo will hopefully put our work on a broader foundation in the long run.

    Our work in the CEP was again recognized, e.g., by winning the 2013 Computerworld Data+ Award and, for Johannes, the RSC Scholarship Award for Scientific Excellence of the ACS Division of Chemical Information. CEP work has been presented at many conferences, webcasts, seminars, and talks over the last half year. It is by now a fairly well known effort in the materials science community and it has taken its place amongst the other big virtual screening projects such as the Materials Project, the Computational Materials Repository, and AFLOWLIB.

    Now to the progress on the research front: After a number of the other WCG projects concluded, the CEP has seen a dramatic increase in computing time and returned results since the spring. These days we average between 24 and 28 years of CPU time per day (an increase of about 50% over our previous averages), and we have passed the mark of 21,000 years of harvested CPU time. By now we have performed over 200 million density functional theory calculations on over 3 million compounds, accumulating well over half a petabyte of data. We are currently in the process of expanding our storage capacity towards the 1PB mark by building Jabba 7 and 8. Thanks again to HGST for their generous sponsorship and Harvard FAS Research Computing for their support.
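
    Those throughput figures are easier to grasp when unpacked. The short sketch below just divides out the quoted totals; the derived values (CPU-hours per day, calculations per compound, data per calculation) are rough estimates of ours, not official project statistics:

```python
# Unpacking the quoted CEP numbers; the derived values are rough, not official statistics.
cpu_years_per_day = 26          # midpoint of the quoted 24-28 years of CPU time per day
calculations = 200e6            # density functional theory calculations performed
compounds = 3e6                 # candidate compounds screened
data_petabytes = 0.5            # accumulated data

cpu_hours_per_day = cpu_years_per_day * 365 * 24
calcs_per_compound = calculations / compounds
mb_per_calc = data_petabytes * 1e15 / calculations / 1e6

print(f"donated compute:       ~{cpu_hours_per_day:,.0f} CPU-hours per day")
print(f"calculations/compound: ~{calcs_per_compound:.0f}")
print(f"data per calculation:  ~{mb_per_calc:.1f} MB")
```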

    Over the summer we finally released the CEPDB on our new platform http://www.molecularspace.org. The launch made quite a splash. We received a lot of positive feedback in the news and from the community, and it was also nicely synchronized with the two-year anniversary of the Materials Genome Initiative. We used the CEPDB release to also launch our new project webpage.

    Our latest research results and data analysis were published in “Energy and Environmental Science”, and you can read all the details in this paper.

    There is still a lot of exciting research waiting to be done, and we are looking forward to tackling all this work together with you. Thanks so much for all your generous support, hard work, and enthusiasm – you guys and gals are the best! CEP would not be possible without you. CRUNCH ON!

    Best wishes from

    Your Harvard Clean Energy Project team

    The Harvard Clean Energy Project Database contains data and analyses on 2.3 million candidate compounds for organic photovoltaics. It is an open resource designed to give researchers in the field of organic electronics access to promising leads for new material developments.

    Would you like to help find new compounds for organic solar cells? By participating in the Harvard Clean Energy Project you can donate idle computer time on your PC for the discovery and design of new materials. Visit WorldCommunityGrid to get the BOINC software on which the project runs.




     