Tagged: Clean Energy

  • richardmitnick 7:24 am on February 17, 2015 Permalink | Reply
    Tags: Clean Energy

    From physicsworld: “Smaller fusion reactors could deliver big gains” 

    physicsworld.com

    Feb 16, 2015
    Michael Banks

    Hot topic: size may not be everything in tokamak design

    Researchers from the UK firm Tokamak Energy say that future fusion reactors could be made much smaller than previously envisaged – yet still deliver the same energy output. That claim is based on calculations showing that the fusion power gain – a measure of the ratio of the power from a fusion reactor to the power required to maintain the plasma in steady state – does not depend strongly on the size of the reactor. The company’s finding goes against conventional thinking, which says that a large power output is only possible by building bigger fusion reactors.

    The largest fusion reactor currently under construction is the €16bn ITER facility in Cadarache, France.

    ITER Tokamak

    This will weigh about 23,000 tonnes when completed in the coming decade and consist of a deuterium–tritium plasma held in a 60 m-tall, doughnut-shaped “tokamak”. ITER aims to produce a fusion power gain (Q) of 10, meaning that, in theory, the reactor will emit 10 times the power it expends by producing 500 MW from 50 MW of input power. While ITER has a “major” plasma radius of 6.21 m, it is thought that an actual future fusion power plant delivering power to the grid would need a 9 m radius to generate 1 GW.
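    The gain figure is simple arithmetic; a quick sketch (the 500 MW and 50 MW values are ITER's stated targets from the article):

```python
def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
    """Fusion power gain Q: fusion power out divided by heating power in."""
    return p_fusion_mw / p_heating_mw

# ITER's design target: 500 MW of fusion power from 50 MW of input heating
print(fusion_gain(500, 50))  # 10.0
```

    The same arithmetic gives the compact reactor proposed below (180 MW at Q = 5) an implied input power of 36 MW.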

    Low power brings high performance

    The new study, led by Alan Costley from Tokamak Energy, which builds compact tokamaks, shows that smaller, lower-power, and therefore lower-cost reactors could still deliver a value of Q similar to ITER. The work focused on a key parameter in determining plasma performance called the plasma “beta”, which is the ratio of the plasma pressure to the magnetic pressure. By using scaling expressions consistent with existing experiments, the researchers show that the power needed for high fusion performance can be three or four times lower than previously thought.
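    Plasma beta has a standard definition, which the sketch below implements with illustrative numbers, not values taken from the Tokamak Energy paper:

```python
from math import pi

MU0 = 4e-7 * pi  # vacuum permeability, T*m/A

def plasma_beta(pressure_pa: float, b_field_t: float) -> float:
    """Plasma beta: plasma pressure divided by magnetic pressure B^2/(2*mu0)."""
    return pressure_pa / (b_field_t**2 / (2 * MU0))

# Illustrative only: 1e5 Pa of plasma pressure confined by a 5 T field
print(plasma_beta(1.0e5, 5.0))  # ~0.01, i.e. a beta of about 1%
```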

    Combined with the finding on the size-dependence of Q, these results imply the possibility of building lower-power, smaller and cheaper pilot plants and reactors. “The consequence of beta-independent scaling is that tokamaks could be much smaller, but still have a high power gain,” David Kingham, Tokamak Energy chief executive, told Physics World.

    The researchers propose that a reactor with a radius of just 1.35 m would be able to generate 180 MW, with a Q of 5. This would result in a reactor just 1/20th of the size of ITER. “Although there are still engineering challenges to overcome, this result is underpinned by good science,” says Kingham. “We hope that this work will attract further investment in fusion energy.”

    Many challenges remain

    Howard Wilson, director of the York Plasma Institute at the University of York in the UK, points out, however, that the result relies on being able to achieve a very high magnetic field. “We have long been aware that a high magnetic field enables compact fusion devices – the breakthrough would be in discovering how to create such high magnetic fields in the tokamak,” he says. “A compact fusion device may indeed be possible, provided one can achieve high confinement of the fuel, demonstrate efficient current drive in the plasma, exhaust the heat and particles effectively without damaging material surfaces, and create the necessary high magnetic fields.”

    The work by Tokamak Energy follows an announcement late last year that the US firm Lockheed Martin plans to build a “truck-sized” compact fusion reactor by 2019 that would be capable of delivering 100 MW. However, the latest results from Tokamak Energy might not be such bad news for ITER. Kingham adds that his firm’s work means that, in principle, ITER is actually being built much larger than necessary – and so should outperform its Q target of 10.

    The research is published in Nuclear Fusion.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Physics World is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

     
  • richardmitnick 1:17 pm on February 6, 2015 Permalink | Reply
    Tags: Clean Energy, Wave and tidal energy

    From U Washington: “New tool monitors effects of tidal, wave energy on marine habitat” 

    University of Washington

    February 5, 2015
    Michelle Ma, News and Information

    Researchers building a new underwater robot they’ve dubbed the “Millennium Falcon” certainly have reason to believe it will live up to its name.

    From left to right: UW researchers Ben Rush, Nick Michel-Hart, James Joslin and Paul Gibbs prepare to test the monitoring device underwater in a tank on campus.
    Applied Physics Laboratory, UW

    The robot will deploy instruments to gather information in unprecedented detail about how marine life interacts with underwater equipment used to harvest wave and tidal energy. Researchers still don’t fully understand how animals and fish will be affected by ocean energy equipment, and this instrument seeks to identify risks that could come into play in a long-term marine renewable energy project.

    “This is the first attempt at a ‘plug-and-socket’ instrumentation package in the marine energy field. If successful, it will change the way that industry views the viability of environmental research and development,” said Brian Polagye, a University of Washington assistant professor of mechanical engineering and one of the project’s leaders.

    The Millennium Falcon robot maneuvers underwater in a testing tank on campus. The monitoring instruments (white box in the middle) are guided by the robot’s thrusters toward a docking station on the bottom of the tank. Researchers controlled the machine from above.
    Applied Physics Laboratory, UW

    The UW research team tested the Millennium Falcon and the instruments it transports, called the Adaptable Monitoring Package, underwater for the first time in January in a deep tank on campus. Researchers will continue testing in Puget Sound under more challenging conditions starting this month. They hope this tool will be useful for pilot tidal- and wave-energy projects and eventually in large-scale, commercial renewable-energy projects.

    “We’ve really become leaders in this space, leveraging UW expertise with cabled instrumentation packages like those developed for the Ocean Observatories Initiative. What’s novel here is the serviceability of the system and our ability to rapidly deploy and recover the instruments at low cost,” said Andrew Stewart, an ocean engineer at the UW Applied Physics Laboratory.

    The instrument package can track and measure a number of sights and sounds underwater. It has a stereo camera to collect photos and video, a sonar system, hydrophones to hear marine mammal activity, sensors to gauge water quality and speed, a click detector to listen for whales, dolphins and porpoises, and even a device to detect fish tags. A fiber optic cable connection back to shore allows for real-time monitoring and control, and the device will be powered by a copper wire.

    The breadth of sensors and various conditions this instrument can measure is unprecedented, researchers say. The tool also is unique for its ability to attach to most types of underwater infrastructure, ranging from tidal turbines to offshore oil and gas rigs. This allows researchers to easily deploy the instrument far offshore and recover it quickly at a relatively low cost compared with other approaches.

    “It could be a first step toward a standardized ‘science port’ for marine energy projects,” Polagye said.

    This speedy deployment and recovery — sometimes in rough seas — is possible because the instrument fits inside a remotely operated vehicle, or ROV, that can maneuver underwater and drop off the instrumentation package at a docking station integrated onto a turbine or other existing subsea infrastructure.

    The monitoring instruments are housed inside the white box in the middle. The Millennium Falcon ROV is positioned just over and under the white box. Researchers tested the device’s ability to fasten onto a docking station underwater, seen in the foreground.
    Applied Physics Laboratory, UW

    The vehicle is about the size of a golf cart, and the research team outfitted the off-the-shelf Falcon underwater surveying machine with five extra thrusters on an external frame to give it more power to move against strong currents. Actuators on the vehicle latch the monitoring instruments onto a subsea docking station, and then the Millennium Falcon can disengage, leaving the instruments in place, and travel back to the water’s surface.

    The shape of the monitoring package resembles an X-wing Starfighter from the original “Star Wars” trilogy. (The researchers are mum on whether their Millennium Falcon can make the Kessel Run in less than 12 parsecs.)

    This project is a collaboration between researchers in mechanical engineering and the Applied Physics Laboratory, within the larger Northwest National Marine Renewable Energy Center, which is a multi-institution organization that develops marine renewable energy technologies through research, education and outreach. The center and the Applied Physics Laboratory recently received $8 million from the U.S. Navy to develop marine renewable energy for use at its facilities worldwide.

    Development of this environmental monitoring instrument was prompted by a long-running tidal energy pilot project with the Snohomish County Public Utility District in Admiralty Inlet that recently was dropped because of ballooning costs. Going forward, researchers expect to use the same device to monitor marine-energy projects cropping up around the world and help to reduce the cost of future developments.

    “Snohomish PUD was really at the forefront of projects grappling with this problem of monitoring a tidal turbine in deep, fast moving water. But as other projects in the U.S., Europe and Canada have faced similar monitoring scenarios, the instrumentation package is shaping up as a strong candidate to meet their needs,” Polagye said.

    Other lead researchers are UW mechanical engineering graduate students James Joslin and Emma Cotter.

    The project is funded by the U.S. Department of Energy, the U.S. Naval Facilities Engineering Command, the Snohomish County Public Utility District and the UW.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world, every day.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 8:20 am on January 30, 2015 Permalink | Reply
    Tags: , Clean Energy, , , Solar Fuels   

    From Science 2.0: “Calculating The Future Of Solar-fuel Refineries” 


    Science 2.0

    January 30th 2015
    News Staff

    The process of converting the sun’s energy into liquid fuels requires a sophisticated, interrelated series of choices, but a solar refinery is especially tricky to map out because the designs involve newly developed or experimental technologies. This makes it difficult to develop realistic plans that are economically viable and energy efficient.

    In a paper recently published in the journal Energy & Environmental Science, a team led by University of Wisconsin-Madison chemical and biological engineering Professors Christos Maravelias and George Huber outlined a tool to help engineers better gauge the overall yield, efficiency and costs associated with scaling solar-fuel production processes up into large-scale refineries.


    That’s where the new UW-Madison tool comes in. It’s a framework that focuses on accounting for general variables and big-picture milestones associated with scaling up energy technologies to the refinery level. This means it’s specifically designed to remain relevant even as solar-fuel producers and researchers experiment with new technologies and ideas for technologies that don’t yet exist.

    Renewable-energy researchers at UW-Madison have long emphasized the importance of considering energy production as a holistic process, and Maravelias says the new framework could be used by a wide range of solar energy stakeholders, from basic science researchers to business decision-makers. The tool could also play a role in wider debates about which renewable-energy technologies are most appropriate for society to pursue on a large scale.

    “The nice thing about it being general is that if a researcher develops a different technology – and there are many different ways to generate solar fuels – our framework would still be applicable, and if someone wants a little more detail, our framework can be adjusted accordingly,” Maravelias says.

    In addition to bringing clarity to the solar refinery conversation, the framework could also be adapted to help analyze and plan any number of other energy-related processes, says Jeff Herron, a postdoc in Maravelias’ group and the paper’s lead author.

    “People tend to be narrowly focused on their particular role within a bigger picture,” Herron says. “I think bringing all that together is unique to our work, and I think that’s going to be one of the biggest impacts.”

    Ph.D. student Aniruddha Upadhye and postdoc Jiyong Kim also contributed to the project. The research was funded by the U.S. Department of Energy.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 3:17 pm on January 23, 2015 Permalink | Reply
    Tags: BNL, Clean Energy

    From BNL: “Self-Assembled Nanotextures Create Antireflective Surface on Silicon Solar Cells” 

    Brookhaven Lab

    January 21, 2015
    Karen McNulty Walsh, (631) 344-8350 or Peter Genzer, (631) 344-3174

    Nanostructured surface textures—with shapes inspired by the structure of moths’ eyes—prevent the reflection of light off silicon, improving conversion of sunlight to electricity

    Chuck Black of the Center for Functional Nanomaterials displays a nanotextured square of silicon on top of an ordinary silicon wafer. The nanotextured surface is completely antireflective and could boost the production of solar energy from silicon solar cells.

    Reducing the amount of sunlight that bounces off the surface of solar cells helps maximize the conversion of the sun’s rays to electricity, so manufacturers use coatings to cut down on reflections. Now scientists at the U.S. Department of Energy’s Brookhaven National Laboratory show that etching a nanoscale texture onto the silicon material itself creates an antireflective surface that works as well as state-of-the-art thin-film multilayer coatings.

    The surface nanotexture … drastically cut down on reflection of many wavelengths of light simultaneously.

    Their method, described in the journal Nature Communications and submitted for patent protection, has potential for streamlining silicon solar cell production and reducing manufacturing costs. The approach may find additional applications in reducing glare from windows, providing radar camouflage for military equipment, and increasing the brightness of light-emitting diodes.

    “For antireflection applications, the idea is to prevent light or radio waves from bouncing at interfaces between materials,” said physicist Charles Black, who led the research at Brookhaven Lab’s Center for Functional Nanomaterials (CFN), a DOE Office of Science User Facility.

    Preventing reflections requires controlling an abrupt change in “refractive index,” a property that affects how waves such as light propagate through a material. This occurs at the interface where two materials with very different refractive indices meet, for example at the interface between air and silicon. Adding a coating with an intermediate refractive index at the interface eases the transition between materials and reduces the reflection, Black explained.

    “The issue with using such coatings for solar cells,” he said, “is that we’d prefer to fully capture every color of the light spectrum within the device, and we’d like to capture the light irrespective of the direction it comes from. But each color of light couples best with a different antireflection coating, and each coating is optimized for light coming from a particular direction. So you deal with these issues by using multiple antireflection layers. We were interested in looking for a better way.”
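    The benefit of an intermediate index can be seen from the normal-incidence Fresnel formula. The sketch below ignores thin-film interference (which real quarter-wave coatings exploit to cancel reflection outright), and the silicon index is a rough illustrative value:

```python
def fresnel_reflectance(n1: float, n2: float) -> float:
    """Normal-incidence reflectance at the interface between indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air, n_si = 1.0, 3.9  # silicon's index in the visible (approximate)
print(fresnel_reflectance(n_air, n_si))  # ~0.35: bare silicon reflects strongly

# An intermediate layer with n = sqrt(n_air * n_si) splits one big index step
# into two smaller ones, reducing the summed (incoherent) reflection loss
n_coat = (n_air * n_si) ** 0.5
print(fresnel_reflectance(n_air, n_coat) + fresnel_reflectance(n_coat, n_si))
```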

    For inspiration, the scientists turned to a well-known example of an antireflective surface in nature, the eyes of common moths. The surfaces of their compound eyes have textured patterns made of many tiny “posts,” each smaller than the wavelengths of light. This textured surface improves moths’ nighttime vision, and also prevents the “deer in the headlights” reflecting glow that might allow predators to detect them.

    “We set out to recreate moth eye patterns in silicon at even smaller sizes using methods of nanotechnology,” said Atikur Rahman, a postdoctoral fellow working with Black at the CFN and first author of the study.

    A closeup shows how the nanotextured square of silicon completely blocks reflection compared with the surrounding silicon wafer.

    The scientists started by coating the top surface of a silicon solar cell with a polymer material called a “block copolymer,” which can be made to self-organize into an ordered surface pattern with dimensions measuring only tens of nanometers. The self-assembled pattern served as a template for forming posts in the solar cell like those in the moth eye using a plasma of reactive gases—a technique commonly used in the manufacture of semiconductor electronic circuits.

    The resulting surface nanotexture served to gradually change the refractive index to drastically cut down on reflection of many wavelengths of light simultaneously, regardless of the direction of light impinging on the solar cell.

    “Adding these nanotextures turned the normally shiny silicon surface absolutely black,” Rahman said.

    Solar cells textured in this way outperform those coated with a single antireflective film by about 20 percent, and couple light into the device as effectively as the best multilayer coatings used in the industry.

    “We are working to understand whether there are economic advantages to assembling silicon solar cells using our method, compared to other, established processes in the industry,” Black said.

    Hidden layer explains better-than-expected performance

    One intriguing aspect of the study was that the scientists achieved the antireflective performance by creating nanoposts only half as tall as the required height predicted by a mathematical model describing the effect. So they called upon the expertise of colleagues at the CFN and other Brookhaven scientists to help sort out the mystery.

    Details of the nanotextured antireflective surface as revealed by a scanning electron microscope at the Center for Functional Nanomaterials. The tiny posts, each smaller than the wavelengths of light, are reminiscent of the structure of moths’ eyes, an example of an antireflective surface found in nature.

    “This is a powerful advantage of doing research at the CFN—both for us and for academic and industrial researchers coming to use our facilities,” Black said. “We have all these experts around who can help you solve your problems.”

    Using a combination of computational modeling, electron microscopy, and surface science, the team deduced that a thin layer of silicon oxide, similar to what typically forms when silicon is exposed to air, seemed to be having an outsized effect.

    “On a flat surface, this layer is so thin that its effect is minimal,” explained Matt Eisaman of Brookhaven’s Sustainable Energy Technologies Department and a professor at Stony Brook University. “But on the nanopatterned surface, with the thin oxide layer surrounding all sides of the nanotexture, the oxide can have a larger effect because it makes up a significant portion of the nanotextured material.”

    Said Black, “This ‘hidden’ layer was the key to the extra boost in performance.”

    The scientists are now interested in developing their self-assembly based method of nanotexture patterning for other materials, including glass and plastic, for antiglare windows and coatings for solar panels.

    This research was supported by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world.

    Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 8:36 pm on December 8, 2014 Permalink | Reply
    Tags: Clean Energy

    From PPPL: “Monumental effort: How a dedicated team completed a massive beam-box relocation for the NSTX upgrade” 


    PPPL

    December 8, 2014
    By John Greenwald

    Your task: Take apart, decontaminate, refurbish, relocate, reassemble, realign and reinstall a 75-ton neutral beam box that will add a second beam box to the National Spherical Torus Experiment-Upgrade (NSTX-U) and double the experiment’s heating power. Oh, and while you’re at it, hoist the two-story tall box over a 22-foot wall.

    Members of the “Beam Team” faced those challenges when moving the huge box from the Tokamak Fusion Test Reactor (TFTR) cell to the NSTX-U cell. The task required all the savvy of the PPPL engineers and technicians who make up the veteran team. “They’re a tight-knit group that really knows what they’re doing,” said Mike Williams, director of engineering and infrastructure and associate director of PPPL and a former member of the team himself.

    The second box is one of the two major components of the upgrade that will make NSTX-U the most powerful spherical tokamak fusion facility in the world when construction is completed early next year. The new center stack that serves as the other component will double the strength and duration of the magnetic field that controls the plasma that fuels fusion reactions.

    The two new components will work together hand-in-glove. The stronger magnetic field will increase the confinement time for the plasma while the second beam box performs double-duty. Its beams will raise the temperature of the plasma and will help to maintain a current in the plasma to demonstrate that future tokamaks can operate in a continuous condition known as a “steady state.” The second box is “an absolutely crucial part of the upgrade,” said Masayuki Ono, project director for the NSTX-U.

    PPPL Tokamak

    Work began in 2009

    Work on the second beam box began in 2009 when technicians clad in protective clothing dismantled and decontaminated the box as it sat in the TFTR test cell. While the box had used radioactive tritium to heat the plasma in TFTR, no tritium will be used in NSTX-U experiments.

    The decontamination took huge effort, said Tim Stevenson, who led the beam box project. Workers wearing protective garb used cloths, Windex and sprayers with deionized water to clean every part of the box by hand, and went over each part as many as 50 separate times. The cloths were then packaged and shipped to a Utah radiation-waste disposal site.

    Next came the task of moving the beam box and its cleaned and refurbished components out of the TFTR area and into the NSTX-U test cell next door. But how do you get something so massive to budge?

    The Beam Team solved the problem with air casters, said Ron Strykowsky, who heads the NSTX-U upgrade program. Using a ceiling crane, workers lifted the box onto the casters, which floated the load on a cushion of air just above the floor, enabling forklifts to tow it. Technicians then removed some hardware from the large doorway between the two test cells so the beam box could get through.

    The doorway led to a section of the NSTX-U area that is separated from the vacuum vessel by a 22-foot shield wall — a barrier too high for the box and its lid to clear when suspended by sling from a crane. Workers surmounted the problem by first lifting the box and then the lid, which had been removed during the decontamination process. The parts cleared the wall and sailed over the vacuum vessel before coming to rest on the test cell floor. The vessel itself was wrapped in plastic to prevent contamination from any tritium that might still be in the box and the lid as they swung by overhead.

    “Like rebuilding a ship in a bottle”

    The beam box was now ready to be reassembled and reinstalled. But carving out room for all the parts and equipment, including power supplies, cables, and cooling water pipes, proved difficult. “There were so many conflicting demands for space that it was like rebuilding a ship in a bottle,” Stevenson said, citing a remark originally made by engineer Larry Dudek, who heads the center stack upgrade project. “There was no existing footprint,” Stevenson said. “We had to make our own footprint.”

    Technicians needed to cut a port into the vacuum vessel for the beam to pass through. But the supplier-built unit that connected the box to the vessel left too much space between the unit and this new port, requiring the Welding Shop to fill in the gap. “The Welding Shop saved the port,” Stevenson said.

    Still another challenge called for ensuring that the beam would enter the plasma at precisely the angle that NSTX-U specifications required. Complicating this task was the test cell’s uneven floor, which meant that the position of the box also had to be adjusted. To align the beam, engineers used measurements to derive a bull’s-eye on the inside of the vessel; technicians then used laser technology to zero in on the target. The joint effort aligned the beam to within 80 thousandths of an inch of the target.

    Installing power supplies

    Left to complete was installation of power supplies, a task accomplished earlier this year. The job called for bringing three orange high-voltage enclosures — the source of the power — up from a basement area and into the test cell through a hatch in the floor. Taken together, the two NSTX-U beam boxes will have the capacity to put up to 18 megawatts of power into the plasma, enough to briefly light some 20,000 homes.
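    The homes comparison is the article's; the arithmetic below is just a sanity check on the implied per-home draw:

```python
beam_power_w = 18e6  # combined capacity of the two NSTX-U beam boxes, watts
homes = 20_000
print(beam_power_w / homes)  # 900.0 W per home, a plausible brief average draw
```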

    When asked to name the greatest challenge the project encountered, Stevenson replied, “The whole thing was fraught with challenges and difficulties. It was a monumental team effort that took a great deal of preparation. And when it was show-time, everyone showed up.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.

     
  • richardmitnick 6:01 pm on November 26, 2014 Permalink | Reply
    Tags: Clean Energy

    From Caltech: “New Technique Could Harvest More of the Sun’s Energy” 

    Caltech

    11/26/2014
    Jessica Stoller-Conrad

    As solar panels become less expensive and capable of generating more power, solar energy is becoming a more commercially viable alternative source of electricity. However, the photovoltaic cells now used to turn sunlight into electricity can only absorb and use a small fraction of that light, and that means a significant amount of solar energy goes untapped.

    A new technology created by researchers from Caltech, and described in a paper published online in the October 30 issue of Science Express, represents a first step toward harnessing that lost energy.

    An ultra-sensitive needle measures the voltage that is generated while the nanospheres are illuminated.
    Credit: AMOLF/Tremani – Figure: Artist impression of the plasmo-electric effect.

    Sunlight is composed of many wavelengths of light. In a traditional solar panel, silicon atoms are struck by sunlight and the atoms’ outermost electrons absorb energy from some of these wavelengths of sunlight, causing the electrons to get excited. Once the excited electrons absorb enough energy to jump free from the silicon atoms, they can flow independently through the material to produce electricity. This is called the photovoltaic effect—a phenomenon that takes place in a solar panel’s photovoltaic cells.

    Although silicon-based photovoltaic cells can absorb light wavelengths that fall in the visible spectrum—light that is visible to the human eye—longer wavelengths such as infrared light pass right through the silicon and never get converted to electricity; in the case of infrared, that energy is normally lost as unwanted heat.
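    The cutoff follows from the photon-energy relation E = hc/λ: silicon's bandgap of about 1.12 eV puts the absorption edge near 1100 nm. A quick check using standard physical constants:

```python
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

def cutoff_wavelength_nm(bandgap_ev: float) -> float:
    """Longest wavelength a semiconductor can absorb: lambda = h*c / E_gap."""
    return H * C / (bandgap_ev * EV) * 1e9

# Silicon's bandgap is ~1.12 eV; longer (infrared) wavelengths pass through
print(cutoff_wavelength_nm(1.12))  # ~1107 nm
```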

    “The silicon absorbs only a certain fraction of the spectrum, and it’s transparent to the rest. If I put a photovoltaic module on my roof, the silicon absorbs that portion of the spectrum, and some of that light gets converted into power. But the rest of it ends up just heating up my roof,” says Harry A. Atwater, the Howard Hughes Professor of Applied Physics and Materials Science; director, Resnick Sustainability Institute, who led the study.

    Now, Atwater and his colleagues have found a way to absorb and make use of these infrared waves with a structure composed not of silicon, but entirely of metal.

    The new technique they’ve developed is based on a phenomenon observed in metallic structures known as plasmon resonance. Plasmons are coordinated waves, or ripples, of electrons that exist on the surfaces of metals at the point where the metal meets the air.

    While the plasmon resonances of metals are fixed by the materials’ intrinsic properties, Atwater and his colleagues found that those resonances can be tuned to other wavelengths when the metals are shaped into tiny nanostructures in the lab.

    “Normally in a metal like silver or copper or gold, the density of electrons in that metal is fixed; it’s just a property of the material,” Atwater says. “But in the lab, I can add electrons to the atoms of metal nanostructures and charge them up. And when I do that, the resonance frequency will change.”

    “We’ve demonstrated that these resonantly excited metal surfaces can produce a potential”—an effect very similar to rubbing a glass rod with a piece of fur: you deposit electrons on the glass rod. “You charge it up, or build up an electrostatic charge that can be discharged as a mild shock,” he says. “So similarly, exciting these metal nanostructures near their resonance charges up those metal structures, producing an electrostatic potential that you can measure.”

    This electrostatic potential is a first step in the creation of electricity, Atwater says. “If we can develop a way to produce a steady-state current, this could potentially be a power source.” He envisions a solar cell using the plasmoelectric effect someday being used in tandem with photovoltaic cells to harness both visible and infrared light for the creation of electricity.
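    To get a feel for the scale of such a potential, one can model a charged nanostructure as a tiny conducting sphere. The radius and electron count below are invented for illustration and are not values from the study:

```python
# Order-of-magnitude estimate of the electrostatic potential of a charged
# metal nanostructure, modeled as a conducting sphere: V = q / (4*pi*eps0*r).
# The radius and electron count are illustrative, not measured values.
import math

eps0 = 8.854e-12   # vacuum permittivity, F/m
e = 1.602e-19      # elementary charge, C

r = 10e-9          # 10 nm sphere radius (illustrative)
n_electrons = 10   # extra electrons deposited (illustrative)

V = n_electrons * e / (4 * math.pi * eps0 * r)
print(f"surface potential: {V:.2f} V")
```

    Even a handful of extra electrons on a 10 nm particle produces a potential on the order of a volt, comfortably within reach of a sensitive voltage probe.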

    Although such solar cells are still on the horizon, the new technique could even now be incorporated into new types of sensors that detect light based on the electrostatic potential.

    “Like all such inventions or discoveries, the path of this technology is unpredictable,” Atwater says. “But any time you can demonstrate a new effect to create a sensor for light, that finding has almost always yielded some kind of new product.”

    This work was published in a paper titled “Plasmoelectric Potentials in Metal Nanostructures.” Other coauthors include first author Matthew T. Sheldon, a former postdoctoral scholar at Caltech; Ana M. Brown, an applied physics graduate student at Caltech; and Jorik van de Groep and Albert Polman of the FOM Institute AMOLF in Amsterdam. The study was funded by the Department of Energy, the Netherlands Organization for Scientific Research, and an NSF Graduate Research Fellowship.

    See the full article here.

    Please help promote STEM in your local schools.


    STEM Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

     
  • richardmitnick, 2:29 pm on November 11, 2014
    Tags: Clean Energy

    From PPPL: “PPPL researchers present cutting edge results at APS Plasma Physics Conference” 



    November 10, 2014
    Kitta MacPherson
    Email: kittamac@pppl.gov
    Phone: 609-243-2755

    Some 135 researchers, graduate students, and staff members from PPPL joined 1,500 research scientists from around the world at the 56th annual meeting of the American Physical Society’s Division of Plasma Physics, held Oct. 27 to Oct. 31 in New Orleans. Topics in the sessions ranged from waves in plasma to the physics of ITER, the international fusion experiment under construction in Cadarache, France, to women in plasma physics. Dozens of PPPL scientists presented the results of their cutting-edge research into magnetic fusion and plasma science. There were about 100 invited speakers at the conference, more than a dozen of whom were from PPPL.

    Conceptual image of the solar wind from the sun encountering the Earth’s magnetosphere. No image credit

    The items in this issue are condensed versions of press releases prepared by the APS with the assistance of the scientists quoted, with background material written by John Greenwald and Jeanne Jackson DeVoe. The full text is available at the APS Virtual Pressroom 2014: http://www.aps.org/units/dpp/meetings/vpr/2014/index.cfm.

    How magnetic reconnection goes “Boom!”

    MRX research reveals how magnetic energy turns into explosive particle energy

    Paper by: M. Yamada, J. Yoo

    Magnetic reconnection, in which the magnetic field lines in plasma snap apart and violently reconnect, creates massive eruptions of plasma from the sun. But how reconnection transforms magnetic energy into explosive particle energy has been a major mystery.

    Now research conducted on the Magnetic Reconnection Experiment (MRX) at PPPL has taken a key step toward identifying how the transformation takes place, and toward measuring experimentally the amount of magnetic energy that turns into particle energy. The investigation showed that reconnection in a prototypical reconnection layer converts about 50 percent of the magnetic energy, with one-third of the converted energy heating the electrons and two-thirds accelerating the ions in the plasma.
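    The reported partition is easy to tabulate. A minimal sketch (the 100-unit energy budget is an arbitrary illustrative number, not a figure from the experiment):

```python
# Partition of magnetic energy in a prototypical reconnection layer,
# using the fractions reported from MRX: ~50% of the magnetic energy
# is converted to particle energy, split 1/3 to electrons, 2/3 to ions.
W_magnetic = 100.0                 # illustrative energy budget (arbitrary units)

W_converted = 0.5 * W_magnetic     # ~50% converted to particle energy
W_electrons = W_converted / 3.0    # one-third heats electrons
W_ions = 2.0 * W_converted / 3.0   # two-thirds accelerates ions

print(f"converted: {W_converted}, electrons: {W_electrons:.1f}, ions: {W_ions:.1f}")
```
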

    “This is a major milestone for our research,” said Masaaki Yamada, the principal investigator for the MRX. “We can now see the entire picture of how much of the energy goes to the electrons and how much to the ions in a prototypical reconnection layer.”

    What a Difference a Magnetic Field Makes

    Experiments on MRX confirm the lack of symmetry in converging space plasmas

    Paper by: J. Yoo

    Spacecraft observing magnetic reconnection have noted a fundamental gap between most theoretical studies of the phenomenon and what happens in space. While the studies assume that the converging plasmas share symmetrical characteristics such as temperature, density, and magnetic field strength, observations have shown that this is hardly the case.

    PPPL researchers have now found the disparity in plasma density in experiments conducted on the MRX. The work, done in collaboration with the Space Science Center at the University of New Hampshire, marks the first laboratory confirmation of the disparity and deepens understanding of the mechanisms involved.

    Data from the MRX findings could help to inform a four-satellite mission—the Magnetospheric Multiscale Mission, or MMS—that NASA plans to launch next year to study reconnection in the magnetosphere. The probes could produce a better understanding of geomagnetic storms and lead to advanced warning of the disturbances and an improved ability to cope with them.

    Using radio waves to control density in fusion plasma

    Experiments show how heating electrons in the center of hot fusion plasma can increase turbulence, reducing the density in the inner core

    Paper by: D. Ernst, K. Burrell, W. Guttenfelder, T. Rhodes, A. Dimits

    Recent fusion experiments on the DIII-D tokamak at General Atomics in San Diego and the Alcator C-Mod tokamak at MIT show that beaming microwaves into the center of the plasma can control the density there. The experiments and analysis were conducted by a team of researchers as part of a National Fusion Science Campaign.

    The new experiments reveal that turbulent density fluctuations in the inner core intensify when most of the heat goes to electrons instead of plasma ions, as would happen in the center of a self-sustaining fusion reaction. Supercomputer simulations closely reproduce the experiments, showing that the electrons become more turbulent as they are more strongly heated, and this transports both particles and heat out of the plasma.

    “As we approached conditions where mainly the electrons are heated, pure trapped electrons begin to dominate,” said Walter Guttenfelder, who performed the supercomputer simulations for the DIII-D experiments along with Andris Dimits of Lawrence Livermore National Laboratory. Guttenfelder co-led the experiments and simulations with Keith Burrell of General Atomics and Terry Rhodes of UCLA. Darin Ernst of MIT led the overall research.

    Calming the Plasma Edge: The Tail that Wags the Dog

    Lithium injections show promise for optimizing the performance of fusion plasmas

    Paper by: G.L. Jackson, R. Maingi, T. Osborne, Z. Yan, D. Mansfield, S.L. Allen

    Experiments on the DIII-D tokamak fusion reactor that General Atomics operates for the U.S. Department of Energy have demonstrated the ability of lithium injections to transiently double the temperature and pressure at the edge of the plasma and delay the onset of instabilities and other transients. Researchers conducted the experiments using a lithium-injection device developed at PPPL.

    Lithium can play an important role in controlling the edge region and hence the evolution of the entire plasma. In the present work, lithium diminished the frequency of instabilities known as “edge localized modes” (ELMs), which have associated heat pulses that can damage the section of the vessel wall used to exhaust heat in fusion devices.

    The tailored injections produced ELM-free periods of up to 0.35 seconds, while reference discharges without lithium showed no ELM-free periods above 0.03 seconds. The lithium rapidly increased the width of the pedestal region—the edge of the plasma where temperature drops off sharply—by up to 100 percent, and raised the electron pressure and total pressure in the edge by up to 100 percent and 60 percent, respectively. Together these effects produced a 60 percent increase in total energy-confinement time.

    Scratching the surface of a material mystery

    Scientists shed new light on how lithium conditions the volatile edge of fusion plasmas

    Paper by: Angela Capece

    For fusion energy to fuel future power plants, scientists must find ways to control the interactions that take place between the volatile edge of fusion plasma and the physical walls that surround it in fusion facilities. Such interactions can profoundly affect conditions at the superhot core of the plasma in ways that include kicking up impurities that cool down the core and halt fusion reactions. Among the puzzles is how temperature affects the ability of lithium to absorb and retain the deuterium particles that stray from the fuel that creates fusion reactions.

    Answers are now emerging from a new surface-science laboratory at PPPL that can probe lithium coatings just three atoms thick. The experiments showed that the ability of ultrathin lithium films to retain deuterium drops as the temperature of the molybdenum substrate rises—a result that provides insight into how lithium affects the performance of tokamaks.

    Experiments further showed that exposing the lithium to oxygen improved deuterium retention at temperatures below about 400 kelvin. Without oxygen exposure, however, the lithium films could still retain deuterium at higher temperatures as a result of lithium-deuterium bonding.

    Putting Plasma to Work Upgrading the U.S. Power Grid

    PPPL lends GE a hand in developing an advanced power-conversion switch

    Paper by: Johan Carlsson, Alex Khrabrov, Igor Kaganovich, Timothy Summerer

    When researchers at General Electric sought help in designing a plasma-based power switch, they turned to PPPL. The proposed switch, which GE is developing under contract with the DOE’s Advanced Research Projects Agency-Energy, could contribute to a more advanced and reliable electric grid and help lower utility bills.

    The switch would consist of a plasma-filled tube that turns current on and off in systems that convert the direct current coming from long-distance power lines to the alternating current that lights homes and businesses; such systems are used to reverse the process as well.

    To assist GE, PPPL used a pair of computer codes to model the properties of plasma under different magnetic configurations and gas pressures. GE also studied PPPL’s use of liquid lithium, which the laboratory employs to prevent damage to the divertor that exhausts heat in a fusion facility. The information could help GE develop a method for protecting the liquid-metal cathode—the negative terminal inside the tube—from damage from the ions carrying the current flowing through the plasma.

    Laser experiments mimic cosmic explosions

    Scientists bring plasma tsunamis into the lab

    Researchers are finding ways to probe some of the mysteries of space without leaving Earth. Using high-intensity lasers at the University of Rochester’s OMEGA EP Facility, focused on targets smaller than a pencil’s eraser, they conducted experiments that create colliding jets of plasma knotted by plasma filaments and self-generated magnetic fields.

    In two related experiments, researchers used powerful lasers to recreate a tiny laboratory version of what happens at the beginning of solar flares and stellar explosions, creating something like a gigantic plasma tsunami in space. Much of what happens in those situations is related to magnetic reconnection, which can accelerate particles to high energies and drives solar flares toward Earth.

    Laboratory experiment aims to identify how tsunamis of plasma called “shock waves” form in space

    By W. Fox, G. Fisksel (LLE), A. Bhattacharjee

    William Fox, a researcher at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory, and his colleague Gennady Fiksel, of the University of Rochester, got an unexpected result when they used lasers to recreate a tiny laboratory version of a gigantic plasma tsunami called a “shock wave.” A shock wave is a thin region at the boundary between a supernova and the colder material around it, where a turbulent magnetic field sweeps plasma up into a steep, tsunami-like wave.

    Fox and Fiksel used two very powerful lasers to zap two tiny pieces of plastic in a vacuum chamber, heating them to 10 million degrees and creating two colliding plumes of extremely hot plasma. The researchers found something they had not anticipated and that had not previously been seen in the laboratory: when the two plasmas merged, they broke into clumps of long, thin filaments through a process called the “Weibel instability.” This instability could be the source of the turbulent magnetic fields that form shock waves in space. The research could shed light on the origin of the primordial magnetic fields that formed when galaxies were created, and could help researchers understand how cosmic rays are accelerated to high energies.

    Magnetic reconnection in the laboratory

    By: G. Fiksel (LLE), W. Fox, A. Bhattacharjee

    Many plasmas in space already contain a strong magnetic field, so colliding plasmas there behave somewhat differently. Gennady Fiksel, of the University of Rochester, and William Fox extended their previous research by adding a magnetic field, generated by pulsing current through very small wires, and then creating two colliding plumes of plasma as before. The collision compressed and stretched the magnetic field, storing a tremendous amount of energy in it like a stretched rubber band. As the field lines were pushed together, they broke apart and reconnected, forming a slingshot that released the stored energy into the plasma, accelerating and heating it.

    The experiment showed that the reconnection process happens faster than theorists had previously predicted. This could help shed light on solar flares and coronal mass ejections, which also happen extremely quickly. Coronal mass ejections can trigger geomagnetic storms that can interfere with satellites and wreak havoc with cellphone service.

    The laser technique is new to high-energy-density plasma research and allows scientists to control and manipulate the magnetic field in various ways.


    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     
  • richardmitnick, 7:10 am on November 11, 2014
    Tags: Clean Energy

    From PPPL: “Hole in one: Technicians smoothly install the center stack in the NSTX-U vacuum vessel” 



    November 10, 2014
    John Greenwald

    With near-surgical precision, PPPL technicians hoisted the 29,000-pound center stack for the National Spherical Torus Experiment-Upgrade (NSTX-U) over a 20-foot wall and lowered it into the vacuum vessel of the fusion facility. The smooth operation on Oct. 24 capped more than two years of construction of the center stack, which houses the bundle of magnetic coils that form the heart of the $94 million upgrade.

    Closeup of the center stack being lowered into position by an overhead crane. (Photo by Elle Starkman/PPPL Office of Communications)

    “This was really a watershed moment,” said Mike Williams, the head of engineering and infrastructure at PPPL and associate director of the Laboratory. “The critical path [or key sequence of steps for the upgrade] was fabrication of the magnets, and that has now been done.”

    The lift team conducted the final steps largely in silence, attaching the bundled coils in their casing to an overhead crane and guiding the 21-foot-long center stack into place. The clearances were tiny: the bottom of the casing passed just inches over the shielding wall and the top of the vacuum vessel, and inserting the center stack into the vessel was like threading a needle, since the clearance at the opening was only about an inch. Guidance came chiefly from hand signals, with some radio communication at the end.


    Key features

    The installation merged three key features of the upgrade that had been developed separately. These included the casing, the bundled coils and the work to ready the vacuum vessel for the center stack. Slipping the casing over the bundle was a highly precise task, with the space between them less than an inch. “The key word is ‘fit-up,’” said Ron Strykowsky, who heads the upgrade project. “We had a robust-enough design to handle all the very fine tolerances.”

    Installation of the center stack completed a key portion of the upgrade and opened another chapter. “For me, the burden is off our shoulders,” said Jim Chrzanowski, who led the coil project and retired on Oct. 31 after 39 years at PPPL. “We’ve delivered the center stack and are happy,” added Steve Raftopoulos, who worked alongside Chrzanowski and succeeds him as head of coil building. “This is my baby now,” said Raftopoulos, noting that he will be called on to resolve any problems that occur once the center stack is in operation.

    Praise for technicians

    The two leaders praised the many technicians who made the center stack possible. They ranged from a core of roughly a dozen workers who had been with the project from the beginning to technicians throughout the Laboratory who were called on to pitch in. “We drafted everyone,” Chrzanowski said.

    Their tasks included sanding, welding and applying insulation tape to each of the 36 copper conductors that went into the center stack, and sealing them all together through multiple applications of vacuum pressure impregnation — a potentially volatile process. Next came fabrication and winding of the ohmic heating coil that wraps around the conductors to put current into the hot, charged plasma that fuels fusion reactions.

    “Everyone who worked on this feels a lot of pride and ownership,” Raftopoulos said. “Steve and I were the conductors, but the technicians were the orchestra,” Chrzanowski said. “We’ve got to give credit to the guys who actually build the machines. They take our problems and make them go away.”

    Completion of the upgrade now rests with technicians working under engineer Erik Perry. Their jobs include connecting the center stack to the facility’s outer coils to complete the doughnut-shaped magnetic field that will confine the plasma. The work calls for installing layers of custom-made electrical equipment, plus hoses carrying water to cool the coils, all of which must fit around diagnostic and other equipment. Also ahead lies the task of connecting to the vacuum vessel a second neutral beam injector for heating the plasma. “It’s like a big puzzle,” said Perry. “Everything must fit together, and that’s what we excel at.”


    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.


     
  • richardmitnick, 3:20 pm on October 29, 2014
    Tags: Clean Energy

    From LBL: “New Lab Startup Afingen Uses Precision Method to Enhance Plants” 


    October 29, 2014
    Julie Chao (510) 486-6491

    Imagine being able to precisely control specific tissues of a plant to enhance desired traits without affecting the plant’s overall function. Thus a rubber tree could be manipulated to produce more natural latex. Trees grown for wood could be made with higher lignin content, making for stronger yet lighter-weight lumber. Crops could be altered so that only the leaves and certain other tissues had more wax, thus enhancing the plant’s drought tolerance, while its roots and other functions were unaffected.

    By manipulating a plant’s metabolic pathways, two scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), Henrik Scheller and Dominique Loqué, have figured out a way to genetically rewire plants to allow for an exceptionally high level of control over the spatial pattern of gene expression, while at the same time boosting expression to very high levels. Now they have launched a startup company called Afingen to apply this technology for developing low-cost biofuels that could be cost-competitive with gasoline and corn ethanol.

    Henrik Scheller (left) and Dominique Loqué hold a tray of Arabidopsis thaliana plants, which they used in their research. (Berkeley Lab photo)

    “With this tool we seem to have found a way to control very specifically what tissue or cell type expresses whatever we want to express,” said Scheller. “It’s a new way that people haven’t thought about to increase metabolic pathways. It could be for making more cell wall, for increasing the stress tolerance response in a specific tissue. We think there are many different applications.”

    Cost-competitive biofuels

    Afingen was awarded a Small Business Innovation Research (SBIR) grant earlier this year for $1.72 million to engineer switchgrass plants that will contain 20 percent more fermentable sugar and 40 percent less lignin in selected structures. The grant was provided under a new SBIR program at DOE that combines an SBIR grant with an option to license a specific technology produced at a national laboratory or university through DOE-supported research.

    “Techno-economic modeling done at (the Joint BioEnergy Institute, or JBEI) has shown that you would get a 23 percent reduction in the price of the biofuel with just a 20 percent reduction in lignin,” said Loqué. “If we could also increase the sugar content and make it easier to extract, that would reduce the price even further. But of course it also depends on the downstream efficiency.”

    Scheller and Loqué are plant biologists with the Department of Energy’s Joint BioEnergy Institute (JBEI), a Berkeley Lab-led research center established in 2007 to pursue breakthroughs in the production of cellulosic biofuels. Scheller heads the Feedstocks Division and Loqué leads the cell wall engineering group.

    The problem with too much lignin in biofuel feedstocks is that it is difficult and expensive to break down; reducing lignin content would allow the carbohydrates to be released and converted into fuels much more cost-effectively. Although low-lignin plants have been engineered, they grow poorly because important tissues lack the strength and structural integrity provided by the lignin. With Afingen’s technique, the plant can be manipulated to retain high lignin levels only in its water-carrying vascular cells, where cell-wall strength is needed for survival, but low levels throughout the rest of the plant.

    The centerpiece of Afingen’s technology is an “artificial positive feedback loop,” or APFL. The approach targets master transcription factors, the molecules that regulate whether the genes involved in a given biosynthetic process are turned “on” or “off.” The APFL technology is a breakthrough in plant biotechnology, and Loqué and Scheller recently received an R&D 100 Award for the invention.

    An APFL is a segment of artificially produced DNA coded with instructions to make additional copies of a master transcription factor; when it is inserted at the start of a chosen biosynthetic pathway—such as the pathway that produces cellulose in fiber tissues—the plant cell will synthesize the cellulose and also make a copy of the master transcription factor that launched the cycle in the first place. Thus the cycle starts all over again, boosting cellulose production.
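    The feedback dynamic described above can be caricatured with a toy model (entirely illustrative: the gain, the carrying capacity, and the cycle count below are invented for the sketch, not measured values). Each cycle, the transcription factor drives both pathway output and production of more copies of itself, so expression grows rapidly until some resource limit saturates it:

```python
# Toy discrete-time model of an artificial positive feedback loop (APFL).
# A master transcription factor (tf) drives a biosynthetic pathway and
# also its own production; growth saturates at a carrying capacity.
# All numeric values are invented for illustration.

def simulate_apfl(cycles, gain=0.5, cap=100.0):
    tf = 1.0        # initial transcription-factor level (arbitrary units)
    output = 0.0    # cumulative pathway product (e.g., cellulose)
    history = []
    for _ in range(cycles):
        output += tf                      # pathway output this cycle
        tf += gain * tf * (1 - tf / cap)  # logistic self-amplification
        history.append(tf)
    return tf, output, history

tf_final, total_output, history = simulate_apfl(30)
print(f"final TF level: {tf_final:.1f}, cumulative output: {total_output:.1f}")
```

    The transcription-factor level climbs steadily toward the cap rather than growing without bound, which is the qualitative behavior a self-reinforcing but resource-limited loop would show.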

    The process differs from classical genetic engineering. “Some people distinguish between ‘transgenic’ and ‘cisgenic.’ We’re using only pieces of DNA that are already in that plant and just rearranging them in a new way,” said Scheller. “We’re not bringing in foreign DNA.”

    Other licensees and applications

    This breakthrough technique can also be used in fungi and for a wide variety of uses in plants, for example, to increase food crop yields or to boost production of highly specialized molecules used by the pharmaceutical and chemical industries. “It could also increase the quality of forage crops, such as hay fed to cows, by increasing the sugar content or improving the digestibility,” Loqué said.

    Another intriguing application is for biomanufacturing. By engineering plants to grow entirely new pharmaceuticals, specialty chemicals, or polymer materials, the plant essentially becomes a “factory.” “We’re interested in using the plant itself as a host for production,” Scheller said. “Just like you can upregulate pathways in plants that make cell walls or oil, you can also upregulate pathways that make other compounds or properties of interest.”

    Separately, two other companies are using the APFL technology. Tire manufacturer Bridgestone has a cooperative research and development agreement (CRADA) with JBEI to develop more productive rubber-producing plants. FuturaGene, a Brazilian paper and biomass company, has licensed the technology for exclusive use with eucalyptus trees and several other crops; APFL can enhance or develop traits to optimize wood quality for pulping and bioenergy applications.

    “The inventors/founders of Afingen made the decision to not compete for a license in fields of use that were of interest to other companies that had approached JBEI. This allowed JBEI to move the technology forward more quickly on several fronts,” said Robin Johnston, Berkeley Lab’s Acting Deputy Chief Technology Transfer Officer. “APFL is a very insightful platform technology, and I think only a fraction of the applications have even been considered yet.”

    Afingen currently has one employee—Ai Oikawa, a former postdoctoral researcher and now the director of plant engineering—and will be hiring three more in November. It is the third startup company to spin out of JBEI. The first two were Lygos, which uses synthetic biology tools to produce chemical compounds, and TeselaGen, which makes tools for DNA synthesis and cloning.


    A U.S. Department of Energy National Laboratory Operated by the University of California



     
  • richardmitnick, 3:35 pm on October 27, 2014
    Tags: Clean Energy

    From AAAS: “After Election 2014: FUSION RESEARCH” 


    ScienceInsider

    24 October 2014
    Adrian Cho

    Should we stay or should we go? Once the voters have spoken, that’s the question Congress will have to answer regarding the United States’ participation in ITER, the hugely overbudget fusion experiment under construction in Cadarache, France. Some lawmakers say it may be time for the United States to bow out, especially as the growing ITER commitment threatens to starve U.S.-based fusion research programs. The next Congress may have to decide the issue—if the current one doesn’t pull the plug first when it returns to Washington, D.C., for a 6-week lame-duck session.

    ITER Tokamak

    For those tired of the partisan squabbling on Capitol Hill, the ITER debate may provide curious relief. ITER appears to enjoy bipartisan support in the House of Representatives—and bipartisan opposition among key senators.

    ITER aims to prove that nuclear fusion is a viable source of energy, and the United States has agreed to build 9% of the reactor’s hardware, regardless of the cost. Recent estimates suggest the U.S. price tag could be $3.9 billion or more—nearly quadrupling original estimates and raising alarm among some lawmakers. In response, this past June a Senate appropriations subcommittee proposed a budget bill that would end U.S. participation in the project next year. In contrast, the following month the House passed a bill that would increase U.S. spending on ITER.

    Some observers think the current Congress will kick the issue to the next one by passing a stop-gap budget for fiscal year 2015, which began 1 October, that will keep U.S. ITER going. “I don’t think in the end they can come out and kill ITER based on what the Senate subcommittee did,” says Stephen Dean, president of Fusion Power Associates, a research and educational foundation in Gaithersburg, Maryland. Others say a showdown could come by year’s end.

    Trouble over ITER has been brewing for years. ITER was originally proposed in 1985 as a joint U.S.-Soviet Union venture. The United States backed out of the project in 1998 because of cost and schedule concerns—only to rejoin in 2003. At the time, ITER construction costs were estimated at $5 billion. That number had jumped to $12 billion by 2006, when the European Union, China, India, Japan, Russia, South Korea, and the United States signed a formal agreement to build the device. At the time, ITER was supposed to start running in 2016. By 2011, U.S. costs for ITER had risen to more than $2 billion, and the date for first runs had slid to 2020. But even that date was uncertain; U.S. ITER researchers did not have a detailed cost projection and schedule—or performance baseline—to go by.

    Then in 2013, the Department of Energy (DOE) argued in its budget request for the following year that U.S. ITER was not a “capital asset” and therefore did not have to go through the usual DOE review process for large construction projects—which requires a performance baseline. Even though DOE promised to limit spending on ITER to $225 million a year so as not to starve domestic fusion research efforts, the maneuver irked Senators Dianne Feinstein (D–CA) and Lamar Alexander (R–TN), the chair and ranking member of the Senate Appropriations Subcommittee on Energy and Water Development, respectively. They and other senators asked the Government Accountability Office (GAO) to investigate the U.S. ITER project.

    Things appeared to come to a head this year. In April, researchers working on U.S. ITER released their new $3.9 billion cost estimate and moved back the date for first runs to 2024 or later. Two months later, GAO reported that even that new estimate was not reliable and that the cost to the United States could reach $6.5 billion. Based on that report, the Senate energy and water subcommittee moved to kill U.S. ITER in its markup of the proposed 2015 budget, giving it only $75 million for the year, half of what the White House had requested and just enough to wind things down. Alexander supported the move, even though the U.S. ITER office is based in his home state of Tennessee, at Oak Ridge National Laboratory.

    ITER still has friends in the House, however. In their version of the DOE budget for 2015, House appropriators gave ITER $225 million, $75 million more than the White House request. Moreover, the project seems to have bipartisan support in the House, as shown by a hearing of the energy subcommittee of the House Committee on Science, Space, and Technology. Usually deeply divided along party lines, the subcommittee came together to lavish praise on ITER, with Representative Lamar Smith (R–TX), chair of the full committee, and Representative Eric Swalwell (D–CA), the ranking member on the subcommittee, agreeing that ITER was, in Swalwell’s words, “absolutely essential to proving that magnetically confined fusion can be a viable clean energy source.” Swalwell called for spending more than $225 million per year on ITER.

    When and how this struggle over ITER plays out depends on the answers to several questions. First, how will Congress deal with the already late budget for next year? The Senate, controlled by the Democrats, has yet to pass any of its 13 budget bills, including the one that would fund energy research. And if the House and Senate decide to simply continue the 2014 budget past the end of the year, then the decision on ITER will pass to the next Congress. If, on the other hand, Congress passes a last-minute omnibus budget for fiscal year 2015, then the fight over ITER could play out by year’s end.

    Second, how sincere is the Senate move to kill ITER? The Senate subcommittee’s move may have been meant mainly to send a signal to the international ITER organization that it needs to shape up, says one Democratic staffer in the House. The international ITER organization received scathing criticism in an independent review in October 2013. That review called for 11 different measures to overhaul the project’s management, and the Senate’s markup may have been meant primarily to drive home the message that those measures had to be taken to ensure continued U.S. involvement, the staffer says.

    Third, how broad is the House’s support for ITER? Over the past decade or so, the House has been more supportive of fusion in general, the Democratic staffer says. But some observers credit that support mainly to one person, Representative Rodney Frelinghuysen (R–NJ), a longtime member of the House Appropriations Committee. “Over the years he’s become a champion of fusion,” Dean says. “He protects it in the House.” Dean and others say that’s likely because the DOE’s sole dedicated fusion laboratory, the Princeton Plasma Physics Laboratory (PPPL), is in his home state of New Jersey, though not in Frelinghuysen’s district.

    Indeed, observers say that Frelinghuysen has been instrumental in preventing cuts to the domestic fusion program proposed by DOE itself. For example, for fiscal 2014, DOE requested $458 million for its fusion energy sciences program, including $225 million for ITER. That meant cutting the domestic fusion program by about 20% to $233 million and closing one of three tokamak reactors in the United States. The Senate went along with those numbers, but House appropriators bumped the budget up to $506 million, the number that held sway in the final 2014 spending plan. But some observers speculate that Frelinghuysen might be willing to let ITER go if he could secure a brighter future for PPPL.

    PPPL Tokamak

    PPPL National Spherical Torus Experiment (NSTX)

    Finally, the biggest question surrounding U.S. participation in ITER is: How will the international ITER organization respond to the calls for changes in its management structure? That should become clear within months. So far, officials with U.S. ITER have not been able to produce a baseline cost estimate and schedule, in large measure because the ITER project as a whole does not have a reliable schedule. The international ITER organization has said it will produce one by next July, the House staffer says. And if the international organization doesn’t produce a credible schedule, the staffer says, “the project will be very difficult to defend, even by its most ardent supporters.”

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.
