Tagged: ITER

  • richardmitnick 10:09 am on December 12, 2016
    Tags: ITER

    From PPPL: “PPPL physicists win funding to lead a DOE exascale computing project” 


    October 27, 2016 [Just now out on social media.]
    Raphael Rosen

    PPPL physicist Amitava Bhattacharjee. (Photo by Elle Starkman/PPPL Office of Communications)

    A proposal from scientists at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) has been chosen as part of a national initiative to develop the next generation of supercomputers. Known as the Exascale Computing Project (ECP), the initiative will include a focus on exascale-related software, applications, and workforce training.

    Once developed, exascale computers will perform a billion billion operations per second, a rate 50 to 100 times faster than the most powerful U.S. computers now in use. The fastest computers today operate at the petascale and can perform a million billion operations per second. Exascale machines in the United States are expected to be ready in 2023.
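    The scales quoted above can be sanity-checked with a line of arithmetic (a sketch using only the round numbers in the text, not benchmarks of any real machine):

```python
# Round-number rates quoted in the article (operations per second).
PETASCALE = 1e15   # "a million billion operations per second"
EXASCALE = 1e18    # "a billion billion operations per second"

# An exascale machine is 1000x a petascale one.
print(EXASCALE / PETASCALE)   # 1000.0

# "50 to 100 times faster than the most powerful U.S. computers now in use"
# implies those machines sustain roughly 10-20 petaflops.
for factor in (50, 100):
    print(f"{EXASCALE / factor / PETASCALE:.0f} petaflops")
```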

    The PPPL-led multi-institutional project, titled High-Fidelity Whole Device Modeling of Magnetically Confined Fusion Plasmas, was selected during the ECP’s first round of application development funding, which distributed $39.8 million. The overall project will receive $2.5 million a year for four years to be distributed among all the partner institutions, including Argonne, Lawrence Livermore, and Oak Ridge national laboratories, together with Rutgers University, the University of California, Los Angeles, and the University of Colorado, Boulder. PPPL itself will receive $800,000 per year; the project it leads was one of 15 selected for full funding, and the only one dedicated to fusion energy. Seven additional projects received seed funding.

    The application efforts will help guide DOE’s development of a U.S. exascale ecosystem as part of President Obama’s National Strategic Computing Initiative (NSCI). DOE, the Department of Defense and the National Science Foundation have been designated as NSCI lead agencies, and ECP is the primary DOE contribution to the initiative.

    The ECP’s multi-year mission is to maximize the benefits of high performance computing (HPC) for U.S. economic competitiveness, national security and scientific discovery. In addition to applications, the DOE project addresses hardware, software, platforms and workforce development needs critical to the effective development and deployment of future exascale systems. The ECP is supported jointly by DOE’s Office of Science and the National Nuclear Security Administration within DOE.

    PPPL has been involved with high-performance computing for years. PPPL scientists created the XGC code, which models the behavior of plasma in the boundary region where the plasma’s ions and electrons interact with each other and with neutral particles produced by the tokamak’s inner wall. The high-performance code is maintained and updated by PPPL scientist C.S. Chang and his team.

    PPPL scientist C.S. Chang

    XGC runs on Titan, the fastest computer in the United States, at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility at Oak Ridge National Laboratory.

    ORNL Cray Titan Supercomputer

    The calculations needed to model the behavior of the plasma edge are so complex that the code uses 90 percent of the computer’s processing capabilities. Titan performs at the petascale, completing a million billion calculations each second, and the DOE was primarily interested in proposals by institutions that possess petascale-ready codes that can be upgraded for exascale computers.

    The PPPL proposal lays out a four-year plan to combine XGC with GENE, a computer code that simulates the behavior of the plasma core. GENE is maintained by Frank Jenko, a professor at the University of California, Los Angeles. Combining the codes would give physicists a far better sense of how the core plasma interacts with the edge plasma at a fundamental kinetic level, giving a comprehensive view of the entire plasma volume.

    Leading the overall PPPL proposal is Amitava Bhattacharjee, head of the Theory Department at PPPL. Co-principal investigators are PPPL’s Chang and Andrew Siegel, a computational scientist at the University of Chicago.

    The multi-institutional effort will develop a full-scale computer simulation of fusion plasma. Unlike current simulations, which model only part of the hot, charged gas, the proposed simulations will display the physics of an entire plasma all at once. The completed model will integrate the XGC and GENE codes and will be designed to run on exascale computers.
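    The coupling strategy can be pictured with a toy sketch. This is hypothetical illustration code, NOT the actual XGC/GENE framework: it shows only the general pattern of two solvers owning different plasma regions and exchanging boundary values each step.

```python
# Toy sketch of core-edge code coupling (hypothetical; not XGC or GENE).
# Two stand-in solvers each own a region and see the other's boundary value.

def core_step(core_temp: float, edge_temp: float) -> float:
    """Stand-in core solver: relax toward the edge boundary value."""
    return core_temp + 0.1 * (edge_temp - core_temp)

def edge_step(edge_temp: float, core_temp: float) -> float:
    """Stand-in edge solver: relax toward the core boundary value."""
    return edge_temp + 0.1 * (core_temp - edge_temp)

core, edge = 10.0, 1.0   # arbitrary starting "temperatures"
for _ in range(100):
    # Each solver advances using the other's previous boundary value.
    core, edge = core_step(core, edge), edge_step(edge, core)

print(round(core, 3), round(edge, 3))   # the two regions equilibrate
```

    The real framework must couple full kinetic solvers consistently at the interface, which is far harder than this relaxation toy suggests; the sketch only conveys the data-exchange structure.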

    The modeling will enable physicists to understand plasmas more fully, allowing them to predict their behavior within doughnut-shaped fusion facilities known as tokamaks. The exascale computing fusion proposal focuses primarily on ITER, the international tokamak being built in France to demonstrate the feasibility of fusion power.

    ITER, the experimental tokamak fusion reactor being built next to the Cadarache facility in Saint-Paul-lès-Durance, southern France

    But the proposal will be developed with other applications in mind, including stellarators, another variety of fusion facility.

    Wendelstein 7-X stellarator, built in Greifswald, Germany

    Better predictions can lead to better-engineered facilities and more efficient fusion reactors. Currently, support for this work comes from DOE’s Advanced Scientific Computing Research program.

    “This will be a team effort involving multiple institutions,” said Bhattacharjee. He noted that PPPL will be involved in every aspect of the project, including working with applied mathematicians and computer scientists on the team to develop the simulation framework that will couple GENE with XGC on exascale computers.

    “You need a very-large-scale computer to calculate the multiscale interactions in fusion plasmas,” said Chang. “Whole-device modeling is about simulating the whole thing: all the systems together.”

    Because plasma behavior is immensely complicated, developing an exascale computer is crucial for future research. “Taking into account all the physics in a fusion plasma requires enormous computational resources,” said Bhattacharjee. “With the computer codes we have now, we are already pushing on the edge of the petascale. The exascale is very much needed in order for us to have greater realism and truly predictive capability.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

  • richardmitnick 11:20 am on May 2, 2016
    Tags: ITER

    From COSMOS: “Getting primed for fusion power” 

    Cosmos Magazine bloc


    26 Apr 2016
    Cathal O’Connell

    Cracking fusion power would be one of the great technological achievements of the 21st century, providing almost limitless power with few drawbacks. With global efforts getting bigger and badder every year, Cathal O’Connell provides a primer to the basic technology.

    ITER Tokamak

    Fusion power is such a huge, potentially game-changing technology that it’s easy to get swept up in its utopian promise. Equally, it’s easy to dismiss the whole shebang as a wild fantasy that will never come to pass.

    Here’s what you need to know to help keep pace with developments in this global quest.

    What is nuclear fusion?

    Atoms are the really small bits from which we are all made. Inside each atom, when you strip away the shells of electrons, is an even smaller bit at the core – the nucleus.

    It turns out that when you join two small nuclei to make a bigger one, an enormous amount of energy is released – about 10 million times more energy than the puny chemical reactions that power most of our technology, such as burning oil, coal or the gasoline in your car.
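    The “about 10 million times” factor can be checked with one line of arithmetic. The per-reaction energies below are textbook values, not figures from the article:

```python
# Energy per reaction: D-T fusion releases ~17.6 MeV, while a typical
# chemical reaction (breaking/forming bonds) releases on the order of 1 eV.
# Both are textbook figures, assumed here for an order-of-magnitude check.
E_FUSION_EV = 17.6e6   # one D-T fusion reaction, in electron-volts
E_CHEMICAL_EV = 1.5    # one chemical reaction, order-of-magnitude

ratio = E_FUSION_EV / E_CHEMICAL_EV
print(f"fusion / chemical ~ {ratio:.1e}")   # ~1e7: "10 million times"
```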

    We know fusion works because it goes on in the core of the Sun. All the Sun’s heat and light are powered by fusion. The most important reaction is one in which two nuclei of hydrogen, the lightest atom, combine to form helium, the second lightest.

    How does it work?

    Igniting nuclear fusion is not as simple as starting a fire. It takes a lot of energy to get it going and typically, that means a temperature of millions of degrees Celsius.

    This is because atomic nuclei have a love-hate relationship. Each nucleus has a strong positive charge so they repel one another. To kickstart fusion, you have to overcome this repulsive barrier by ramming two nuclei together incredibly hard. That’s what happens in the core of the Sun, where the temperature is about 15 million °C and pressures are similarly insane.

    When the nuclei get close enough to touch, the nuclear strong force takes over – the strongest force in nature – and it’s the source of fusion energy.
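    A quick conversion shows what the temperatures above mean per particle (standard constants; the temperatures are the round numbers from the text):

```python
# Average thermal kinetic energy per particle: (3/2) * k_B * T.
K_BOLTZMANN_J = 1.381e-23   # Boltzmann constant, J/K
KEV_IN_J = 1.602e-16        # one kiloelectron-volt, in joules

def mean_kinetic_energy_kev(temp_kelvin: float) -> float:
    """Mean kinetic energy of a particle at temperature T, in keV."""
    return 1.5 * K_BOLTZMANN_J * temp_kelvin / KEV_IN_J

print(f"Sun's core (~15 million K): {mean_kinetic_energy_kev(15e6):.1f} keV")
print(f"Fusion plasma (~100 million K): {mean_kinetic_energy_kev(1e8):.1f} keV")
```

    Roughly 2 keV in the Sun’s core and a bit over 10 keV in a reactor plasma: even then, only the fastest nuclei in the thermal distribution (helped by quantum tunnelling) actually fuse.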

    Fusion energy?

    The idea of fusion energy is to build power plants that generate energy by recreating the core of the Sun. Hundreds of research-scale reactors have been built around the world.

    They are usually engineering marvels designed to contain hydrogen nuclei at 100 million °C, or to implode a nuclear fuel pellet using massive lasers (see below).

    But I thought we already had nuclear power

    The nuclear power plants we have so far are based on a different process – nuclear fission, where you derive energy by splitting one big atom into two smaller atoms. That’s a much easier process and so fission reactors have been pumping power into the grid since the 1950s.

    Fusion reactors are much safer than traditional fission reactors because there is no chance of a runaway explosion and when the reaction is done, there’s no long-lived radioactive waste.

    Per kilogram of fuel, fusion releases four times more energy than fission and 10 million times more than coal.
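    Both per-kilogram ratios can be reproduced from textbook energy densities (the per-reaction energies and coal figure below are standard values assumed for this estimate, not numbers from the article):

```python
# Back-of-envelope energy densities per kilogram of fuel.
MEV_TO_J = 1.602e-13    # one megaelectron-volt, in joules
AMU_TO_KG = 1.661e-27   # one atomic mass unit, in kilograms

def j_per_kg(energy_mev: float, mass_amu: float) -> float:
    """Energy released per kilogram of reacting fuel."""
    return energy_mev * MEV_TO_J / (mass_amu * AMU_TO_KG)

fusion = j_per_kg(17.6, 5)      # D + T: ~17.6 MeV per 5 amu of fuel
fission = j_per_kg(200.0, 235)  # U-235: ~200 MeV per fission
coal = 3.0e7                    # ~30 MJ/kg, typical for coal combustion

print(f"fusion / fission ~ {fusion / fission:.1f}")   # ~4x
print(f"fusion / coal    ~ {fusion / coal:.1e}")      # ~1e7, "10 million times"
```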

    What’s the fuel?

    The first generations of fusion reactors will likely use two forms of hydrogen for the fuel – deuterium and tritium – because the fusion of these two nuclei is the easiest to achieve.

    Regular hydrogen is the smallest atom – just one electron orbiting a proton nucleus. Deuterium is a fatter version of hydrogen, where the nucleus contains a neutron as well as a proton. And tritium is the fattest hydrogen of all – its nucleus contains a proton and two neutrons.

    Deuterium is easily found in seawater, while tritium can be generated from lithium.

    The long-term goal is to switch to a deuterium-deuterium reaction, meaning all the world’s energy supply could one day be found in seawater.

    Has fusion ever been achieved?

    Humans first managed nuclear fusion on 1 November 1952 when the US exploded the first fusion bomb. Fusion bombs (also known as hydrogen bombs) are the most destructive weapons ever made. They typically use a fission-based atomic bomb to trigger a fusion reaction in the second stage.

    The challenge now is achieving fusion in a controlled manner.

    For more than 70 years researchers have been trialling different designs for containing the fusion reaction. Some of these designs (see below) have achieved fusion. The problem is releasing more energy than is put in, and doing it long enough to be useful. Nobody’s been able to do that yet.

    What’s been holding us back?

    Ah. The problem is the temperature. You have to heat the fuel to such a high temperature (100 million °C or so) that no material vessel could possibly contain it.

    The basic physics behind fusion has been known for decades. It’s the engineering that still needs to be worked out.

    What do fusion reactors look like?

    Fusion reactors come in all sorts of shapes and sizes.

    Most research has looked at containing the reaction within a sort of magnetic bottle. At the extreme temperatures of fusion, all of the electrons are stripped off the deuterium and tritium atoms, and what’s left is called a plasma.

    How a tokamak, or toroidal magnetic confinement system, works. Credit: Encyclopaedia Britannica/UIG Via Getty Images

    The most common design is the tokamak (from a Russian acronym for “toroidal chamber with magnetic coils”), which looks a bit like the inside of the Death Star. The plasma forms a twisting doughnut shape called a torus, running in rings and never touching the walls because it’s contained by the magnetic fields.

    Other variations confine the plasma in different geometries, such as the stellarator design, which adds a twist via a different configuration of magnets. Other tokamak designs are spherical.

    But if fuel is squeezed in a magnetic field, how do you get the energy out?

    When deuterium fuses with tritium, a neutron is kicked out, carrying most of the released energy. The neutron is a neutral particle, and so is not affected by the magnetic field – it can fly through the magnetic bottle and smash into a lithium blanket just inside the doughnut.

    The neutron collisions heat up the lithium. This heat converts water into steam, which drives turbines to generate electricity, just like in any other electric power plant.
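    Why does the neutron get most of the energy? Momentum conservation fixes the split: the two products recoil with equal and opposite momentum, so the lighter particle carries the larger share. A back-of-envelope check (the 17.6 MeV figure is a textbook value, not from the article):

```python
# D + T -> He-4 + n releases ~17.6 MeV (textbook value, assumed here).
# Equal and opposite momenta mean E_n / E_He = m_He / m_n, so the light
# neutron takes roughly 4/5 of the energy.
E_TOTAL_MEV = 17.6
M_NEUTRON, M_HELIUM = 1.0, 4.0   # mass numbers are close enough here

e_neutron = E_TOTAL_MEV * M_HELIUM / (M_NEUTRON + M_HELIUM)   # ~14.1 MeV
e_helium = E_TOTAL_MEV * M_NEUTRON / (M_NEUTRON + M_HELIUM)   # ~3.5 MeV
print(e_neutron, e_helium)
```

    The charged helium nucleus stays trapped by the field and helps keep the plasma hot, while the ~14 MeV neutron escapes to the blanket.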

    What’s this about using lasers?

    Instead of using a magnetic field to contain a plasma, another idea is to ignite small fusion explosions by firing a powerful laser at a pellet.


    At the National Ignition Facility at Lawrence Livermore National Laboratory in California, the world’s biggest laser system (made of 192 beams) is fired simultaneously at a pellet of deuterium-tritium fuel about the size of a pea.

    National Ignition Facility researchers have achieved fusion using this design, but the challenge is extracting more energy than is used to power the lasers. Their biggest problem is shaping the pellet and its plastic capsule so that they absorb all the laser energy.

    And cold fusion?

    This is the idea of making a fusion reactor that works at close to room temperature. In 1989, British and American scientists seemed to achieve this by running a strong current through a palladium electrode in a thermos of heavy water (water in which the hydrogen atoms are partially or completely replaced by deuterium) – but the experiment turned out to be flawed.

    Nowadays research into cold fusion is seen as an example of “pathological science”, like trying to build a perpetual motion machine.

    Best forget about this altogether. It’s not going to happen.

    What’s next for fusion?

    Despite the difficulties, progress in fusion power has actually been very rapid. Power output has increased by a factor of more than a million in 30 years.

    Much of the hope is centred on the ITER (Latin for “the way”) tokamak to be constructed in southern France by 2019.

    ITER, the International Thermonuclear Experimental Reactor, is being designed to test the principles surrounding the generation of power from nuclear fusion, the energy source of stars. It comprises a toroidal chamber in which a plasma (pink) is contained by strong magnetic fields. Credit: MIKKEL JUUL JENSEN / SCIENCE PHOTO LIBRARY/Getty Images

    This design has been in the pipeline for two decades and is designed to be the first fusion reactor to produce more energy (about 10 times more) than it puts in. However, this 500-megawatt reactor is still only a proof-of-concept design, and no electricity will be generated.
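    The “about 10 times more” figure is the fusion gain factor, conventionally written Q: the ratio of fusion power produced to external heating power injected. A minimal check, assuming ITER’s commonly cited 50 MW heating input alongside the 500 MW output quoted above:

```python
# Fusion gain Q = fusion power out / external heating power in.
power_out_mw = 500   # ITER design fusion power, quoted in the text
power_in_mw = 50     # heating power (commonly cited design figure, assumed)

q = power_out_mw / power_in_mw
print(f"Q = {q:.0f}")   # Q = 10; Q = 1 is scientific breakeven
```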

    If ITER is successful, the next step is DEMO – which is designed to be the world’s first nuclear power plant to generate electricity, to be constructed by 2033.

    See the full article here.


  • richardmitnick 4:29 pm on April 21, 2016
    Tags: ITER

    From AAAS: “ITER leader faces tough questions, even from relatively supportive U.S. House panel” 



    Apr. 21, 2016
    Adrian Cho

    ITER construction is ramping up, even as the United States mulls its commitment to the project. ITER Organization

    For Bernard Bigot, director-general of ITER, the gargantuan fusion experiment under construction in Cadarache, France, a hearing in the U.S. House of Representatives was likely to be a relatively amicable event.

    ITER icon
    ITER Tokamak

    After all, even as budgetmakers in the Senate have tried repeatedly to pull the United States out of the troubled project, House appropriators have supported it and have prevailed in budget negotiations. Nevertheless, yesterday, in a hearing held by the energy subcommittee of the House Committee on Science, Space, and Technology, Bigot faced pointed questions from both Republican and Democratic representatives, suggesting some of them may be losing patience with ITER.

    ITER aims to prove that a plasma of deuterium and tritium nuclei trapped in a magnetic field can produce more energy than it consumes as the nuclei fuse in a “burning plasma,” a process that mimics the inner workings of the sun. But ITER is running far over budget and at least 10 years behind its original schedule.

    In a tense exchange with Representative Dana Rohrabacher (R–CA), Bigot acknowledged that by the time ITER starts running late next decade its total cost will likely exceed $20 billion. That’s a big jump over the roughly $12 billion ITER was estimated to cost in 2006, when China, the European Union, India, Japan, Russia, South Korea, and the United States agreed to build the machine. The original plan also called for ITER to start running this year. More might have been done with the money slated for ITER if it had been spent instead on research involving more conventional nuclear energy, Rohrabacher said. “I still think that if we had put $20 billion into fission we would have done a lot more for humanity,” he said.

    The hearing opened an interesting fortnight for ITER and U.S. participation in it. Following a scathing external review in 2013, the international ITER organization revamped its management structure, including bringing in Bigot as director-general in November 2014. In November 2015, ITER officials presented a new baseline cost and schedule for the project. An independent committee will report on the reliability of that baseline on 27–28 April to ITER’s governing council. On 2 May, officials at the U.S. Department of Energy (DOE) are supposed to report to Congress whether they think the United States should stay in ITER or leave. When pressed, Bigot agreed with Rohrabacher’s estimate that instead of the $1.1 billion originally envisioned, the U.S. contribution to ITER would likely total between $4 billion and $6 billion.

    Generally, subcommittee members seemed supportive of fusion research and ITER in particular. “I really appreciate the work that you’ve been doing, and from all that I’ve been hearing, ITER is in a much better place for your efforts,” Representative Randy Hultgren (R–IL) said to Bigot. “I do see how important this partnership is and I hope [the United States] can remain a reliable partner.”

    In fact, recently released budget numbers demonstrate exactly how much more supportive of fusion research and ITER the House is than the Senate. Earlier this week, the House appropriations committee passed its version of the bill that would fund DOE for fiscal year 2017, which starts 1 October. It includes $450 million for DOE’s fusion energy science (FES) program, a 2.7% bump up from this year’s budget. That sum includes $125 million for parts for ITER. In contrast, the Senate appropriations committee version of the bill would cut the FES budget by 36% to $280 million and would zero out ITER funding.
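    The percentages in the paragraph above can be cross-checked against each other (a sketch using only the figures quoted in the text):

```python
# Figures from the article: House mark $450M (a 2.7% bump),
# Senate mark $280M (described as a 36% cut).
house_fy17 = 450.0    # millions of dollars
senate_fy17 = 280.0

# Back out the current-year FES budget implied by the 2.7% increase...
current = house_fy17 / 1.027
print(f"implied current budget: ${current:.0f} million")

# ...and confirm the Senate mark is the stated ~36% cut from it.
cut = 1 - senate_fy17 / current
print(f"Senate cut: {cut:.0%}")
```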

    Representative Bill Foster (D–IL) asked whether a U.S. withdrawal would be fatal to the ITER project. Bigot declined to answer the question directly, but said, “if the U.S. were to withdraw it would be a real drawback because it would be difficult to replace the expertise.”

    Another senior Democrat, Representative Alan Grayson (FL), expressed frustration with ITER. Fusion energy “is going to happen,” said Grayson, who is the ranking member of the subcommittee. However, he said, “it’s been 10 years already since the major governments signed off on the ITER project. We now have 11 years to go before we start the major experiments and there isn’t even a plan to generate net electricity from ITER, that’s not its design or its purpose.” He asked whether there was a way to achieve fusion on a 10-year timescale, but the witnesses—Bigot; Stewart Prager, the director of DOE’s Princeton Plasma Physics Laboratory (PPPL) in New Jersey; and Scott Hsu, a fusion physicist at Los Alamos National Laboratory in New Mexico—cautioned that there was not.

    U.S. competitiveness

    The hearing aimed to assess more generally the United States’ fusion program. “Fusion energy research in Asia and Europe is escalating, and for the U.S. to contribute competitively in the face of larger investments elsewhere, we must focus on activity with breakthrough potential,” Prager said. PPPL researchers focus on four areas, Prager said: developing a design for a “pilot plant” that might come after ITER and not only sustain a burning plasma, but generate a net gain in electricity; developing materials that can withstand the intense radiation in a fusion reactor; large-scale computer simulations; and physics related to ITER. Still, he said, U.S. fusion research is “resource limited.”

    NSTX-U tokamak at PPPL, Princeton, NJ, USA

    Hsu testified that DOE currently supports only two types of fusion research: magnetic confinement fusion that uses devices called tokamaks such as ITER and PPPL’s National Spherical Torus Experiment; and inertial confinement fusion, which uses the powerful lasers at the National Ignition Facility at Lawrence Livermore National Laboratory in California (which is supported by DOE’s weapons program) to implode a fuel pellet.

    NIF Bloc
    National Ignition Facility laser program at LLNL, Livermore, CA, USA

    DOE’s fusion energy sciences program used to spend $40 million a year on alternative fusion technologies, Hsu testified, but that money has dried up in recent years, even as nations such as China have pursued them.

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.


  • richardmitnick 10:40 am on November 20, 2015
    Tags: ITER

    From AAAS: “ITER fusion project to take at least 6 years longer than planned” 



    19 November 2015
    Daniel Clery

    ITER construction earlier this year. ITER Collaborative

    The multibillion-dollar ITER fusion project will take another 6 years to build beyond the—now widely discredited—official schedule, a meeting of the governing council was told this week. ITER management has also asked the seven international partners backing the project for additional funding to finish the job.

    It remains unclear whether the project will get what it wants: Delegations from the partners—China, the European Union, India, Japan, Russia, South Korea, and the United States—concluded the council meeting today by announcing the council would conduct its own review of the schedule and funding to look for ways to tighten them up. In the meantime, the council approved the proposed schedule for 2016 and 2017, set out milestones for the project to reach in that time, and agreed to make available extra resources to help achieve them. After consulting their governments, the delegations committed themselves to agreeing on a final schedule at the next council meeting, in June 2016.

    “It was a very important meeting for us and it went well,” says ITER Director-General Bernard Bigot. “Every member expressed their concerns and in the end they reached an agreement.” Jianlin Cao, vice minister at the Chinese Ministry of Science and Technology, stressed the challenges the meeting faced. The council delegates “have been so careful about this work. But ITER is a new thing, and success does not come easily,” Cao told Science.

    The ITER project aims to show that nuclear fusion—the power source of the sun and stars—is technically feasible as a source of energy. Despite more than 60 years of work, researchers have failed to achieve a fusion reaction that produces more energy than it consumes. ITER, with a doughnut-shaped tokamak reaction chamber able to contain 840 cubic meters of superheated hydrogen gas, or plasma, is the biggest attempt so far and is predicted to produce at least 500 megawatts of power from a 50 megawatt input.

    ITER Tokamak

    The project was officially begun in 2006 with an estimated cost of €5 billion and date for the beginning of operations—or first plasma—in 2016. Those figures quickly changed to €15 billion and 2019, but confidence in those numbers has eroded over the years.

    When Bigot took over as Director-General earlier this year, he ordered a bottom-up review of the whole project, which currently has numerous buildings springing up at the Cadarache site in southern France and components arriving from contractors in the partner states around the globe. That review produced a new description of the entire project, known as the “baseline,” including a revamped schedule and cost estimate. The baseline was presented to the council for approval this week. Although the official communique does not mention the proposed date for first plasma, it is widely acknowledged to be 2025.

    “The council acknowledged this resource-loaded schedule but they need more time to fully endorse this or another schedule and to reconcile it with the resources they have,” Bigot says. Delegates confirmed such plans. “We must take the schedule home and discuss it with the finance ministry,” says Anatoly Krasilnikov, head of Russia’s ITER domestic agency, the body responsible for awarding industrial contracts.

    “In the meantime, they have agreed to give us extra resources to meet the milestones in 2016–17. It keeps the momentum,” Bigot says. To make that possible, the council will move around some money already allocated for 2016 and possibly provide new money for 2017. The project will hire 150 new staff to top up the 640 currently employed by the ITER organization. In return, the council wants ITER to meet 17 major milestones from the new schedule in 2016 and another eight in 2017. “If we meet the milestones, it will consolidate the trust,” Bigot says.

    The true cost of ITER is almost impossible to define. When the project agreement was drawn up in 2006, all the necessary components were divided up among the partners according to their contributions: 45% for the European Union (as host), and 9% for each of the others. How much each partner pays to have those components manufactured is the partner’s individual concern and is not revealed. In addition to the components, which are shipped to Cadarache as in-kind contributions, each partner must make a cash contribution to the central ITER organization to cover its costs.

    The ITER organization’s role is to draw up the design, ensure everyone sticks to it, and then to supervise assembly of the reactor while also satisfying the local French regulators, especially the nuclear safety authority ASN. That has not been an easy job, as the organization does not deal directly with the industrial companies doing the manufacturing; that is handled by each partner’s domestic agency. Last year, a highly critical management assessment faulted the organization for failing to establish a workable “project culture.” Bigot has gone to great lengths to get contractors, domestic agencies, and ITER staff working better together. “I want that the ITER organization and the domestic agencies are never the limiting step for contractors to deliver,” he says. Previously, work on the tokamak building had been held up because ITER staff hadn’t agreed on a final version of its design.

    The problem that the next council meeting will have to resolve is that some member states are further ahead than others in their assigned tasks for the assembly of ITER. Those that are ahead, and are closer to meeting the old schedule, don’t see why they have to fund a slower—and hence more expensive—schedule imposed on them by other partners.

    See the full article here.


  • richardmitnick 3:35 pm on October 27, 2014
    Tags: ITER

    From AAAS: “After Election 2014: FUSION RESEARCH” 




    24 October 2014
    Adrian Cho

    Should we stay or should we go? Once the voters have spoken, that’s the question Congress will have to answer regarding the United States’ participation in ITER, the hugely overbudget fusion experiment under construction in Cadarache, France. Some lawmakers say it may be time for the United States to bow out, especially as the growing ITER commitment threatens to starve U.S.-based fusion research programs. The next Congress may have to decide the issue—if the current one doesn’t pull the plug first when it returns to Washington, D.C., for a 6-week lame-duck session.

    ITER Tokamak

    For those tired of the partisan squabbling on Capitol Hill, the ITER debate may provide curious relief. ITER appears to enjoy bipartisan support in the House of Representatives—and bipartisan opposition among key senators.

    ITER aims to prove that nuclear fusion is a viable source of energy, and the United States has agreed to build 9% of the reactor’s hardware, regardless of the cost. Recent estimates suggest the U.S. price tag could be $3.9 billion or more—nearly quadrupling original estimates and raising alarm among some lawmakers. In response, this past June a Senate appropriations subcommittee proposed a budget bill that would end U.S. participation in the project next year. In contrast, the next month the House passed a bill that would increase U.S. spending on ITER.

    Some observers think the current Congress will kick the issue to the next one by passing a stop-gap budget for fiscal year 2015, which began 1 October, that will keep U.S. ITER going. “I don’t think in the end they can come out and kill ITER based on what the Senate subcommittee did,” says Stephen Dean, president of Fusion Power Associates, a research and educational foundation in Gaithersburg, Maryland. Others say a showdown could come by year’s end.

    Trouble over ITER has been brewing for years. ITER was originally proposed in 1985 as a joint U.S.-Soviet Union venture. The United States backed out of the project in 1998 because of cost and schedule concerns—only to rejoin in 2003. At the time, ITER construction costs were estimated at $5 billion. That number had jumped to $12 billion by 2006, when the European Union, China, India, Japan, Russia, South Korea, and the United States signed a formal agreement to build the device. At the time, ITER was supposed to start running in 2016. By 2011, U.S. costs for ITER had risen to more than $2 billion, and the date for first runs had slid to 2020. But even that date was uncertain; U.S. ITER researchers did not have a detailed cost projection and schedule—or performance baseline—to go by.

    Then in 2013, the Department of Energy (DOE) argued in its budget request for the following year that U.S. ITER was not a “capital asset” and therefore did not have to go through the usual DOE review process for large construction projects—which requires a performance baseline. Even though DOE promised to limit spending on ITER to $225 million a year so as not to starve domestic fusion research efforts, that statement irked Senators Dianne Feinstein (D–CA) and Lamar Alexander (R–TN), the chair and ranking member of the Senate Appropriations Subcommittee on Energy and Water Development, respectively. They and other senators asked the Government Accountability Office (GAO) to investigate the U.S. ITER project.

    This year, things appeared to come to a head. This past April, researchers working on U.S. ITER released their new $3.9 billion cost estimate and moved back the date for first runs to 2024 or later. Two months later, GAO reported that even that new estimate was not reliable and that the cost to the United States could reach $6.5 billion. Based on that report, the Senate energy and water subcommittee moved to kill U.S. ITER in its markup of the proposed 2015 budget, giving it only $75 million for the year, half of what the White House had requested and just enough to wind things down. Alexander supported the move, even though the U.S. ITER office is based in his home state of Tennessee, at Oak Ridge National Laboratory.

    ITER still has friends in the House, however. In their version of the DOE budget for 2015, House appropriators gave ITER $225 million, $75 million more than the White House request. Moreover, the project seems to have bipartisan support in the House, as shown by a hearing of the energy subcommittee of the House Committee on Science, Space, and Technology. Usually deeply divided along party lines, the subcommittee came together to lavish praise on ITER, with Representative Lamar Smith (R–TX), chair of the full committee, and Representative Eric Swalwell (D–CA), the ranking member on the subcommittee, agreeing that ITER was, in Swalwell’s words, “absolutely essential to proving that magnetically confined fusion can be a viable clean energy source.” Swalwell called for spending more than $225 million per year on ITER.

    When and how this struggle over ITER plays out depends on the answers to several questions. First, how will Congress deal with the already late budget for next year? The Senate, controlled by the Democrats, has yet to pass any of its 13 budget bills, including the one that would fund energy research. And if the House and Senate decide to simply continue the 2014 budget past the end of the year, then the decision on ITER will pass to the next Congress. If, on the other hand, Congress passes a last-minute omnibus budget for fiscal year 2015, then the fight over ITER could play out by year’s end.

    Second, how sincere is the Senate move to kill ITER? The Senate subcommittee’s move may have been meant mainly to send a signal to the international ITER organization that it needs to shape up, says one Democratic staffer in the House. The international ITER organization received scathing criticism in an independent review in October 2013. That review called for 11 different measures to overhaul the project’s management, and the Senate’s markup may have been meant primarily to drive home the message that those measures had to be taken to ensure continued U.S. involvement, the staffer says.

    Third, how broad is the House’s support for ITER? Over the past decade or so, the House has been more supportive of fusion in general, the Democratic staffer says. But some observers credit that support mainly to one person, Representative Rodney Frelinghuysen (R–NJ), a longtime member of the House Appropriations Committee. “Over the years he’s become a champion of fusion,” Dean says. “He protects it in the House.” Dean and others say that’s likely because the DOE’s sole dedicated fusion laboratory, the Princeton Plasma Physics Laboratory (PPPL), is in his home state of New Jersey (though not in his district).

    Indeed, observers say that Frelinghuysen has been instrumental in preventing cuts to the domestic fusion program proposed by DOE itself. For example, for fiscal 2014, DOE requested $458 million for its fusion energy sciences program, including $225 million for ITER. That meant cutting the domestic fusion program by about 20% to $233 million and closing one of three tokamak reactors in the United States. The Senate went along with those numbers, but House appropriators bumped the budget up to $506 million, the number that held sway in the final 2014 spending plan. But some observers speculate that Frelinghuysen might be willing to let ITER go if he could secure a brighter future for PPPL.

    PPPL Tokamak

    PPPL National Spherical Torus Experiment

    Finally, the biggest question surrounding U.S. participation in ITER is: How will the international ITER organization respond to the calls for changes in its management structure? That should become clear within months. So far, officials with U.S. ITER have not been able to produce a baseline cost estimate and schedule, in large measure because the ITER project as a whole does not have a reliable schedule. The international ITER organization has said it will produce one by next July, the House staffer says. And if the international organization doesn’t produce a credible schedule, the staffer says, “the project will be very difficult to defend, even by its most ardent supporters.”

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 6:33 pm on March 20, 2014 Permalink | Reply
    Tags: , , , ITER, , ,   

    From Oak Ridge via PPPL: “The Bleeding ‘Edge’ of Fusion Research” 

    March 20, 2014

    Few problems have vexed physicists like fusion, the process by which stars fuel themselves and by which researchers on Earth hope to create the energy source of the future.

    By heating the hydrogen isotopes tritium and deuterium to more than five times the temperature of the Sun’s surface, scientists create a reaction that could eventually produce electricity. It turns out, however, that confining the engine of a star to a manmade vessel and using it to produce energy is tricky business.

    Big problems, such as this one, require big solutions. Luckily, few solutions are bigger than Titan, the Department of Energy’s flagship Cray XK7 supercomputer managed by the Oak Ridge Leadership Computing Facility.


    Inside Titan

    Titan allows advanced scientific applications to reach unprecedented speeds, enabling scientific breakthroughs faster than ever with only a marginal increase in power consumption. This unique marriage of number-crunching hardware gives Titan, located at Oak Ridge National Laboratory (ORNL), a peak performance of 27 petaflops and the title of the world’s fastest computer dedicated solely to scientific research.

    PPPL fusion code

    And fusion is at the head of the research pack. In fact, a team led by Princeton Plasma Physics Laboratory’s (PPPL’s) C.S. Chang quadrupled the performance of its XGC1 fusion code on Titan by using the machine’s GPUs together with its CPUs, compared with the previous CPU-only version. The gain came from a six-month performance engineering effort in which the team tuned the code to take best advantage of Titan’s revolutionary hybrid architecture.

    “In nature, there are two types of physics,” said Chang. The first is equilibrium, in which changes happen in a “closed” world toward a static state, making the calculations comparatively simple. “This science has been established for a couple hundred years,” he said. Unfortunately, plasma physics falls in the second category, in which a system has inputs and outputs that constantly drive the system to a nonequilibrium state, which Chang refers to as an “open” world.

    Most magnetic fusion research centers on a tokamak, a donut-shaped vessel that shows the most promise for magnetically confining the extremely hot and fragile plasma. At the edge, the plasma constantly touches the vessel wall, losing mass and energy and reintroducing neutral particles into the plasma. As a result, equilibrium physics generally does not apply there, and the edge environment is difficult to simulate using conventional computational fluid dynamics.

    Tokamak Fusion Test Reactor (TFTR) at the Princeton Plasma Physics Laboratory. Image credit: Princeton.

    Another major reason the simulations are so complex is their multiscale nature. The distance scales involved range from millimeters (what’s going on among the gyrating particles and turbulence eddies inside the plasma itself) to meters (looking at the entire vessel that contains the plasma). The time scales introduce even more complexity, as researchers want to see how the edge plasma evolves from microseconds in particle motions and turbulence fluctuations to milliseconds and seconds in its full evolution. Furthermore, these two scales are coupled. “The simulation scale has to be very large, but still has to include the small-scale details,” said Chang.
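    The scales quoted above already hint at the cost. As a rough back-of-envelope estimate (not from the article, and ignoring the many tricks real codes use to avoid brute-force resolution), resolving millimeter-scale features throughout a meters-sized vessel over microsecond dynamics for a full second implies:

    ```python
    # Back-of-envelope cost of naively resolving the multiscale edge problem,
    # using the scales quoted in the text (mm features in a ~1 m vessel,
    # microsecond dynamics evolving over ~1 s). Illustrative numbers only.
    cells_per_dim = 1.0 / 1e-3        # 1 m vessel / 1 mm feature = 1000 cells per dimension
    spatial_cells = cells_per_dim**3  # ~1e9 grid cells in 3D
    time_steps = 1.0 / 1e-6           # 1 s of evolution / 1 us step = 1e6 steps
    print(f"{spatial_cells:.0e} cells x {time_steps:.0e} steps "
          f"= {spatial_cells * time_steps:.0e} cell-updates")
    ```

    With several unknowns and many particles per cell in a kinetic code, the tally only grows from there, which is why "the bigger the computer, the higher the fidelity."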

    And few machines are as capable of delivering in that regard as is Titan. “The bigger the computer, the higher the fidelity,” he said, simply because researchers can incorporate more physics, and few problems require more physics than simulating a fusion plasma.

    On the hunt for blobs

    Studying the plasma edge is critical to understanding the plasma as a whole. “What happens at the edge is what determines the steady fusion performance at the core,” said Chang. But when it comes to studying the edge, “the effort hasn’t been very successful because of its complexity,” he added.

    Chang’s team is shedding light on a long-known and little-understood phenomenon called “blobby” turbulence, in which clumps of strong plasma-density fluctuation flow together and carry large amounts of edge plasma around, greatly affecting edge and core performance in the DIII-D tokamak at General Atomics in San Diego, California. DIII-D-based simulations are considered a critical stepping-stone toward a full-scale, first-principles simulation of the ITER plasma edge. ITER is a tokamak reactor being built in France to test the scientific feasibility of fusion energy.


    The phenomenon was discovered more than 10 years ago and is one of the “most important things in understanding edge physics,” said Chang, adding that people have tried to model it using fluids (i.e., equilibrium physics quantities). However, because the plasma inhabits an open world, it requires first-principles (ab initio) simulations. Now, for the first time, researchers have verified the existence and modeled the behavior of these blobs using a gyrokinetic code (one that solves the fundamental plasma kinetic equations while treating the fast gyrating particle motion analytically) in the DIII-D geometry.

    This same first-principles approach also revealed the divertor heat load footprint. The divertor will extract heat and helium ash from the plasma, acting as a vacuum system and ensuring that the plasma remains stable and the reaction ongoing.

    These discoveries were made possible because the team’s XGC1 code exhibits highly efficient weak and strong scalability on Titan’s hybrid architecture up to the full size of the machine. Collaborating with Ed D’Azevedo, supported by the OLCF and by the DOE Scientific Discovery through Advanced Computing (SciDAC) Center for Edge Physics Simulation (EPSi), and with Pat Worley (ORNL), Jianying Lang (PPPL), and Seung-Hoe Ku (PPPL), also supported by EPSi, the team optimized XGC1 for Titan’s GPUs at the maximum number of nodes, boosting performance fourfold over the previous CPU-only code. This performance increase has enormous implications for predicting fusion energy efficiency in ITER.
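    For readers unfamiliar with the two kinds of scalability mentioned, they are measured differently: strong scaling fixes the total problem size and asks how close the speedup comes to the ideal as nodes are added, while weak scaling grows the problem with the node count and asks whether the runtime stays flat. A minimal sketch (all timings below are made-up sample numbers, not XGC1 measurements):

    ```python
    # Illustrative definitions of strong- and weak-scaling efficiency,
    # the two metrics quoted for XGC1. Timings are hypothetical.

    def strong_scaling_efficiency(t_base, n_base, t_n, n):
        """Fixed total problem size: ideal runtime shrinks as 1/nodes."""
        speedup = t_base / t_n
        ideal_speedup = n / n_base
        return speedup / ideal_speedup

    def weak_scaling_efficiency(t_base, t_n):
        """Problem size grows with node count: ideal runtime is constant."""
        return t_base / t_n

    # Hypothetical run: 1000 s on 256 nodes vs. 70 s on 4096 nodes (strong),
    # and 1050 s on 4096 nodes with a 16x larger problem (weak).
    print(strong_scaling_efficiency(1000.0, 256, 70.0, 4096))  # ~0.89
    print(weak_scaling_efficiency(1000.0, 1050.0))             # ~0.95
    ```

    An efficiency near 1.0 on both metrics "up to the full size of the machine" is what lets a code like XGC1 actually use all of Titan's nodes productively.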

    Full-scale simulations

    “We can now use both the CPUs and GPUs efficiently in full-scale production simulations of the tokamak plasma,” said Chang.

    Furthermore, added Chang, Titan is beginning to allow the researchers to model physics, such as electron-scale turbulence, that were out of reach altogether as little as a year ago. Jaguar, Titan’s CPU-only predecessor, was fine for ion-scale edge turbulence because ions are both slower and heavier than electrons (for which the computing requirement is roughly 60 times greater), but it fell seriously short when it came to calculating electron-scale turbulence. While Titan is still not quite powerful enough to model electrons as accurately as Chang would like, the team has developed a technique that lets it simulate electron physics approximately 10 times faster than on Jaguar.
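    The factor of roughly 60 is not arbitrary: a common rule of thumb (my gloss, not stated in the article) is that the extra cost of resolving electrons tracks the square root of the ion-to-electron mass ratio, since electron thermal speeds, and hence the time steps needed to resolve their motion, scale as sqrt(m_i/m_e). For deuterium:

    ```python
    import math

    # Rule-of-thumb check (an assumption, not from the article): the ~60x
    # cost factor for electron-scale physics follows the square root of the
    # deuteron-to-electron mass ratio, because electron thermal speeds scale
    # as sqrt(m_i/m_e) and faster motion demands finer time resolution.
    m_e = 9.109e-31   # electron mass, kg
    m_D = 3.344e-27   # deuteron mass, kg
    factor = math.sqrt(m_D / m_e)
    print(round(factor, 1))  # ~60.6
    ```

    The exact multiplier in practice depends on the numerical scheme, but the mass ratio explains why electron-scale turbulence was out of reach on Jaguar while ion-scale turbulence was not.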

    And they are just getting started. The researchers plan on eventually simulating the full volume plasma with electron-scale turbulence to understand how these newly modeled blobs affect the fusion core, because whatever happens at the edge determines conditions in the core. “We think this blob phenomenon will be a key to understanding the core,” said Chang, adding, “All of these are critical physics elements that must be understood to raise the confidence level of successful ITER operation. These phenomena have been observed experimentally for a long time, but have not been understood theoretically at a predictable confidence level.”

    Given that the team can currently use all of Titan’s more than 18,000 nodes, a better understanding of fusion is certainly in the works. A better understanding of blobby turbulence and its effects on plasma performance is a significant step toward that goal, proving yet again that few tools are more critical than simulation if mankind is to use the engines of stars to solve its most pressing dilemma: clean, abundant energy.

    See the full article here.

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.

  • richardmitnick 8:16 am on July 25, 2013 Permalink | Reply
    Tags: , , , ITER,   

    From PPPL: “PPPL’s Rich Hawryluk recognized for service to ITER international fusion project” 

    July 23, 2013
    Kitta MacPherson
    Email: kittamac@pppl.gov
    Phone: 609-243-2755


    Rich Hawryluk served as Deputy Director-General for the ITER Organization and Director of the ITER Administration Department. ITER is an international fusion experiment that is under construction in France. Hawryluk, a former deputy director of PPPL, completed a two-year assignment at ITER in April 2013. The Secretary of Energy’s Appreciation Award, signed by former Energy Secretary Steven Chu and presented by Energy Secretary Ernest Moniz, cited Hawryluk for “applying his wealth of big-science project management experience to enable the ITER project to make the transition from design phase to construction, thus helping ensure that this important international project will successfully move toward demonstrating the feasibility of fusion as a future energy source.”


    See the full article here.


  • richardmitnick 1:04 pm on March 29, 2013 Permalink | Reply
    Tags: , , , , ITER, ,   

    From PPPL Lab: “US ITER is a strong contributor in plan to enhance international sharing of prime ITER real estate” 

    March 28, 2013
    Lynne Degitz

    “When the ITER experimental fusion reactor begins operation in the 2020s, over 40 diagnostic tools will provide essential data to researchers seeking to understand plasma behavior and optimize fusion performance. But before the ITER tokamak is built, researchers need to determine an efficient way of fitting all of these tools into a limited number of shielded ports that will protect the delicate diagnostic hardware and other parts of the machine from neutron flux and intense heat. A port plug integration proposal developed with the US ITER diagnostics team has helped the international ITER collaboration arrive at a clever solution for safely housing all of the tokamak diagnostic devices.

    Iter Icon


    ‘Before horizontal or vertical modules were proposed, diagnostic teams were not constrained to any particular design space. When we started working on this, we suggested that there be some type of modular approach,’ said Russ Feder, a US ITER diagnostics contributor and Senior Mechanical Engineer at Princeton Plasma Physics Laboratory. ‘Originally, we proposed four horizontal drawers for each port plug. But then analysis of electromagnetic forces on these horizontal modules showed that forces were too high and the project switched to the three vertical modules.’”

    The proposal has been formalized by two ITER procurement agreements made in late 2012 between US ITER, based at Oak Ridge National Laboratory, and the ITER Organization; other ITER partners are expected to make similar agreements this year.”

    PPPL’s Russell Feder, left, and David Johnson developed key features for a modular approach to housing the extensive diagnostic systems that will be installed on the ITER tokamak. (Photo credit: Elle Starkman/PPPL Office of Communications)

    See the full article here.

