The world's largest nuclear fusion experiment, ITER, has been mothballed barely a year after being switched on.
“It’s unlikely we’ll ever achieve ignition,” said the project’s acting chief scientist, “and without substantial budget increases, it’s not worth doing anything else.”
The International Thermonuclear Experimental Reactor (ITER) project began in 2006 but took almost 20 years to reach the point where it could be activated. After decades of delays and cost overruns, the project's international sponsors had little appetite for further open-ended experimentation.
First Russia, then the US and Japan pulled out of the funding group, leaving the EU, chiefly France and Germany, to shoulder the bulk of the financial burden on its own.
Faced with another 12-15 years of expanding budgets just to learn whether nuclear fusion in a tokamak reactor could deliver net energy to the grid, the EU decided the wisest course of action was simply to pull the plug.
In one respect the project has been a great success. It has shown that you can keep a gaggle of scientists gainfully employed (although at great cost) for many years, chasing an elusive dream.
Now that reality has bitten, hard, the obscenely expensive structure may yet earn some euros for its owners as a tourist destination: an obsolete curiosity from a bygone age.
What the ITER designers failed to foresee was that other technologies, such as solar power and bio-chemical energy sources like bacterial fuel cells, would advance exponentially, making the enormous capital outlay of fusion research simply uneconomic.
As any Silicon Valley venture capitalist will tell you, there's nothing wrong with failure – but fail fast, and move on to something more promising. Fusion power failed to fail fast enough, and now it's doomed never to get another chance.