Limits to the costs of backup for variable generation
We have all the tools we need to manage wind penetrations of up to 70%, so it’s not a question of “if”, only of “which combination”. In this post, I explore some of the costs of those tools, and find a limit to those costs.
The UKERC TPA report on intermittency was primarily a literature review and synthesis. Its cost estimates of 0.1p-0.15p/kWh are valid to 20% penetration, but it does cover papers that go beyond that.
Various writers, including Poyry, Henrik Lund, David Milborrow, and Mark Barrett, have looked at substantially higher penetrations, and some of their work is included in the UKERC TPA study. Several of the studies reviewed in the TPA looked at penetrations beyond 20% (see e.g. Table 3.6 in the final report).
It gets tricky when looking across papers, though, because different authors use different definitions of penetration level: something the TPA study handles admirably by standardising on % of total system energy. At least one study reviewed goes out to 45% penetration (their ref 132b, Milborrow 2001), which put intermittency costs at around 0.3p/kWh at that level.
It is possible to cap estimates of the upper cost of intermittency by assuming capacity backup of 120% from OCGT. At that level, if you take, say, 50GW mean electricity demand (438TWh/y), and have enough demand-side management to smooth demand to a 50GW flat line, then in extremis you might want 60GW of OCGT, with build costs of around £20bn. Assuming a plant lifetime of ten years, your cost of capacity credit would be less than 0.5p/kWh across all the energy on the grid. That assumes no interconnectors, no capacity credit from wind, and no other generation on the system; so it’s an over-estimate of the upper-limit cost.
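The arithmetic above can be checked with a short back-of-envelope calculation. The build cost of around £330/kW for OCGT is an assumption implied by the £20bn / 60GW figures; the other inputs are as stated in the post.

```python
# Back-of-envelope cap on the cost of capacity backup.
# Assumptions (from the post): 50 GW flat mean demand, 120% OCGT backup,
# ~£330/kW OCGT build cost (implied by £20bn / 60GW), 10-year lifetime,
# no interconnectors, no capacity credit from wind, no other plant.

mean_demand_gw = 50
annual_energy_kwh = mean_demand_gw * 1e6 * 8760   # 50 GW in kW, times hours/year

backup_gw = 1.2 * mean_demand_gw                  # 60 GW of OCGT
build_cost_per_kw = 330                           # £/kW (assumed)
total_build_cost = backup_gw * 1e6 * build_cost_per_kw  # ~£19.8bn

lifetime_years = 10
capacity_cost_p_per_kwh = (total_build_cost / lifetime_years
                           / annual_energy_kwh * 100)  # pence per kWh

print(f"Annual energy:  {annual_energy_kwh / 1e9:.0f} TWh")
print(f"Build cost:     £{total_build_cost / 1e9:.1f}bn")
print(f"Capacity cost:  {capacity_cost_p_per_kwh:.2f} p/kWh")
```

Spreading roughly £2bn/year across 438TWh/y gives about 0.45p/kWh, comfortably under the 0.5p/kWh cap.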
There’d also be the cost of gas, and other O&M costs, for when the plants were running, but that’s an energy cost rather than a capacity cost. To find out how much that would be, you’d need a system model.
I think many would argue that we can expect a system with higher penetrations of wind to look and function quite differently from one with relatively low penetrations of wind.
The future grid will look and function quite differently from the past grid, regardless of whether wind provides 70% of our energy, or only 40%. Bear in mind that the Committee on Climate Change have made it clear that we need to have decarbonised the grid to below 50g CO2e / kWh by 2030, so wind will inevitably be our largest single energy contributor by 2030, because it’s the only clean tech that can reach that scale in that time frame.
Most notably, the previous pattern of having 99% of the balancing done on the supply side will have to go, whatever the future grid looks like. It’s an economic and structural aberration, the result of technological and institutional inertia. The demand side will be quite capable of providing more than half the balancing, at time-scales of seconds to 24 hours.
Relying on thermal plants being available to balance renewables and nuclear in the future means that someone needs to sort out how investors get a return on that plant, so that it actually gets built.
Auctioned payments for capacity credit aren’t a great way to do it, but they should be effective. Another way is to let the market do it, if you’re prepared to let price discovery find the value of energy security: take an extreme case where prices threatened to rise to, say, £200/MWh for 250 hours per year; that’s a potential revenue of £50k/MW/year from those hours alone. At a build cost for OCGT of £330k/MW, that might start to look like a decent investment. Some combination of central co-ordination and the market can balance allocation between demand-side response, interconnectors, pumped hydro, flow batteries and dispatchable plant.
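The scarcity-pricing case above amounts to a simple payback calculation. This sketch uses only the figures given in the post, and ignores fuel, O&M and financing costs, so it overstates how quickly the plant pays for itself:

```python
# Scarcity-hour revenue versus OCGT build cost, using the post's figures.
# Ignores fuel, O&M and cost of capital: a rough sketch, not a business case.

scarcity_price = 200       # £/MWh during scarcity hours
scarcity_hours = 250       # hours per year at that price
build_cost = 330_000       # £/MW build cost for OCGT

annual_revenue = scarcity_price * scarcity_hours   # £/MW/year from those hours
simple_payback_years = build_cost / annual_revenue

print(f"Scarcity revenue: £{annual_revenue:,}/MW/year")
print(f"Simple payback:   {simple_payback_years:.1f} years")
```

A simple payback of under seven years, from 250 hours of scarcity pricing alone, is what makes the investment case start to look plausible.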
In summary, wind penetrations of 40%-70%, measured as a proportion of annual energy delivered, can be backed up at a capacity cost of at most 0.5p/kWh across all the energy on the grid.