Aspects of the dynamics of the UK National Grid, how the grid frequency varies – and the role of ENF in forensic science
The system frequency target is normally set at 50.00 Hz. Frequency changes as the balance between demand and generation alters. The Grid controller attempts to balance the two by instructing the generators, but there is a range of frequencies, say 49.95 to 50.05 Hz, within which no instructions will be issued.
All generators and synchronous machines connected to the Grid are locked to the system frequency. If demand increases, the frequency starts to fall. There is only a relatively small amount of kinetic energy in the rotating parts of the generators, so the governor systems of the turbines driving the generators respond by very rapidly increasing the energy input to restore the balance. The governor characteristic can be set to allow a small deviation from the target frequency, rather than attempting to return to exactly 50.00 Hz, which gives a more stable system.
With a gas turbine, the fuel input is adjusted very rapidly. With a steam turbine, the energy within the boiler provides a relatively large amount of stored energy and the fuel input to the boiler (or nuclear reactor heat output) is increased after a minute or two to restore the equilibrium. So the combination of some rotational kinetic energy and some potential energy stored in the boiler pressure parts gives some flywheel effect. National Grid Co. may be prepared to give estimates of the amount of kinetic energy in the system, as they do need to know that when making stability calculations for different scenarios.
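As a rough sense of scale for that flywheel effect, the energy stored in one large rotor can be estimated from E = ½Iω². The figures below are hypothetical round numbers for illustration, not National Grid data:

```python
import math

# Back-of-envelope sketch (hypothetical figures, not National Grid data):
# kinetic energy stored in one large 2-pole turbo-generator rotor spinning
# at 3000 rpm on a 50 Hz system, E = 0.5 * I * omega^2.
moment_of_inertia = 5.0e4                # kg*m^2, assumed for a large steam set
omega = 3000.0 * 2 * math.pi / 60        # 3000 rpm in rad/s
kinetic_energy = 0.5 * moment_of_inertia * omega ** 2   # joules

# Express it as an "inertia constant" H: the number of seconds of full
# rated output that the stored energy alone could supply.
rated_power = 660e6                      # watts, assumed 660 MW unit
h_seconds = kinetic_energy / rated_power

print(f"Stored energy ~= {kinetic_energy / 1e9:.1f} GJ, H ~= {h_seconds:.1f} s")
```

A few seconds of stored full-load energy per machine is why governor action has to be so fast.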
The power required to drive a machine rises with its speed (the exact relationship depends on the machine's characteristics). When the system frequency drops, the power drawn by most rotating driven machinery drops too, which also helps to stabilise the situation.
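This self-stabilising effect is often modelled as a load-damping constant. A minimal sketch, with assumed and purely illustrative figures:

```python
# Minimal sketch of natural load damping (assumed, illustrative figures):
# demand from rotating machinery falls as frequency falls, often modelled
# as D percent of demand per percent of frequency deviation.
NOMINAL_HZ = 50.0
demand_gw = 40.0       # assumed total demand
damping = 1.5          # assumed D: % demand change per % frequency change

def damped_demand_gw(freq_hz):
    """Demand after the natural response of speed-dependent loads."""
    deviation = (freq_hz - NOMINAL_HZ) / NOMINAL_HZ
    return demand_gw * (1.0 + damping * deviation)

# A 0.5 Hz (1%) dip sheds about 1.5% of demand by itself:
print(f"{damped_demand_gw(49.5):.1f} GW")   # 39.4 GW
```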
The opposite effects take place when demand falls.
In an AC system it is not desirable to allow voltages to vary more than a small amount with demand. Voltage variations are not a normal power control mechanism (they are a reactive power control mechanism, which is a very different matter). The generator and Grid system voltages are maintained within quite close limits, otherwise the system can become unstable and parts of it fall out of synchronism, with adverse consequences.
Voltages can be reduced deliberately at the final distribution stages of the system as the first stage of deliberate load reduction, and the system can also be deliberately run at a low frequency (down to, say, 49.00 Hz) to reduce demand before having to cut customers off. We have not seen such tactics deployed very much in recent decades, but it was a regular feature before the 1970s and we may well see it again in a decade or two.
The financial consequences for energy users will depend entirely on their individual circumstances. They are only paying for energy used but they may suffer from reduced product outputs or quality control problems. Only individuals can answer that.
The fundamentals of AC grid systems everywhere are the same so all will have very similar characteristics to ours provided that there is not a shortage of generating capacity to meet demand.
An ex-senior engineer in the UK National Grid who works for http://www.carboncatalysts.com/index.html
Unfortunately, it is pretty complicated, and one of the intellectual achievements of the 1930s was the development of fully mathematical models of the Theory of Synchronous Machines. This remains a vital (and difficult) part of electrical engineering degrees, so what follows is still a major simplification.
It is helpful to consider the position when the load increases. This is always at a particular point on the network, and the extra load will create voltage reductions, adding a little bit of torque to the nearest generator(s). This slows it (them) down, but this in turn slightly changes the phase of the voltage on the transmission network. This in turn adds a bit of torque to other generators, which, in turn, give up a bit of speed. The generators are “synchronised” by all the small changes in phase and voltage, so, as you say, the overall frequency of the system drops a little bit.
As you say, the energy comes in the very short term, from the inertia of the generators. My understanding is that today, in the UK, the inertial energy in the generators, if it could all be used, would keep the grid going for about 8 seconds.
On large grids, when there is a transfer of power from one part of the system to another, the “load surplus” bit phases itself slightly ahead of the load shortage bit. As would happen when you shift torque through a spring. So as load changes, there are also small variations in the phases of all the lines. It happens very fast, so within about 100ms of a load change the system will have steadied its phase position sufficiently for a simple frequency sensor to detect the change.
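The "spring" picture can be made concrete with the classic transfer equation for a lossless line, P = (V1·V2/X)·sin δ: the larger the phase angle between the two ends, the more real power flows. The line figures below are illustrative assumptions:

```python
import math

# Sketch of the "torque through a spring" analogy (illustrative figures):
# real power over a lossless AC line rises with the phase angle between
# its ends, P = (V1 * V2 / X) * sin(delta).
V1 = V2 = 400e3        # volts, a 400 kV transmission line
X = 50.0               # ohms, assumed line reactance

def transfer_mw(delta_deg):
    """Real power transferred for a given end-to-end phase angle."""
    return V1 * V2 * math.sin(math.radians(delta_deg)) / X / 1e6

for angle in (5, 10, 20, 30):
    print(f"delta = {angle:2d} deg -> P ~= {transfer_mw(angle):5.0f} MW")
```

This is why a load change shows up, very quickly, as a small shift in phase everywhere on the network.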
When the frequency drops, some loads also reduce. All motors, for example, will slow down a bit.
(Aside. These vibrations are potentially unstable, and the system as a whole can get into an oscillatory state. Think of it as a huge jelly wobbling a bit whenever there is a change. If this wobble is not damped, you can get resonance, so the changes to the frequency of parts of the grid start getting into antiphase with changes to other parts of the grid. The result is that lines connecting the two parts of the grid start carrying huge transient currents, and then break. This is what happened in the big European split.)
When all is well, this means that the frequency is a system wide signal, and, along with its rate of change, gives an indication of the instantaneous state of the system.
Unfortunately, when there is a frequency drop, most generators will slow down, and this will reduce their output. There is a natural positive feedback, and, unless corrected, this will lead to a self-sustaining collapse.
The necessary negative feedback is provided by generators (when their control system is set to a “frequency sensitive” mode) whereby any drop in frequency demands an INCREASE in output. The relationship between frequency drop and output increase is set by a parameter known as droop. A droop of 4% implies that a drop of 4% in frequency demands a 100% increase in output.
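The droop relationship is simple to state numerically. A sketch with assumed figures (real governor settings differ unit by unit):

```python
# Sketch of the droop relationship described above (assumed figures):
# a 4% droop means a 4% fall in frequency (2 Hz on a 50 Hz system)
# demands 100% of rated output.
NOMINAL_HZ = 50.0

def droop_response_mw(freq_hz, droop=0.04, rated_mw=500.0):
    """Extra output a frequency-sensitive governor demands at freq_hz."""
    deviation = (NOMINAL_HZ - freq_hz) / NOMINAL_HZ
    return (deviation / droop) * rated_mw

# A 0.1 Hz dip (0.2% of nominal) on 4% droop asks for 5% of rating:
print(f"{droop_response_mw(49.9):.1f} MW extra")   # 25.0 MW extra
```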
Voltage is only affected in the areas where the load has increased. Voltage change is a local phenomenon, and what counts as “local” depends on which level of the system you are involved with. The main point is that voltage is NOT system wide, so you cannot use it to derive information about the whole system. Voltage is affected by all sorts of things, and it is adjusted using “reactive load” – AKA “imaginary power”, as in the imaginary part of a complex number.
If a drop in frequency is not corrected, then the frequency will continue to fall, notionally indefinitely. However, all sorts of safety features start operating using both the frequency and its rate of change, and cut off loads (such as cities) at various frequencies. If the frequencies and loads are well chosen, then it will balance the shortage of generation, but if they are not well chosen, they may reduce load too far, and the frequency will rise again, and this time, the generators will cut off to prevent the frequency getting too high. A cascade leading to complete disconnection.
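The staged disconnection described above can be sketched as a simple lookup. The trip frequencies and block sizes below are hypothetical, not actual relay settings:

```python
# Illustrative sketch of staged low-frequency demand disconnection.
# The trip frequencies and block sizes are hypothetical, not actual
# relay settings.
SHED_STAGES = [        # (trip frequency in Hz, % of demand disconnected)
    (48.8, 5.0),
    (48.75, 5.0),
    (48.6, 10.0),
    (48.4, 10.0),
]

def total_shed_percent(freq_hz):
    """Demand disconnected once frequency has fallen to freq_hz."""
    return sum(pct for trip_hz, pct in SHED_STAGES if freq_hz <= trip_hz)

print(total_shed_percent(48.7))   # stages at 48.8 and 48.75 tripped -> 10.0
```

If the stages are poorly sized relative to the actual generation shortfall, the shed load can overshoot, and the frequency then rises instead.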
When the system becomes unstable, the carrying capacity of AC lines can fall. One can get weird interactions of capacitance and inductance that mean no “real” power is transferred. When this happens, you can get a “voltage” collapse, but this is a different phenomenon from frequency collapse, and may not arise. One can also get voltage collapse even without a frequency collapse.
Voltage drops are quite common, and thunderstorms, which can trigger short (~200ms) disconnections of transmission lines can cause voltage drops across large areas of the grid.
Whether voltage drops and other intermittent aberrations do any costly harm depends upon what the electricity is being used for. You might see a flicker on your lights, but your machine (say printing newspapers) may trip off, and take minutes or hours to start up again. This is costly. So the impact of electricity quality on end use is highly variable, and sometimes contentious. An expectation of good quality can save you quite a lot in extra UPS and protection systems.
My understanding is that flywheel systems are more often used to protect against voltage collapse (often triggered by loss of a transmission line) rather than frequency collapse. This seems to arise most often in metropolitan areas (like NY), where it has proved hard to get enough transmission lines, and the local distribution systems are antique.
For frequency, NG has a set of statutory and advisory limits to changes in frequency. It is these limits that set the volume of frequency response they have to buy. Broadly, in extremis, they can drop to 49.2 Hz or rise to 50.08 Hz, and in a crisis can go beyond this. Loss of a major generator (say 1 GW) can cause the frequency to drop to nearly 49.2 Hz within about 10 seconds. Depending on the load and the active generator fleet, each change of 1 Hz relates to about 1 GW.
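The "1 GW loss, nearly 49.2 Hz within about 10 seconds" figure can be sanity-checked with the standard initial rate-of-change-of-frequency estimate from the swing equation. The system constants below are assumptions, not NG data:

```python
# Rough check of the figure above using the standard initial-RoCoF
# estimate, df/dt = -dP * f0 / (2 * H * S).  H and S are assumed.
f0 = 50.0          # Hz, nominal frequency
loss = 1.0         # GW lost when a major generator trips
S = 40.0           # GW of synchronised capacity (assumed)
H = 5.0            # s, average inertia constant (assumed)

rocof = -loss * f0 / (2 * H * S)            # Hz per second
seconds_to_49_2 = (f0 - 49.2) / abs(rocof)  # governor response ignored

print(f"Initial RoCoF ~= {rocof:.3f} Hz/s")
print(f"At that rate, 49.2 Hz is reached in ~{seconds_to_49_2:.0f} s")
```

In practice governor response slows the fall well before the limit, which is exactly the frequency response NG has to buy.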
The rate of change depends on the relationship between the total load and the largest generator. Large generators in small grids make the system more frisky, and Ireland has one of the more frisky grids, as it is small, but has big gensets. Europe and the US grids generally have much larger loads, but their gensets are not that much bigger than the UK’s. So their frequency is usually more stable, even in the face of genset failures.
If the UK gets some EDF nukes, these have a capacity of some 1.6–1.8 GW. This is ~25% bigger than Sizewell, so the system will have to buy an extra 25% of frequency response to cover this contingency. It is not clear who will pay for this.
David Hirst (member of Claverton Energy Group)
Note added from the Metropolitan Police:
To give you the background of my interest: I am a forensic audio scientist working for the Metropolitan Police, and I am writing a paper concerning the Electric Network Frequency (ENF) Criterion, which relates to a relatively new forensic technique. ENF is based on extracting recorded mains hum from evidential digital audio/video recordings. From the extracted ENF data it is possible to establish the date and time the recording was created. The technique is based on the fact that the mains frequency is synchronous across the UK and has a unique frequency-deviation pattern over relatively short periods of time, resulting from the complex supply and demand behaviour of the grid. By comparing ENF patterns with a known database of ENF data, we can establish the date and time of a recording, or even whether a recording has been edited. The article written by your colleagues on the Claverton web site provides a good insight into how and why the National Grid frequency is synchronised and fluctuates about its 50 Hz mean value.
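The extraction idea can be shown with a toy, self-contained sketch. Here the "recording" is a synthetic hum and the drift values are invented; a real ENF system works on actual evidential audio and matches against a reference database:

```python
import math

# Toy ENF illustration (all figures invented): synthesise a mains hum
# whose frequency drifts second by second, then recover that drift by
# estimating the hum frequency in one-second windows.
FS = 1000                                  # samples per second
drift = [50.02, 49.97, 50.05, 49.99]       # hypothetical ENF, one value/s

samples, phase = [], 0.0
for f in drift:                            # phase-continuous synthesis
    for _ in range(FS):
        phase += 2 * math.pi * f / FS
        samples.append(math.sin(phase))

def window_freq(window):
    """Frequency from interpolated positive-going zero crossings."""
    t = [(i + (-a) / (b - a)) / FS
         for i, (a, b) in enumerate(zip(window, window[1:]))
         if a < 0.0 <= b]
    return (len(t) - 1) / (t[-1] - t[0])

pattern = [window_freq(samples[i:i + FS]) for i in range(0, len(samples), FS)]
print([round(f, 2) for f in pattern])      # recovers the drift pattern
```

The recovered per-second sequence is the "fingerprint" that would be matched, sample by sample, against a logged database of actual grid frequency.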
As a further question, do you know when it was written (the year will do)?
Digital & Electronics Forensic Service (DEFS)