Saturday 18 December 2021

The Folly of Renewables


All economics is ultimately about energy: about how much energy you have to put into something and how much energy you get out of it.

Take, for instance, a farm, most of the energy for which, of course, comes from the sun, which not only warms the soil, allowing seeds to germinate, but drives the weather, delivering water in the form of rain, while providing sunlight for photosynthesis, thus enabling plants to grow. The farmer, however, still has to put some of his own energy into it. He has to plough his fields ready for sowing, tend his crops to ensure they are not overgrown by weeds, and finally he has to bring in the harvest, the calorific value of which will hopefully be sufficient to feed him and his family throughout the winter and provide them with enough energy to undertake another cycle of ploughing, sowing, tending and harvesting the following year.

With any luck, it may also provide him with a surplus: a unit of stored energy in the form of an extra sack of grain which he can exchange, for instance, for a new plough: one of an improved design which will allow him to cultivate more land without having to expend any more energy, thereby enabling him to harvest even more surplus units of stored energy in future.

Meanwhile, the blacksmith who made the plough, and may well use some of the grain he received in payment for it to fuel his own energy needs, may also trade part of it for more iron, which he can then use to make more ploughshares: a process which, on the face of it, does not appear to have any energy as an output but which can be seen slightly differently if one takes into account the fact that it was the farmer’s new plough which allowed him to increase the efficiency of his own energy output so as to cultivate more land and hence produce more units of stored energy. Looked at from this perspective, therefore, tools which help us use energy more efficiently can be seen as both harnessers of energy and stores of the energy that went into making them.

In fact, the harnessing or extraction of energy, and its use to produce tools or technologies which allow us not only to harness this energy more efficiently but to exploit new sources of energy as well, is at the heart of almost every economy, and explains why economic and industrial development tends to happen in stages, with certain sources of energy only becoming available to an economy once the previously harnessed source has enabled it to develop the technology required to extract energy from the new source. Thus most primitive societies start their economic journey by using only the most readily available and abundant sources of energy in their environment, such as dead wood collected from the forest floor, from which energy can be extracted fairly easily by the simple application of fire. While wood fires produce enough energy for cooking and keeping oneself warm, however, they do not generate sufficient heat to render metals malleable enough to work, let alone to smelt them in the first place. For this, someone first has to stumble upon the process of making charcoal, along with the further process of using the more concentrated energy which charcoal can produce to extract metals from ores: a fairly massive leap which almost certainly had a number of intermediate steps, though we can only guess at what they were.

Once our now not-so-primitive society has acquired the ability to make metal tools, however, everything becomes a lot easier. For they are now able to construct machines for harnessing energy from at least two further sources – wind and flowing water – which would have led to a whole slew of further advances. For in order to build wind and water mills, they also had to develop the kind of rotating cog technology that has been incorporated into just about every mechanical device ever since. This enabled them to use their mills not just for milling grain – which was almost certainly the purpose for which they were initially built – but for a whole range of new industrial processes, from spinning yarn for weaving to felting wool for making hats, from sawing timber into planks to mashing wood pulp for making paper. In fact, the list of new or improved products to which water mills gave rise is almost endless and led, in itself, to the exploitation of yet another source of energy: coal.

I say this because while water mills were extremely efficient, their dependence on suitable locations beside streams or rivers meant that they could not fulfil the demand for industrially manufactured goods which they had themselves created. That demand, however, along with the invention of the steam engine and the fact that coal produces around twice as much energy as wood, now made it economically viable to expend the additional energy required to extract this new source of energy from the ground: a step-change in the economic environment which, in turn, drove the further development of steam engine technology, the source of the energy and the technology for harnessing it thus being inextricably linked.

In fact, one might almost describe the relationship between the two as reciprocally adaptive, a term which, like evolution, one would normally reserve for developments in the natural world, but which nevertheless seems appropriate in describing an economy in which the application of steam engine technology to new industrial processes was constantly being extended to exploit the power of coal, while the business of mining coal was continually expanded to feed the hungry beast it had thus unleashed. What is truly remarkable, however, not just about the age of steam, but about our whole history of industrial and economic development, is that while there is a fairly obvious logic to the order in which developments occurred – which, with the benefit of hindsight, one might almost call predictable – it would be hard to make the case that any of these developments were in any sense planned, each step along the way coming about rather as a result of mostly very ordinary people making discoveries, solving problems and seeing opportunities.

Being thus an essentially bottom-up organic process, rather than a top-down directed one, what this also tells us – and what we know from other perspectives upon history – is that, throughout most of our history, industrial and economic development has largely been the preserve of those engaged in it, rather than those in power. For while the latter group, especially Europe’s ancient aristocracy, may have historically owned large parts of the economy – particularly the agrarian economy – and jealously controlled the distribution of its surpluses, they seldom had any interest in directly managing it, very often regarding themselves as above such menial employment. Largely left to get on with it on their own, therefore, those who did actually engage in mining, manufacturing and trade seldom made business decisions for anything other than purely economic reasons, thereby rendering economic development largely independent of politics.

In fact, it has only been in the last century or so that we have become accustomed to third parties in the form of governments intervening in the economy for non-economic reasons, and it has only been in the last sixty or seventy years that any of these interventions have involved legislation concerning which sources of energy the public should be allowed to use. Despite London having been subject to dense fogs for centuries – notoriously known as ‘peasoupers’ – it wasn’t until 1956, for instance, that the British government passed the first of several ‘Clean Air’ acts, which banned the burning of untreated coal in urban areas, thereby significantly reducing the number of chronic bronchial disorders from which Londoners had traditionally suffered. In a similar vein, it wasn’t until the year 2000 that, after gradually phasing it out, the UK finally banned leaded petrol due to the neurotoxicological effects it was found to have, particularly on children. In both of these cases, however, the problem was already upon us and clearly evident before action was taken. Moreover, the solutions were readily available, had almost immediate effect and were entirely within the scope and responsibility of the British government, or would be today.

This is in marked contrast, therefore, to the steps which governments throughout the western world have been taking over the last decade or so to phase out a whole class of energy sources, along with the technologies for harnessing them, not for any immediate health reasons – and certainly not for any economic ones – but to prevent a problem which may or may not occur at some unspecified date in the future, the exact nature of which is also unclear. What makes this even more remarkable, however, is the fact that the governments which have committed themselves to voluntarily sacrificing their countries’ economic interests in this way cannot actually solve the problem alone, but need the participation of other governments which have made no such commitment and are significantly greater contributors to what is believed to be the problem’s underlying cause.

Nor is even this the full extent of this extraordinary aberration. For in addition to being, in all probability, an entirely futile gesture, those governments which have made this commitment have done so with what, at best, can only be a very partial understanding of the economic consequences of giving up sources of energy on which they are still dependent, not least because in many areas of the economy it is very difficult to see what alternative sources of energy could be used or how the transition to them might be achieved.

Take shipping, for instance, where the only feasible replacement for large internal combustion engines would appear to be nuclear reactors, as already used in submarines and arctic ice breakers. Given the risks inherent in taking what is currently still a largely military technology and handing it over to commercial shipping companies, however – piracy and terrorism being only the most obvious – and the probable need, therefore, to at least partially militarise the world’s merchant fleets, as well as the ports at which they dock, one has to ask the question as to whether the whole practice of transporting goods halfway around the world from factories in Asia to consumers in Europe would still be viable, or whether manufacturers would be forced to build their factories closer to their markets, thus effectively bringing an end to globalism.

Not, of course, that I know the answer to this question. Indeed, it is unlikely that anyone does, making it also very unlikely that anyone would ban conventional cargo vessels until a viable way forward had been found. For no matter how confident one might be that, having issued a top-down directive, the fundamentally adaptive nature of economics would reassert itself and that someone somewhere would come up with a solution, to start ordering the burning of one’s boats before that solution had at least been tested would be decidedly premature. After all, we didn’t start phasing out steam trains until the reliability and cost-effectiveness of diesel locomotives had been fully established. And yet this is precisely what governments throughout the west would appear to be doing with respect to the most fundamental component of any advanced nation’s economic infrastructure: its electricity grid. For not only have we already started decommissioning perfectly good coal, oil and gas fired power stations before the viability of the alternatives has been demonstrated, but we have actually been trialling these alternatives long enough to know that their viability is very much in question.

Indeed, in many cases, it is their lack of viability, along with their lack of suitability for playing any serious part in a nation’s electricity supply, that has actually been demonstrated, leaving many national grids in a very precarious position, as can be seen, for instance, in the case of the UK, where, according to the government’s own ‘Digest of UK Energy Statistics’ – the latest edition of which, DUKES 2021, can be found here – the 329,906 GWh (gigawatt hours) of electricity consumed by the UK in 2020 was supplied by an increasingly unreliable mixture of sources, as shown in Table 1.

 Table 1: UK Generation of Electricity 2020

The first thing you will probably note from this table is that coal and oil have already been more or less eliminated from the UK’s energy mix. In fact, we only have two coal-fired power stations left, both of which are scheduled to be decommissioned and, indeed, demolished in 2022. This will leave just natural gas and nuclear power stations as the only conventional sources of energy still supplying power to the grid in any significant quantities, with nearly half of our electricity being generated by renewables, of which wind power is currently the most significant and is likely to remain so for the foreseeable future. I say this because, if the UK is eventually going to phase out natural gas as well as oil and coal, as would appear to be the plan – licences to exploit new gas fields in the North Sea having been effectively halted – then apart from nuclear power, there are not that many other sources of energy from which any significant amounts of additional electricity can be generated.

Hydroelectric power, for instance, requires dammed rivers and flooded valleys, and most of the potential sites for this kind of engineering have long since been exploited. The potential for further hydroelectric power stations is therefore extremely limited, especially as nearly all of the most suitable sites are in national parks, where their development would be strenuously opposed by environmentalists.

This is also true in the case of wave power. For despite the UK’s 19,717 km of coastline, constructing wave barrages along almost any section of it raises concerns about marine habitat. Even more importantly, their viability is yet to be proven. For while a number of experimental projects have demonstrated that the technology works, none of them have been running long enough to test their durability in what is invariably a challenging environment. As a result, it is still not known how much maintenance they require over their lifespan or what that lifespan actually is: an issue which is extremely important when it comes to the cost of operating almost any offshore installation, as we shall discover shortly with respect to wind farms.

Not, of course, that there are any such problems with biomass power stations, which have already shown themselves to be commercially viable. The problem here, however, as recently pointed out by a number of national newspapers with respect to the Drax power station in North Yorkshire, is that, because wood produces only half the energy of coal, the burning of biomass, mostly wood pellets imported from North America, produces proportionally more CO2 than coal per unit of electricity generated. Worse still, the rate of deforestation occurring in North America as a result of cutting down trees to produce the wood pellets – mostly for the American market, where biomass power stations produce the largest proportion of electricity generated from renewable sources – means that this resource will be exhausted long before 2050, by which time the British government has committed itself to being carbon neutral, meaning that biomass would have to be phased out by then anyway.

Other than wind and nuclear, therefore, this just leaves solar as the only other available source of power that could replace fossil fuels. Due to our weather, however, solar panels are far less productive in the UK than in some other places, giving them a significantly lower ‘derating’ than any other source of energy on the national grid: the term ‘derating’, for those unfamiliar with the inner workings of the electricity industry, being one I should probably explain.

As you will have noticed, all the figures provided in Table 1 are stated in terms of gigawatt hours (GWh). These denote volumes or amounts of electricity. The capacity of each generator, and of the grid as a whole, however, is defined in terms of how much electricity it can supply at any one moment and is measured in gigawatts (GW), the relationship between the two being that a generator with a capacity of 1 GW, operating for one hour, will nominally produce 1 GWh of electricity. The nominal output from a generator, however, may not always be the same as its actual output, especially if, for whatever reason, it is not always available.

From the data provided in DUKES 2021, this can be most easily demonstrated with respect to wind turbines, of which there are currently 11,006 installed on the national grid, 8,709 onshore and 2,297 offshore, the smaller, onshore machines having an average nominal generating capacity of 1.59 MW (megawatts), the larger, offshore machines having an average nominal generating capacity of 4.56 MW. If all 11,006 turbines had been operating 24 hours a day, 365 days a year throughout 2020, this would therefore have produced a total of around 213,185 GWh (gigawatt hours) of electricity. As you can see from Table 1, however, they actually produced only around 66,390 GWh, representing a derated capacity of 31.14%.
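
For those who would like to check this arithmetic for themselves, the following few lines of Python reproduce it using the averages quoted above; the small difference from the figures in the text is simply down to rounding of the average capacities.

# Back-of-the-envelope check of the wind derating figure, using the DUKES 2021 averages quoted above.
onshore_turbines, offshore_turbines = 8_709, 2_297
onshore_mw, offshore_mw = 1.59, 4.56               # average nominal capacity per turbine (MW)

nominal_mw = onshore_turbines * onshore_mw + offshore_turbines * offshore_mw
potential_gwh = nominal_mw * 24 * 365 / 1_000      # output if every turbine ran flat out all year
actual_gwh = 66_390                                # approximate 2020 output from Table 1

print(f"nominal capacity: {nominal_mw / 1_000:.2f} GW")          # ~24.32 GW
print(f"potential annual output: {potential_gwh:,.0f} GWh")      # ~213,000 GWh (the text's 213,185 GWh)
print(f"derated capacity: {actual_gwh / potential_gwh:.2%}")     # ~31%, i.e. the 31.14% quoted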

This is partly because wind turbines have a fairly narrow operating window in terms of wind speed, with the rotor blades having to overcome inertia before they can start turning at all. Depending on their design and the weight of the materials employed in their construction, this usually happens at around 15 mph. In addition to requiring a minimum wind speed in order to get going, however, all wind turbines also have an upper wind speed tolerance, which is partly determined by the need to prevent the turbine being damaged in high winds, but is also inherent in the need to maintain a constant, rather than fluctuating, output to the grid. As a result, the most common design of turbine is equipped with a servo motor which turns the rotor blades more or less into the wind depending on wind speed, with the result that, by the time the wind speed reaches around 55 mph, most turbines have turned themselves completely sideways on to the wind, thereby effectively turning themselves off.
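
For what it is worth, this operating window can be sketched in a few lines of code. The 15 mph and 55 mph figures are those quoted above; the 30 mph ‘rated’ speed and the cubic ramp up to it are simplifying assumptions of my own rather than any manufacturer’s published power curve.

# A very rough sketch of a turbine's operating window, not any particular machine's power curve.
def output_fraction(wind_mph, cut_in=15.0, cut_out=55.0, rated=30.0):
    """Fraction of nominal capacity produced at a given wind speed (illustrative only)."""
    if wind_mph < cut_in or wind_mph >= cut_out:
        return 0.0                      # too slow to overcome inertia, or turned out of the wind for safety
    if wind_mph >= rated:
        return 1.0                      # at or above the assumed rated speed: full nominal output
    return (wind_mph / rated) ** 3      # below rated speed, power rises roughly with the cube of wind speed

for speed in (10, 20, 30, 45, 60):
    print(f"{speed} mph -> {output_fraction(speed):.0%}")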

Even with a derating of 31.14%, however, wind turbines in the UK still out-perform solar panels, which have the inherent handicap of only operating during the daytime, thus giving them an automatic derating of 50%. In the UK, however, where we have snow, frost, fog and predominantly overcast skies, the derating is brought down even lower, to between 20% and 25% depending on the actual location. Even more importantly, however, what this also does is substantially reduce their potential viability. For the manufacture, installation and maintenance of a solar farm costs the same whether it’s sited in Catalonia or Cumbria. You get a lot more electricity from it, however, if it’s sited in Catalonia. What’s more, land in the UK – especially in or around populated areas where the electricity is consumed, an issue to which I shall also return shortly – is very expensive and highly regulated as to how it is used. Anyone owning such land, therefore, would be far better off building much-needed houses on it, with solar panels on the roofs for private supply, than a solar farm connected to the grid.

What all this means, therefore, is that, if the UK is going to phase out all fossil fuels by 2050, along with all of its existing biomass power stations, the only replacement forms of generation it realistically has available to it are nuclear power and wind power, with wind power currently being the preferred option. This leaves us with the very simple question as to how many additional wind turbines we would have to install, and on what timescale, to achieve net zero carbon emissions by the declared deadline.

If the question is simple, the answer, however, is less so. For it depends upon the mix of onshore and offshore wind farms to be installed. If, for instance, we were to install the additional turbines in the same proportions of offshore to onshore as are currently installed, the number comes out at 29,244: 23,141 onshore, 6,103 offshore, bringing the total installed base to 40,250, or some 366% of the current figure. It is highly unlikely, however, that these are the proportions that will be maintained, not least because of the problem of attenuation.
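
Again, for anyone who wishes to check the numbers, the split falls out as follows, taking the 29,244 as given and the current turbine counts from DUKES 2021:

# The mixed-fleet scenario: additional turbines split in the current onshore/offshore proportions.
existing_onshore, existing_offshore = 8_709, 2_297
existing_total = existing_onshore + existing_offshore

additional = 29_244
additional_onshore = round(additional * existing_onshore / existing_total)
additional_offshore = additional - additional_onshore
new_total = existing_total + additional

print(f"additional onshore:  {additional_onshore:,}")                    # ~23,141
print(f"additional offshore: {additional_offshore:,}")                   # ~6,103
print(f"total installed base: {new_total:,}")                            # 40,250
print(f"relative to the current base: {new_total / existing_total:.0%}") # ~366%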

Again, this is probably something I need to explain. And the easiest way for me to do so is to refer you once again to DUKES 2021 and ask you to scroll down to the second page, where you will find a flow chart showing the sources and amounts of electricity generated on the left and the destinations and amounts of the electricity consumed on the right. I would also ask you to do a quick calculation of the totals in each column, which, you will notice, are not the same. In fact, the amount of electricity consumed is less than half the amount of electricity generated, the difference being accounted for by a massive outflow in the middle of the chart labelled ‘Conversion, Transmission and Distribution Losses’, showing that more than half of all the electricity generated in the UK is used or lost within the grid itself, the biggest cause of loss being the distance between the generator and the consumer, i.e. attenuation.

This is particularly a problem for onshore wind farms, which are best located on windy hilltops. The consumers of the electricity they generate, however, are mostly located in urban areas, which are mostly built in valleys, like the Thames Valley, or on open plains, such as the Vale of York. You can build all the wind farms you like in the Highlands of Scotland, therefore, but by the time the electricity generated reached Glasgow or Edinburgh, it would be so attenuated as to be hardly noticeable. Besides which, the government has already decreed that most if not all new licences for wind farms will go to offshore installations.

For simplicity and the purposes of this exercise, therefore, let us suppose that all the new wind turbines are going to be offshore and of the larger variety, which, during 2020, generated, on average, a total of 12,426 MWh each. If we further assume that this average annual output will be maintained, this means that the number of new turbines required would be 14,195, bringing the total number of offshore wind turbines installed to 16,492: a target which doesn’t initially seem to be overly daunting but looms larger the more one considers what is actually entailed in building each one and the limitations on where they can be sited.
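
The arithmetic behind this, for those who want it, is simply the following; the implied additional annual output of around 176,000 GWh is my own back-calculation from the figures quoted, not a number taken from DUKES.

# The offshore-only scenario implied by the figures above.
avg_offshore_output_mwh = 12_426                  # average annual output per offshore turbine in 2020
existing_offshore = 2_297
additional_offshore = 14_195

implied_extra_gwh = additional_offshore * avg_offshore_output_mwh / 1_000
print(f"implied additional annual output: ~{implied_extra_gwh:,.0f} GWh")            # ~176,000 GWh
print(f"total offshore turbines required: {existing_offshore + additional_offshore:,}")  # 16,492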

I say this because the first point to make, of course, is that they will not be evenly distributed around our 19,717 km of coastline but built in clusters in a relatively small number of locations meeting fairly strict requirements. Because they need to be sited in shallow water, for instance, most of them will have to be installed either in the English Channel or along the North Sea coast. To avoid attenuation, they also have to be located fairly close to land in places where there is already an electricity infrastructure to which they can be connected. Nor can they be crowded too closely together. For they create turbulence which reduces the efficiency of other turbines situated nearby. They also have to be placed away from busy sea lanes, such as those feeding into the Thames and Tees estuaries and, of course, much of the English Channel, especially at its narrowest point between Dover and Calais.

If geography is thus one factor in determining whether replacing all our gas and biomass power stations with offshore wind farms is even feasible, the timescale for achieving it may also have an influence, especially as Parliament has written the deadline date of 2050 into law, leaving us with very little wriggle room. This means that, having installed 2,297 offshore wind turbines in the last twenty years or so, we now have to install another 14,195 over a very similar period, at an average of around 490 per year, which is easier said than done. For even without taking the weather into account, installing offshore wind turbines poses problems which their onshore cousins do not present. Whereas the pedestals or monopiles for onshore wind turbines only have to be driven 15 metres into the ground, for instance, the monopiles for offshore wind turbines have to be hammered down to at least twice that depth. These hollow tubes, which are up to six metres in diameter, are then filled with quick-setting concrete before being capped with a transition piece to which the tower is then bolted, all of which, being at sea, requires specialist rigs and equipment along with a highly trained corps of engineers. Building more than one a day, every day for twenty-nine years is therefore going to take some doing.

Even if we assume that all of these challenges can be met, however – that we can find enough suitable sites to install another 14,195 turbines and can deploy all the necessary manpower and equipment to do so – this then brings us to the cost, which I have based on figures provided by BVG Associates, an engineering company which specialises in building offshore wind farms and which puts the overall capital cost – including components, materials and the installation itself – at £2.37 million per megawatt of nominal capacity. Given that the average offshore wind turbine has a nominal capacity of 4.56 MW, this puts the average capital cost of an installed turbine at £10.8 million. The capital cost of installing 14,195 of them would therefore be £153.25 billion, which, on the face of it, would not appear to be much of an obstacle, especially as the investment would be spread over twenty-nine years.
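
The cost calculation, in the same back-of-the-envelope spirit, looks like this; any small differences from the figures in the text are again just rounding.

# Capital cost and build rate for the offshore-only scenario, using the BVG Associates figure quoted above.
cost_per_mw = 2.37e6                     # £ per MW of nominal capacity, installed
nominal_mw_per_turbine = 4.56
additional_turbines = 14_195
years_remaining = 29                     # roughly 2021 to 2050

cost_per_turbine = cost_per_mw * nominal_mw_per_turbine
total_capex = cost_per_turbine * additional_turbines

print(f"capital cost per turbine: £{cost_per_turbine / 1e6:.1f}m")                 # ~£10.8m
print(f"total capital cost:       £{total_capex / 1e9:.0f}bn")                     # ~£153bn
print(f"required build rate:      {additional_turbines / years_remaining:.0f} per year")  # ~490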

Offshore wind turbines, however, have an average life expectancy of just twenty years. This means that, while building the last third of the additional 14,195 turbines during the 2040s, we would also have to replace all the turbines built in the current decade, thereby doubling the number of turbines installed each year. What’s more, due to this twenty year lifespan, we would have to continue replacing 5% of the installed base each year, around 825 machines, even after the target of 16,492 had been reached. And although, to begin with, we would probably be able to reuse the monopiles, thereby saving around 10% of the capital cost, eventually, of course, even these would have to be replaced.

Nor is this the end of the ongoing costs. For wind power is not as free as most people think, with wind turbines actually consuming a fair amount of energy in their operation. In addition to the power required to drive their servo motors, for instance, the nacelle at the top of each tower through which the drive shaft turns to generate the electricity is full of oil, which has to be constantly heated to maintain its required viscosity. This is why one sometimes sees pictures of turbine nacelles in flames, the heated oil having caught fire. Because they thus consume electricity even when they are not producing it, what this also means is that they are not self-powering and have to draw the electricity they use from the grid: a fact which ought to further affect their derating but which only actually appears as an operating expense in the operator’s accounts.

By far the biggest ongoing expense, however, falls under the heading of maintenance, of which offshore wind turbines require a considerable amount, not just to contend with the normal wear and tear to which all machines with moving parts are subject, no matter where they are situated, but to deal with a level of material degradation that is unique to the offshore environment and which still sometimes catches manufacturers out. A good example of this is in the construction of rotor blades, for which manufacturers have always used the lightest materials possible, including resin-bonded fibreglass, in order to reduce the wind speed required to overcome inertia. What some manufacturers failed to take into account, however, was the long-term effect of salt sea air on resin, with the result that the rotor blades on many wind turbines installed in the North Sea – and presumably elsewhere – have been found to have developed a fraying along their edges, necessitating their replacement.

As with any machine that involves rotation, however – especially ones designed to harness such a violent force of nature as the wind – the principal areas of weakness in all wind turbines are those where one part of the machine rotates inside another, bearing failures being the most common reason for wind turbines being put out of action. In 2014, for instance, Siemens had to write off an entire wind farm with a book value of €223 million because of failed bearings. And although this case was somewhat exceptional – the wind farm was only two years old, and the failures were very probably due to a batch of substandard parts – bearing failures are, without pre-emptive action, inevitable.

What this means, therefore, is that, while the projected life expectancy of most wind turbines is usually stated to be 20 years, in practice such longevity can only be achieved by instituting a rigorous regime of inspection and preventive maintenance, which has two quite considerable effects on cost. For not only do all inspections and most maintenance tasks require the turbine to be taken offline while the work is carried out – thereby contributing to the turbine’s derating – but in the difficult and often hazardous environment of the North Sea, all maintenance work, no matter how routine, is very expensive, usually involving either ships or helicopters – and sometimes even both – along with yet another corps of highly trained engineers, all of whom have to be kitted out and fed as well as paid.

In fact, according to another recent report by the UK Government’s Department for Business, Energy and Industrial Strategy, which you can find here, the lifetime Operations and Maintenance (O&M) costs for the average offshore wind turbine amount to 80% of its capital costs: a figure which totally transforms the economics of offshore wind farming. For while, as noted earlier, the average offshore wind turbine generates 12,426 MWh of electricity each year, which, at an average wholesale price of £60 per MWh, results in an average annual income of £745,556 and lifetime revenues of £14.91 million, the addition of 80% to the original capital cost of £10.8 million brings the total cost of this same wind turbine, over its projected 20 year lifespan, to £19.44 million, thereby resulting in a loss of £4.53 million, which, in one way or another, is currently being met by the UK taxpayer. What this also means, therefore, is that, were we ever to actually build 16,492 of these machines, it would eventually cost us a minimum of £3.73 billion per year in subsidies to the operating companies merely to allow them to break even.
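
Putting the whole lifetime calculation together, again using only the figures quoted above:

# Lifetime economics of a single offshore turbine, on the assumptions set out in the text.
annual_output_mwh = 12_426
wholesale_price = 60                 # £ per MWh, 2020 average
lifespan_years = 20
capex = 10.8e6                       # £ capital cost per installed turbine
om_ratio = 0.80                      # lifetime O&M as a share of capital cost (the BEIS figure)

annual_income = annual_output_mwh * wholesale_price
lifetime_revenue = annual_income * lifespan_years
lifetime_cost = capex * (1 + om_ratio)
lifetime_loss = lifetime_cost - lifetime_revenue

fleet = 16_492
annual_subsidy = lifetime_loss / lifespan_years * fleet

print(f"annual income:    £{annual_income:,.0f}")          # ~£745,560 (the text's £745,556 uses unrounded output)
print(f"lifetime revenue: £{lifetime_revenue / 1e6:.2f}m") # ~£14.91m
print(f"lifetime cost:    £{lifetime_cost / 1e6:.2f}m")    # ~£19.44m
print(f"lifetime loss:    £{lifetime_loss / 1e6:.2f}m")    # ~£4.53m
print(f"implied subsidy:  £{annual_subsidy / 1e9:.2f}bn per year")  # ~£3.73bn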

Even more significantly, however, this is not just about money. For bearing in mind that all economics is ultimately about energy, the role of money in the economics of wind farms can be more instructively seen as simply a common metric for measuring and comparing all the diverse forms of energy which go into building, operating and maintaining a wind farm on the input side of the equation, and the singular form of energy – electricity – that is produced on the output side. If the cost of what we are putting into this transformational engine is more than the income we are getting out of it, this would therefore indicate that wind turbines are actually consuming more than they are producing, not just in terms of money, but in terms of energy as well: a deficit which, in the case of a technology designed to produce a surplus of energy, totally defeats its purpose.

One possible solution to this, of course, would be to extend the lifespan of the average wind turbine so as to increase lifetime energy output and hence lifetime earnings. The projected life expectancy of 20 years, however, is not something which has just been plucked out of the air. For as a wind turbine gets older, it naturally needs more maintenance. This means that it spends more time offline and generates less income. At the same time, the additional maintenance results in increased annual O&M costs, which eventually come to exceed the generated annual income, at which point, when the turbine is usually around 20 years old, it is decommissioned to avoid further losses.

Another solution, therefore, would be to raise the price of electricity, which has actually happened in the last few months as a result of shortages in the supply of natural gas and an increase in its price. This has pushed up the price of electricity generated by gas fired power stations, along with wholesale electricity prices right across the board, with the result that if prices were to remain at their current level of £108 per MWh, an increase of 80% since 2020, wind farms would not only be profitable but would be able to extend the operating life of the average wind turbine by up to five years.
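
On the same assumptions as before, the effect of the higher price can be sketched as follows; the break-even figure is my own derivation rather than anything published.

# The same single-turbine sums at the higher wholesale price.
annual_output_mwh = 12_426
lifespan_years = 20
lifetime_cost = 19.44e6              # capital cost plus 80% O&M, from the calculation above

breakeven_price = lifetime_cost / (annual_output_mwh * lifespan_years)
surplus_at_108 = annual_output_mwh * 108 * lifespan_years - lifetime_cost

print(f"break-even price: £{breakeven_price:.0f} per MWh")                 # ~£78
print(f"lifetime surplus at £108/MWh: £{surplus_at_108 / 1e6:.1f}m")       # ~£7.4m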

The problem with this, however, is that, were prices to remain at this level, it would not only have a massively inflationary effect upon the economy as a whole but would probably destroy parts of it altogether, particularly its most energy-intensive sectors such as steel, cement, glass and ceramics, which would not be able to compete with low cost imports from countries like China, which still use coal-fired power stations to generate cheap electricity. Even if we imported all our energy-intensive raw materials from the Far East, however, this would not prevent the cost of building and operating wind farms from eventually returning to the same uneconomic level they were at in 2020. For simply inflating the price of the energy produced by a generator which consumes more energy in its manufacture, installation, operation and maintenance than it ever actually produces does nothing to solve the problem of its inherent non-viability. It merely inflates the price of everything else until one is back at square one. It’s a vicious circle which one cannot escape by merely manipulating the rate at which one exchanges megawatt hours for pounds: a fact which, in itself, one can only begin to comprehend if one first understands the true nature of economics and the ineluctable laws upon which it is based.

If the destruction of what is left of our heavy industries were not bad enough, however – along with the need to continue importing raw materials from low-cost energy countries while subsidising our own high-cost producers – there is still yet another problem that arises from this vain attempt to power a national electricity grid on wind turbines: one which may even be more damaging in the long term than wind power’s inherent non-viability and which results from the fact that the 31.14% derated capacity of these machines is not constant. If it were, their output would be reliably predictable. But it isn’t. Sometimes, when wind speeds are optimal, wind farms can produce up to 70% or even 80% of their nominal capacity. At other times, however – when there is an area of high pressure sitting over the British Isles, for instance, with hardly a breath of wind to be felt even along the coast – they may produce nothing at all, thereby reducing the grid’s overall capacity by an amount which will become ever more significant the more wind power we install.

At present, the problem is not yet critical. For according to DUKES 2021, wind power still only represents 10% of our total derated capacity of 75.8 GW, while average demand is only about 50% of this. The problem is that demand, like the available capacity of wind turbines, is also not constant. It not only varies throughout the day, with peak demand coming in the early evening when people are cooking supper and watching TV, but throughout the year. For while many households in the UK still use electricity for heating, the weather is seldom hot enough to merit installing air conditioning. This means that while demand can drop to as low as 20 GW during the summer, during the winter it is not unusual for demand to reach or even exceed 60 GW, or 79.16% of our total derated capacity.

Still plenty of headroom, you might think. However, it is just as possible to have a high pressure area sitting over the UK in winter as it is in summer. It’s what gives us that sharp, frosty weather with bright blue skies that can make winter days so dazzlingly beautiful. And even though the existing installed base of wind farms may still be relatively small, losing it for even one day during such a windless cold snap would bring down that headroom in generating capacity to little more than 8 GW. Worse still, you will note from Table 1 that we import 6.79% of our electricity from abroad, some of it from Belgium and the Netherlands, but most of it via an interconnect with France: a cable under the English Channel which actually caught fire earlier this year and will not be fully back in service until 2023 – which makes it quite possible that, sometime over the next two winters, UK demand for electricity could exceed supply, resulting in power cuts.
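
The headroom arithmetic, for what it is worth, runs as follows, using the DUKES 2021 figures quoted above:

# Winter headroom with and without wind, using the figures from the text.
total_derated_gw = 75.8
wind_share = 0.10                    # wind's share of total derated capacity
winter_peak_gw = 60

headroom_normal = total_derated_gw - winter_peak_gw
headroom_no_wind = headroom_normal - total_derated_gw * wind_share

print(f"normal winter headroom:        {headroom_normal:.1f} GW")   # ~15.8 GW
print(f"during a windless cold snap:   {headroom_no_wind:.1f} GW")  # ~8.2 GW, the 'little more than 8 GW' above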

This is not absolutely certain, of course. We may get lucky: the next two winters could be exceptionally mild or, indeed, exceptionally windy (though not too windy). But as we increase our wind turbine capacity, while simultaneously decommissioning our conventional power stations, the risk of blackouts will increase inexorably. Indeed, our derated wind turbine capacity wouldn’t even have to get to the 75% of total capacity which our central planners seem to be intending, before outages became inevitable. Even at 30% of total capacity, we would not have sufficient headroom to lose any significant proportion of our wind capacity for even a moment, which is all it would take for power cuts to occur. If we are going to press ahead with wind power, therefore, we will have to take some fairly major steps to prevent this from happening, either by providing back-up generating capacity or some form of storage.

In fact, we already have some storage facilities. If you look again at Table 1, you will see that, in 2020, 1,402 GWh of our electricity was provided by something called ‘Pump Storage’, a method of storing energy which uses spare generating capacity at off-peak times to pump water from a low lying reservoir to one at a higher elevation. At peak times, the elevated water is then used to drive a turbine to generate electricity in the same way as a hydroelectric plant, returning the water to the lower level reservoir once again.

Despite being reasonably efficient – with only between 20% and 30% of the original electricity being lost – these pump storage facilities do, however, present a number of obstacles to widespread adoption. Although they are smaller than conventional hydroelectric power stations, for instance, they tend to attract the same environmental opposition. It is also difficult to find suitable, convenient and available sites for them, not least because they need to be built at the top and bottom of hills with large pipes between the two reservoirs, which also raises the problem of attenuation. For much of the North Sea coast along which most of our wind farms are being built is completely flat. Take East Anglia and Lincolnshire, for instance, much of which is barely a few feet above sea level. If one wanted to use pump storage for electricity generated almost anywhere along the southern stretches of the North Sea, therefore – or, indeed, along the eastern stretches of the English Channel – one would probably have to transport it all the way to the Pennines for conversion to elevated water, from where it would then have to be transported all the way back to lower-lying urban areas to be consumed, thus quite clearly raising the question as to how much such storage the UK could realistically provide.

That’s not to say, of course, that it cannot be used as a part of a larger solution. But it is doubtful whether it could be the solution in itself. This therefore brings us to battery storage, which is far more convenient in that it can be situated almost anywhere, and does not, therefore, create additional attenuation problems. The problem is rather the cost of the batteries along with their storage capacity. For given that the UK used 329,906 GWh of electricity in 2020, this represents an average of 904 GWh per day, with one day’s supply probably being our minimum storage requirement.

It could be argued, of course, that even when we have built all of our 16,492 offshore wind turbines and decommissioned all our gas fired power stations, we shall still have nuclear and hydroelectric power and so wouldn’t need storage backup for our entire average daily consumption of 904 GWh. This, however, is to ignore the fact that, in winter, we regularly use up to 50% more than this, making an average day’s usage seem like a reasonable basis for the purposes of this exercise.

For simplicity, I have also based all my calculations on the Tesla Powerpack 2 4HR. For although there may well be more competitive batteries on the market, none has been so well documented, with fairly detailed specifications available for at least two major installations: one at the Southern California Edison facility, which has a total storage capacity of 80 MWh; the second at the Hornsdale Energy Reserve in South Australia, which has a storage capacity of 100 MWh. More to the point, I have also managed to find a price, which, including the required inverter, comes to a total of $172,707 per Powerpack, or £127,803 at the current exchange rate.

All that remains, therefore, is to work out how many Powerpacks we would need to store 904 GWh of electricity, which is a simple matter of straightforward if slightly shocking arithmetic. For although the Powerpack 2 4HR is designated as a utility-scale battery, it actually only stores 210 kWh, which it can discharge over a period of 4 hours at an output rate of 52.5 kW. In order to store 904 GWh, therefore, one would need over 4.3 million of them at a total cost of more than £550 billion, which is more than three times the capital cost of the turbines for which they would be providing back-up support. And all to store just one day’s supply of electricity.
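
The shocking arithmetic in full:

# One day's average consumption stored in Powerpack 2 4HR units, at the quoted price.
annual_consumption_gwh = 329_906
daily_gwh = annual_consumption_gwh / 365             # ~904 GWh

pack_capacity_kwh = 210
pack_price_gbp = 127_803                              # including inverter, at the quoted exchange rate

packs_needed = daily_gwh * 1e6 / pack_capacity_kwh    # convert GWh to kWh
total_cost = packs_needed * pack_price_gbp

print(f"daily consumption: {daily_gwh:,.0f} GWh")
print(f"Powerpacks needed: {packs_needed / 1e6:.1f} million")  # ~4.3 million
print(f"total cost:        £{total_cost / 1e9:.0f}bn")         # ~£550bn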

Of course, there are other battery technologies in development which promise to be less expensive than the lithium-ion batteries which Tesla supply. But this is where we are today. And until other technologies are on the market, we have no way of knowing how much more viable they may be.

This therefore brings us to our third storage option, which has similarities to pump storage in that it is based on using the electricity generated by renewables during optimal conditions to produce something which can then be converted back into electricity at a later date: this something being hydrogen, which has one key advantage over pump storage in that it can be liquefied, put in tankers and delivered to hydrogen fired power stations near to where the electricity is to be used, thus avoiding the problem of attenuation. The problem with hydrogen, however, is that there are no free hydrogen molecules in the atmosphere. This is because hydrogen loves oxygen, with which it bonds instantly to form water, from which it is then very expensive to extract, the most straightforward method of extraction, that of electrolysis, which your chemistry teacher probably demonstrated to you at school, being the most energy intensive, requiring around 50% more energy in the extraction process than can then be obtained by burning the resulting hydrogen.

As a result, other methods of extracting hydrogen are constantly being developed. One such, for instance, uses artificial sunlight to break down lignocellulose in leaves and wood chips immersed in an alkaline solution. Others involve the conversion of biomass using microbial processes. By far the most easily scalable method, however, and hence the most common, is that of heating hydrocarbons such as natural gas so as to break down their complex structures into simpler molecules. Not only does this produce more CO2 than would be produced by simply burning the hydrocarbons, however – thereby adding the cost of carbon capture to the overall cost of the process – but one also has to factor in the cost of the gas.

It is for this reason that, in the past, the production of hydrogen as a fuel has only ever been contemplated when either the raw materials to be converted into hydrogen or the energy to be consumed in the process have effectively been free. Thus, in the early 2000s, plans were put forward to produce hydrogen from sea water in Iceland using geothermal energy that would otherwise have simply been lost. In a similar vein today, hydrogen is already being produced on a small scale in the Orkney Islands using power from a wind farm which produces more electricity than the islands themselves can use but which is too small and too far away from the Scottish mainland to make an interconnect viable. At one level, therefore, it makes perfect sense to use the excess energy to produce hydrogen, especially as the resultant product is then sold commercially in bottles rather than burnt. At another level, however, one has to ask how, given the otherwise insufficient demand for electricity in the islands, it could possibly have made sense to build a wind farm on Orkney in the first place.

More to the point, as an example of how hydrogen might be used to store electricity generated by intermittent renewables it totally fails. For while it makes sense to use excess electricity to produce hydrogen in this way, it would not make sense to turn this into a two stage process in which an already non-viable form of energy was used to make a fuel which was then burnt in order to produce a quantity of energy that was actually less than that expended in the process itself, especially as this would always make the electricity generated by hydrogen fired power stations more expensive than the electricity directly generated by the primary source – in this case, wind turbines – which, in any real world scenario, would always have priority loading on the grid, with the electricity coming from the hydrogen store only being used when wind power is offline or insufficient, i.e. as backup. Given that any backup system must always have the same capacity as the primary system it is designed to support, this means, therefore, that dozens of hydrogen fired power stations would have to be built which, for the most part, would just sit around idle: a clearly inevitable outcome which then raises the question as to who would actually build and operate these power stations and how much they would have to charge for the electricity.

What this really highlights, however, is the basic problem with using any form of redundant capacity as a back-up: a problem which can be seen more clearly if the power stations intended to be used in this way are entirely independent of the primary capacity, as would be the case, for instance, if we chose to use nuclear power to fill this role. Unlike hydrogen power stations, moreover, the economics of which are largely unknown, the facts and figures concerning nuclear plants are readily available, thereby allowing us to quantify the scale of the problem.

Starting with the basics, therefore, the UK currently has seven operational nuclear power stations with a total capacity of 9.57 GW, which gives them the potential to produce 83,816 GWh of electricity per year. As can be seen from Table 1, however, in 2020 they only produced 50,278 GWh, or 60% of their capacity, which is probably more due to loading than availability, in that other sources of power, most notably renewables, were given priority on the grid, their output being taken up first, with the result that nuclear power stations were only brought on at peak times and when they were otherwise needed.

In this respect, one might therefore say that, in a limited capacity, the UK’s nuclear power stations have already been fulfilling a backup function, which, given the spare capacity they still represent, may yet help us avoid blackouts while the French interconnect is unavailable. If we want our nuclear power plants to provide full back-up in the longer term, however, not just for our existing 11,006 wind turbines, but also for the additional 14,195 offshore wind turbines yet to be built, we would have to increase their capacity by a minimum of 27.71 GW. I say ‘minimum’ because, not knowing their actual derated value, this assumes that these back-up nuclear power stations would be able to operate at 100% of capacity whenever they were called upon, rather than the current 60%: an assumption which may be a little over-optimistic, but which means that all the calculations which flow from it are on the conservative side and not, therefore, exaggerations.
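
This, as best I can reconstruct it from the figures quoted earlier, is where the 27.71 GW comes from: the derated capacity of the full wind fleet once all the additional offshore turbines are installed. The small difference in the final digit is again just rounding.

# Back-up capacity needed to cover the full wind fleet, on the figures used earlier in this essay.
existing_nominal_gw = (8_709 * 1.59 + 2_297 * 4.56) / 1_000   # current fleet, from the DUKES averages
additional_nominal_gw = 14_195 * 4.56 / 1_000                 # the planned extra offshore turbines
derating = 0.3114                                             # the 2020 derated capacity figure

backup_needed_gw = (existing_nominal_gw + additional_nominal_gw) * derating
print(f"back-up capacity required: {backup_needed_gw:.2f} GW")   # ~27.7 GW, i.e. the 27.71 GW in the text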

Given that the latest generation of nuclear power plants, such as Hinkley Point C, have a larger capacity than their predecessors, averaging around 3.2 GW, the first of these calculations concerns how many of these new plants we would need. And the fairly straightforward answer, of course, is nine, not including Hinkley Point, itself, which, while due to come online in 2023, is not intended to be part of any additional capacity, but is a replacement for three of our existing nuclear power stations which are due to be decommissioned in the same year. If Hinkley Point is likely to provide the model for any future additional capacity, however, it is also likely to be indicative of the price. And at present, it is estimated that, at completion, it will cost £23 billion, up from £18 billion when it was first approved, indicating that the total cost of building an additional nine such plants would be £207 billion: a significant reduction on the £550 billion it would have cost to install Tesla batteries as back-up, but a substantial investment, nevertheless, which I think it would be fair to say no one would undertake unless they could sell at least the 60% of capacity which current operators are selling to the grid, and probably more. That is to say that no one would build such power stations purely as back-up.
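
And the final step of the calculation:

# From back-up capacity to plant count and capital cost, at Hinkley Point C scale and price.
import math

backup_needed_gw = 27.71
plant_capacity_gw = 3.2              # roughly the capacity of a Hinkley Point C scale plant
plant_cost_gbp = 23e9                # estimated cost at completion

plants_needed = math.ceil(backup_needed_gw / plant_capacity_gw)
total_cost = plants_needed * plant_cost_gbp
print(f"{plants_needed} plants, costing roughly £{total_cost / 1e9:.0f}bn")   # 9 plants, ~£207bn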

They might, however, build them to make a profit. For while this increased nuclear capacity would cost 34% more than the £153 billion capital cost of an additional 14,195 wind turbines, nuclear power stations last, on average, around 40 years, twice as long as wind turbines, giving their operators an additional twenty years over which to spread the capital cost. Despite all the security that has to surround their sites, and the rigorous safety procedures which circumscribe their operation, their O&M costs are also lower. We know this because we know that they are actually viable, which means that they do not need public subsidies. Their output is also much more reliable than wind turbines, thus obviating the need to put in place any significant backup. From both an engineering and economic perspective, therefore, it would be better for the UK to simply licence another nine Hinkley Points and forget about building any more wind farms.

What’s more, the UK government must surely know this. After all, most of the information I have used to write this essay has been taken from the government’s own website. Nor have I needed more than a couple of simple spreadsheets to make the necessary calculations. The problem, however, is that the government’s energy policy does not just rest on engineering and economic considerations. And having convinced the public that the world is going to end if we don’t stop using fossil fuels and sold us all on clean renewable energy instead, it doesn’t have a lot of options, not least because the same people who object to fossil fuels because of their CO2 emissions also tend to object to nuclear power, partly because of its inherent risks – which, after Three Mile Island, Chernobyl and Fukushima, is not an entirely unreasonable concern – and partly because of the still unanswered question as to what to do with all the nuclear waste. With no other practicable or economically viable solutions left to it, however, the government has thus rather painted itself into a corner, with very little choice, therefore, other than to continue on its current path and hope that something turns up: a policy in which one suspects it is further encouraged, both by the subsidies it has already handed out, which it cannot now admit were a total waste of taxpayers’ money, and by the assurances of the recipients of these subsidies that new technological solutions are just around the corner and that, eventually, subsidies will no longer be needed.

Not, of course, that this is very likely. For while technological improvements can, and very often do, increase the profitability of a technology, once that technology has been established these improvements tend to be incremental and marginal rather than the revolutionary leap forward that would be required to turn the 25% losses that wind turbines currently incur into a profit. In fact, it is more or less a basic principle in the development of almost any technology that the big leaps come first: something which the recipients of the government’s largesse must also surely know, raising the question, therefore, as to why so many large corporations continue to participate in what they must also therefore know is a fundamentally non-viable business. The fact is, however, that, in many industry sectors, large corporations actually prefer non-viable projects that are subsidised to viable projects that are not.

This is because every purely commercial enterprise involves risk. No matter how carefully one draws up one’s plans, or how much due diligence one undertakes, there is always the possibility that something will go wrong. An overall downturn in the economy may mean that, by the time one gets one’s product or service to market, people are spending less. A new technology may have emerged that makes one’s own redundant. The costs may not just turn out to be higher than one had estimated – which is also the case often enough – but ruinously so, especially if one’s construction crew has the misfortune to uncover some form of contamination on the site or, worse still, an archaeological treasure trove which halts all development for six months. The list of possible calamities is infinite while only one thing is ever certain: that, in business, there is no such thing as a sure thing… unless, of course, you can get somebody else to underwrite it, somebody with deep pockets and a political agenda who doesn’t mind, therefore, wasting their money.

How else do you think a wind farm gets to be built on Orkney?

The problem is that when projects are even partially funded out of the public purse, not only does this invert one of the most basic laws of free market capitalism – that, even with only imperfect knowledge, investors should only invest in projects which they are confident will produce a return on that investment – but it more or less ensures that capital resources, including energy, manpower, and energy-intensive materials such as concrete and steel, are continually wasted on White Elephants to the overall impoverishment of the economy in which this wastage occurs. For while government subsidies may mean that the commercial partners building these White Elephants do not incur any losses – and usually make a profit – this does not mean that losses do not occur. For if one expends more capital resources on a project than the project generates, this constitutes a loss. By subsidising it, the loss is merely transferred, either to the taxpayer – if the subsidy is paid for out of taxes – or, more commonly these days, to the government’s balance sheet, as an increase in the national debt.

In fact, it is the ease with which governments have been able to borrow money over the last decade or so that has been responsible for the increase in this kind of capital misallocation and hence the rise of what is generally referred to as ‘crony capitalism’. For in the past, governments which borrowed money excessively and wasted it on non-productive projects would have been punished by the bond markets, which, fearing possible default, would have traded down the price of the government’s bonds, thereby increasing their yield and forcing the government to pay higher interest on future borrowings. Since the 2008 financial crash, however, the practice of central banks printing money and using it to buy their government’s bonds has ensured that bond prices have remained high and interest rates low, thereby removing this restraint. Worse still, governments are encouraged in their reckless spending by Keynesian economists who tell them exactly what they want to hear: that public expenditure of this kind stimulates economic growth, thereby increasing overall wealth, while the evidence would suggest that the only wealth it actually increases is that of the direct recipients of the subsidies.

To politicians, however, the attractions of such a regime are overwhelming. For not only do they get to create what, on the surface, looks like purposeful and productive economic activity, but they give the impression that they are the ones driving the economy rather than those who actually create viable and self-sustaining businesses. Worse still, in this case, they also get to strut around the world stage claiming that they are helping to save the planet: a hubristic claim to historic significance which reminds me of Shelley’s poem ‘Ozymandias’. For having replaced money based on stores of surplus energy with IOUs against future stores that can never be produced, when the financial bubble finally bursts not only will the subsidies dry up but so will the electricity needed to turn the rotor blades of wind turbines out of the North Sea gales, such that, eventually, all that will be left of them are the stumps of their monopiles sticking above the waves, like the ‘two vast and trunkless legs’ which Shelley’s ‘traveller from an antique land’ came upon in the desert, alongside a half-buried pedestal upon which these words appeared:

‘My name is Ozymandias, king of kings;

Look on my works, ye Mighty, and despair!’