Friday, 31 December 2010

Economic Misunderstanding: The Keynesian Legacy

One of the great difficulties in achieving clarity in the current economic and political debate over deficit reduction in the UK is the illusion, fostered in the second half of the twentieth century, that the economic laws that govern the wealth and wellbeing of nations are in some sense different from the economic laws that apply to individuals; that nations, in effect, can do things with money that you and I can’t. Nothing exemplified this apparent difference more than the efficacy with which Keynesian economic principles were brought to bear on the problems of mass unemployment in the 1930’s and post-war reconstruction in the 40’s and 50’s. Through both FDR’s New Deal and the later Marshall Plan, the impression was created that, in some way, public expenditure, whether on new capital projects or on consumer support, actually created wealth.

Throughout the 1960’s, as a result, left-wing parties were swept to power all around the world on programmes of increased public expenditure and improved social welfare, financed by a mixture of increased taxation and rising public debt. The view was that building schools and hospitals, and providing public services, not only created a healthier, wealthier, and more equitable society, but stimulated the economic growth that then paid for it. So strong was this prevailing orthodoxy, in fact, that by the late 70’s, when Margaret Thatcher came to power in Britain, and introduced a new policy of cutting the deficit and living within our means, many people regarded her as economically illiterate and an utter barbarian.

What they didn’t understand was that Keynesian economics only works in exceptional circumstances, when two key conditions apply. The first condition is that the economy to be stimulated should have large amounts of spare capacity, principally in labour: something that was clearly the case in the 1930’s and then again in the 1940’s as large numbers of soldiers returned home from World War II. Without this spare capacity, increases in public expenditure merely divert resources from wealth-creating activities into wealth-consuming ones. The second is that it must be possible to pay for the increased public expenditure out of borrowing rather than taxation. This is because increased taxation, no matter what form it takes, always reduces employment in the private sector. If one puts up income tax, for instance, employees effectively have to take a pay cut unless their employer increases their salaries to compensate for the additional deductions. If he does this, however, he then either has to cut staffing levels in order to reduce costs, or he has to put up his prices. If the latter, he then becomes less competitive. Sales fall and he either downsizes the company to bring costs in line with the reduced revenues, or he goes out of business. Either way, jobs are lost.

Borrowing money to pay for increased public spending also has its downside. Eventually one has to pay the money back. The last £45m instalment of Britain’s debt to the USA after World War II was only finally repaid in 2006. While the Attlee government of 1945 benefited from the loan, every government for the next sixty years experienced it as a drain on the economy, often necessitating further borrowing to make up for these additional outgoings. It is for this reason, along with the fact that governments simply got used to living beyond their means, accepting the Keynesian approach as an economic fact of life, that, depending on what one includes under this heading, the UK now has a national debt of somewhere between £2.2 and £3.8 trillion, with annual interest payments now exceeding our entire expenditure on the National Health Service.

Failure to understand the dangers inherent in this mounting debt has also led to the development of a now very distinctive cycle in post-war British politics, in which every successive Labour government has dramatically increased both borrowing and public expenditure until, in each case, the inevitable financial crisis has brought in a Conservative administration and a period of retrenchment. Admittedly, this characterisation may seem slightly unfair in the case of the Attlee government, in that, due to wartime spending, Britain was already on the point of financial collapse when Labour was elected. On the other hand, it is certainly not an unfair description of the Wilson government of 1964 to 1970, whose growing debt and continuing need to borrow money led to a 14.3% devaluation of the pound in 1967. Although I was just thirteen at the time, I distinctly remember the Prime Minister going on radio and television to reassure a shocked and anxious nation that the pound in our pockets was worth just the same. By increasing the cost of imported raw materials, however, this one act of economic and political expediency contributed as much to the rampant inflation of the early 1970’s as did the successive hikes in oil prices imposed on a long-suffering public by OPEC during that period.

To counter both the inflation and the growing trade and fiscal deficits, the Heath government then tried to rein in public expenditure, particularly in the highly wasteful and strike-prone nationalised industries. The result, however, was a protracted miners’ strike, which, by cutting off supplies of coal to power stations, reduced Britain to a three-day working week, thus exacerbating our economic problems still further. In a snap election, called by the Prime Minister to decide the question as to ‘who ruled Britain’, the electorate then merely stated the obvious by replying, ‘Not you’. A Labour government was consequently returned to power, which, in order to get people back to work, immediately conceded to the unions everything they demanded, restoring former levels of public expenditure and increasing both taxes and borrowing until, once again, the economy teetered on the brink of collapse.

This time the then Chancellor, Denis Healey, called in the IMF, which demanded £2.5bn of cuts in government spending in return for a $3.9bn loan. In an attempt to explain this to the electorate, the Prime Minister, James Callaghan, said: “We used to think that you could just spend your way out of a recession. I tell you, in all candour, that that option no longer exists.” But still the country refused to understand. Predictably, the public sector unions, whose members were going to be subject to a pay freeze, went out on strike. The dead went unburied; refuse remained uncollected for weeks; rubbish built up in the streets; and the newspapers called it ‘The Winter of Discontent.’

Ironically, it was this that led the electorate to reject a government which was simply doing what the IMF told it it had to do, and to elect, instead, a government which was going to do exactly the same thing – only more so, and of its own volition. For it was then that Margaret Thatcher embarked on bringing the economy back into balance: refusing to prop up failing nationalised industries; denationalising them whenever she could; cutting taxation, along with the bureaucracy and waste on which so much of it was spent; and rolling back the stultifying fingers of the state. To begin with, of course, it was extremely painful. Unemployment initially rose to over three million. There were strikes and riots; and in large parts of the country the Prime Minister was regarded with a visceral hatred that would have driven some to murder had they been able to get their hands on her. Yet within five years, the economy was soaring in a way it hadn’t done for most of the century. Unemployment fell rapidly and business thrived; and although there were some hiccups along the way – most notably those caused by our entry into the ERM – after 18 years of Tory government, the average person in Britain was wealthier than he or she had ever been. So much so that in 1997 they decided that they’d like to see more of the national wealth invested in public services and duly elected Tony Blair as Prime Minister.

Fortunately, during his first term in office, his new Chancellor, Gordon Brown, stuck largely to the spending plans of his predecessor, Ken Clarke, thus fostering his self-proclaimed reputation for prudence. Throughout the government’s second and third terms, however, public spending rose steadily. When the Conservatives left office in 1997, public expenditure represented just 32% of GDP. In 2008, before the banking crisis began, it had reached 40%. At the end of 2009/10, it was nudging 50%, with an annual fiscal deficit of £150bn.

Of course, Labour politicians will say that this is due to the recession caused by the world-wide banking collapse, which had nothing to do with them. But even setting aside the fact that Britain had one of the largest banking sectors in the world – on which it was overly dependent – and that this sector was regulated by a regime which Gordon Brown himself had introduced, all the banking crisis really did was highlight a structural imbalance. It is this imbalance which has meant that, while Germany has been able to come to the rescue of both Greece and Ireland – whose economies were also overly skewed towards the public sector, and also running large deficits – the UK is now in a battle to save itself.

One of the problems is that too few people realise how serious this is. Nor is it helped by the leader of the opposition telling them that there is some miraculous Keynesian alternative to public spending cuts: that by maintaining public expenditure at only a slightly reduced level, a Labour government would, instead, concentrate on stimulating economic recovery, from which the resultant increased tax revenues would, themselves, bring down the deficit, without the need for retrenchment and the hardship this necessarily causes. It is as if, indeed, the history of post-War Britain has taught us nothing, or as if we never heard Jim Callaghan utter those fateful words that economic solutions which defy the laws of economics are ‘no longer an option’. The trouble is, of course, that for many of us this is literally true: we never did hear them. Most younger people, for instance, would be hard pressed to say who Jim Callaghan was, let alone what, after a lifetime in Labour politics, he was eventually and traumatically brought to understand. And if you told them that, in 1976, the British government had to ask the IMF for a bailout in much the same way that the Irish government has recently had to do, most of them would be utterly shocked.

It is this, however, which brings us to the most serious impediment to establishing the current debate on anything like a rational basis. For in knowing so little about our recent economic history, the opinions of most of us are informed not by the facts of this history, but by the myths. The most prevalent of these is that, in taking from the rich and redistributing to the poor, Keynesian social democratic parties, such as the British Labour Party, are like Robin Hood, whereas the Tories, who are simply out for themselves and their rich and aristocratic friends, are the wicked Sheriff of Nottingham. The evil Mrs. Thatcher, in particular, is seen as the symbol of this fundamental and primordial opposition, and it is her name that is now used to evoke collective folk memories of primeval battles on the picket lines and dolorous queues outside shabby unemployment offices, where the lost and defeated ultimately fade away to become dead statistics. 

We are only seven months into the new Tory led coalition, and already we have seen this atavistic conflict re-enacted in almost perfect detail in the student protests over increases in university tuition fees. It is almost as if the students of today are reliving the stories told to them by their parents who were students in the late 70’s, when Margaret Thatcher came to power, and when almost every student came out on to the streets in support of one striking group or another. That the current measures will have no immediate effect on student finances, of course, is irrelevant. The important thing is to take the fight to the Tory oppressors. For not only is it generally believed that these cuts are unnecessary – that, as Ed Miliband tells us, there is another way – but that they are politically motivated, and targeted in a very selective manner. Nor does it help that to some extent this is true. In the case of tuition fees, for instance, the size of the increase makes it fairly clear that the government is trying to effect a change in students’ perception of higher education, making them ask themselves whether they really want to go to university – whether it is actually the right thing for them – rather than simply going because there’s no real financial impediment to doing so, and because it’s what everyone does. Rather than opening up a debate on this issue, however, the government has engendered even greater cynicism as to their true motives by pretending that the increases are purely for financial reasons.

This is something, in fact, from which the government needs to learn a lesson. For, in the weeks and months to come, as more detailed cuts are announced and begin to bite, there are likely to be several more areas in which economic measures will naturally be accompanied by some element of social reengineering. And the longer the government fails to admit to this and argue its case openly, the more the myth of Tory mendacity and uncaring indifference will gain credibility. What’s more, it will allow both the opposition and the media to open up even more rifts between the Conservatives and the Liberal Democrats, just as the issue of tuition fees was seen to do. And if the LibDems continue to be vilified for supporting what the media will no doubt increasingly portray as a wanton and unprincipled crew of Etonian thugs, my fear is that the coalition will not be able to hold up under the pressure, that it will collapse much as Ted Heath’s government did in 1974, with the economy still unreconstructed, and the country firmly against it. We will then see another Labour government, promising a restoration of public services and following the same old Keynesian strategy, and history will repeat itself once again. For, to paraphrase the combined words of Edmund Burke and George Santayana, for those who fail to learn from history – or, in this case, it seems, fail to learn any history at all – the travails of Sisyphus seem destined to recur.

Thursday, 23 December 2010

Freak weather? Or something someone's not telling us?

Some people may have noticed that amid all the footage of snow-laden roads and stranded airline passengers that has filled our 24-hour news services over the last few days, very little has been said about the causes of the extraordinary weather Britain, and most of North West Europe, has been experiencing since the end of November. During some weather forecasts, graphics showing a meandering jet stream have occasionally been presented, and its effects described; but little mention has been made of the cause, or causes, of this unusual phenomenon – an omission which is all the more surprising in that a number of contributory factors are clearly identifiable.

The most important of these is the fact that we are now emerging from one of the most extended periods of minimal solar activity in over a century. 

Solar cycles normally last around 11 years: roughly five and a half years from minimum activity to maximum, and then the same again back to the minimal phase. (It is the sun’s full magnetic cycle, comprising two of these, that lasts around 22 years.) As can be seen from Figure 1, however, the most recent minimal phase lasted nearly two years, from the end of Cycle 23 in early 2008 to the start of Cycle 24 in late 2009.

Figure 1: Solar Activity from the Start of the Century

The significance of this is that solar cycles have long been correlated with weather patterns. Periods of high solar activity, marked by sunspots and solar flares, intensify the enormous outflow of charged particles and high-energy radiation from the sun, known as the solar wind. This is deflected around our planet by the earth’s magnetic field, but can be seen above the poles in the form of aurorae. Among its other numerous effects, however, heightened solar activity also increases the amount of ozone in the upper atmosphere. This causes the upper atmosphere to heat up, which in turn affects the strength and direction of winds all the way from the stratosphere to the earth’s surface. In effect, the greater the solar activity, the more energised our atmosphere becomes. Unfortunately, the opposite is also true. Periods of minimal solar activity effectively allow the upper atmosphere to cool, weakening the currents of air it otherwise thermally drives. In this instance, the critical effect has been upon the jet stream, which instead of rushing across the Atlantic, from west to east, in an almost straight line at the latitude of the English Channel, is now meandering north towards the Arctic with all the lassitude of an exhausted polar explorer, before curving south again toward the Mediterranean, dragging the Arctic air with it to lay siege to most of North Western Europe.

The two questions to which this immediately gives rise, of course, are: ‘How long is this going to last?’ and ‘Is this just a one-off event?’ Unfortunately, the first is impossible to answer with any certainty. But it could be some time. Not a happy thought if one is struggling with energy bills. The answer to the second, however, could be even more worrying. For as one can see from Figure 1, Solar Cycle 24 has not exactly got off to a flying start. To my untrained eye it doesn’t even look like it’s going to achieve the projected values for 2010/2011 indicated on the graph. Indeed, the protracted nature of this period of inactivity, and the sluggishness of the recovery so far, have even led the journal of the American Geophysical Union, Eos, to publish a paper suggesting that the sun might be returning to a state similar to that of the Maunder Minimum, which coincided with the ‘Little Ice Age’ in Europe during the 17th century.

Of course, it is too early to be sure of this, or even to treat it as anything more than a possibility. However, another cycle, of a completely different kind, would suggest that, in Britain at least, we may be in for considerably colder winters for some years to come. This is because in 2009 the AMO (Atlantic Multidecadal Oscillation), as shown in Figure 2, went into the negative part of its cycle for the first time since 1994.

Figure 2: AMO Index, 1964-2009

The AMO is a cyclical pattern of sea surface temperatures (SST’s) in the North Atlantic, with warm and cool phases each lasting between 20 and 40 years. The variation in temperature, at just 0.5°F either side of the mean, is not dramatic; but in such a large body of water, the effect is significant, with air temperatures following the SST’s fairly closely. There are some anomalies, as in 1964 for instance, when the air temperature was significantly lower than the SST, but for the most part they are fairly consistent. It should also be noted that the AMO is perfectly natural, is quite independent of global warming, for instance, and that the same pattern has been consistently recorded since the mid-19th century.

The fact that the AMO dropped into the negative part of its range in 2009 does not, of course, mean for certain that we have entered another cool phase. Again it is too early to tell. Fifteen years is actually quite a short period for a full warm phase. As can be seen from Figure 3, on the other hand, longer phases are sometimes interspersed with shorter ones. Indeed, the 20th century norm of roughly 60-year cycles (20-30 years warm, 20-30 years cool) is somewhat different from the 19th century norm, suggesting that there could be cycles within cycles. In any event, the dip into the negative part of the range in 2009 would indicate that the trend is now most definitely downward. One way or another, therefore, it seems likely that, for the next two or three decades, winters in Northern Europe are going to be more like those we experienced in the sixties and seventies than those of the nineties and noughties.

Figure 3: AMO Index, 1856-2008

The bigger question, however, is how all this fits into the prevailing theory of global warming. If we are in for a weak solar cycle and the Atlantic is cooling, does this mean that the earth, generally, might be entering a cooling phase? Unfortunately, no. The Atlantic, for instance, is just one part of the global ocean system, with currents moving from ocean to ocean in a cycle that lasts around 1600 years. While some parts of the system are cooling down, therefore, other parts may be warming up. Moreover, the AMO graphs above have already been detrended. That is to say that the effects of global warming have already been taken out. If one were to look at a graph of the actual SST’s, one would see that the whole oscillation has been gradually rising, by around 0.22°C, in fact, over the last 150 years.
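To make the idea of detrending concrete, here is a toy sketch – using synthetic numbers, not real SST data – in which a slow warming trend of roughly the size mentioned above is superimposed on a 60-year oscillation, and then removed by fitting and subtracting a straight line. This is, in essence, how a detrended index of this kind is produced:

```python
import numpy as np

# Toy SST series: a 60-year oscillation superimposed on a slow
# warming trend of ~0.22 °C per 150 years (synthetic numbers).
years = np.arange(1856, 2009)
trend = 0.22 * (years - years[0]) / 150.0
oscillation = 0.25 * np.sin(2 * np.pi * (years - 1856) / 60.0)
sst = trend + oscillation

# Detrend: fit a straight line through the series and subtract it.
# What remains is the AMO-style index, oscillating around zero.
slope, intercept = np.polyfit(years, sst, 1)
amo_index = sst - (slope * years + intercept)

print(round(float(amo_index.mean()), 6))   # ≈ 0 by construction
```

Because the fitted line soaks up the slow rise, what remains swings between positive and negative phases around zero, just as in Figures 2 and 3; the published AMO index is derived in much the same way from area-averaged North Atlantic SST records.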

This is not to say that, in the North Atlantic, many of the effects of global warming won’t be reversed over the next two or three decades as the AMO goes through its cold phase. At its lowest point, for instance, the Arctic ice is likely to return to more or less what it was like during the eighties and nineties. When it returns to its warm phase, however, the melt is likely to be even more severe and extensive than it has been during the last ten years. This is because the AMO both exaggerates and disguises global warming. During the warm phase, it exaggerates it. Thus, over the last ten years we have been constantly shown pictures of ice melting and polar bears losing their habitat. This, we have then been told, is a consequence of global warming. But this is not true. It has rather been the consequence of the AMO being at or close to its high point. Ten or fifteen years from now, however, the ice will have returned, and people will then say that global warming was just a load of nonsense. Again, it won’t be true. Global warming will simply be disguised by the cold phase of the AMO. But it may be hard to convince people of this.

In fact, this may cause both climatologists and politicians some quite considerable problems. If so, however, they’ve only got themselves to blame. For at no point during the last fifteen years did they actually tell us that most of the Arctic ice melt was due to the AMO. They may have thought that this would confuse people or send a conflicting message; but treating people as mushrooms is only ever a short-term strategy. Eventually they find out, usually in ways that cause more problems than would have been caused by telling them the truth in the first place. And in this case, the effect is likely to be even more exaggerated, in that the policy of disclosing as little actual scientific information as possible and relying largely on emotive images and polemics to get the message across has rendered the entire climate change debate essentially irrational. Instead of being given unequivocal, empirical evidence and a clearly argued case, people who question global warming, and in particular its man-made origin, are simply branded climate-change deniers and are bracketed, both morally and intellectually, with holocaust deniers. The backlash, when it comes, therefore, is likely to be all the more violent.

The problem for both climatologists and politicians, however, is that there doesn’t seem to be any way out of this potential mess. For the simple fact is, of course, that unequivocal, empirical evidence for man-made global warming doesn’t actually exist. The accumulation of greenhouse gases in the atmosphere is simply the best theory anyone has so far come up with to explain that part of global warming for which it is otherwise impossible to account. For global warming is a fact. Over the last century, the Earth’s average temperature has increased by approximately 0.6°C, only 25% of which, or 0.15°C, can be attributed to other identifiable causes, predominantly the sun. That, therefore, leaves us with 0.45°C unaccounted for. And given the way greenhouse gases in the upper atmosphere reflect radiation back down to the earth’s surface instead of allowing it to escape into space, this is an explanation that certainly fits. But it’s not proof. It is a piece of inductive reasoning. And as Sir Karl Popper taught us, conclusions drawn from inductive reasoning can never be verified, only falsified. At this point, in other sciences, experiments would therefore be set up to test the hypothesis. But climate science doesn’t really lend itself to experimentation. It has to rely on simulation, using mathematical models, which in all realms of science have often proved to be somewhat unreliable. One only has to miss out or underestimate the influence of one variable and one’s forecast can end up very wide of the mark. It would be interesting to know, for instance, whether our current climate change models anticipated the current solar minimum and whatever long-term effects it may yet have. Indeed, it would be interesting to know whether solar activity is even factored into the equation. The one thing of which I’m sure, however, is that no climate scientist is ever going to tell us.
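The attribution arithmetic above is simple enough to set out explicitly, using the round numbers quoted in the paragraph (which are themselves only estimates, not an independent calculation):

```python
# Round numbers from the text above (estimates, not measurements):
total_warming = 0.6          # deg C rise over the last century
other_causes_share = 0.25    # fraction attributed to the sun and
                             # other identifiable causes

attributed_elsewhere = total_warming * other_causes_share
unexplained = total_warming - attributed_elsewhere

print(round(attributed_elsewhere, 2))  # 0.15 deg C
print(round(unexplained, 2))           # 0.45 deg C: the residual the
                                       # greenhouse theory is invoked to explain
```

The point, of course, is that the 0.45°C residual is inferred rather than measured: it is the remainder after subtracting what can be attributed elsewhere, which is precisely why the greenhouse explanation, however well it fits, remains an inference and not a proof.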

Given the nature of the science, of course, this defensiveness is quite understandable. People want black and white answers where there is only the balance of probability. And it is this that places climatologists in such a dilemma. For if they explain to people the real nature of the science, a lot of people, particularly the more aggressive type of TV journalist, will then say that it is therefore just a theory, that there’s no proof. As a result, people will then be far less likely to act on it. Tell them that anthropogenic global warming is a fact, on the other hand, and we might just save the planet. The only problem is that, strictly speaking, it’s a lie. It also leads to further lies: lies of omission – things one cannot say; explanations one cannot give – so that when the country is covered in snow, and we are experiencing the worst winter in decades, climatologists are nowhere to be found, and the BBC is left showing endless hours of abandoned cars and people waiting in queues for Eurostar. Perhaps, therefore, it’s time to stop treating people like mushrooms, to tell the truth and open up a proper debate. At least we might get some proper journalism from our 24-hour news services. Or is that just wishful thinking?