Thursday 11 May 2023

Freedom & Identity

1.    The Illusion of Choice

It may be supposed – and may even be true – that a good many people spend a large part of their lives trying to find out who they are: their identity. Many others – I suspect significantly more – spend just as long trying to escape an identity forced on them by society. What is particularly unusual about these two desires or needs, however, is that, while they may pull us in opposite directions, they are not incompatible. In fact, in many cases, each may be a necessary condition of the other. For in order to discover ourselves, we may first have to cast off the shackles of race, sex or social position with which society has burdened us; while to escape these pigeonholes we may first have to discover our true selves.

In fact, one can look at second wave feminism in very much this way. For in order to discover their full potential, many women felt that they first had to escape the roles of wife and mother with which society had traditionally saddled them; while to liberate themselves from these bonds, or even to perceive this as a possibility, they may first have needed to conceive of an alternative which more strongly called to them as the possible fulfilment of who they truly were.

Not, of course, that everyone wants to escape the roles and identities assigned to them. In fact, for many people, the role they have been given in life provides them with a sense of certainty and security without which they’d be lost, thereby making our preferences in this regard a fairly strong indicator of who we really are. The problem with this, however, is that whether we would prefer a defined place in an immutable universe or the freedom to ‘find’ ourselves in a world of infinite possibilities is itself subject to a highly disparaging value judgement, especially today, when we tend to think more highly of those who have the courage to be themselves than of those who merely conform to social expectations. This makes it a very rare person, indeed, who would openly admit to wanting a safe and orderly life, and demonstrates just how strong the social pressure can be on people to conform to current ideals, even if these are not actually recognised as such.

That they are not recognised as such is due, of course, to the fact that we assume that the desire to break free of social norms and hence discover who we truly are is entirely natural and cannot itself therefore be a social norm. Prior to the second world war, however – and, indeed, throughout most of our recorded history – far from being viewed as a natural human aspiration, the desire to break free of social norms would have been regarded as highly delinquent, while the idea of discovering who one truly was would have been almost unintelligible. After all, who one was was defined by one’s place in society, which provided one with certainty and security as a reward for diligently fulfilling the responsibilities that went with it, making the abdication of those responsibilities in order to fulfil oneself an act of folly bordering on madness.

Looked at from this historical perspective, therefore, far from being innate, this whole quest to discover ourselves may be better seen as a largely historical phenomenon: the expression of our collective desire for change in the post-war era. Since it is founded on what is essentially a wish, however, this then raises the question as to whether such self-discovery or self-realisation is actually possible and not just an idealistic dream based on a profound misunderstanding of the human condition.

I say this because, within any social environment, most individuals exert far less influence on that environment than they are influenced by it. This is even true with respect to such loose and informal associations as groups of friends, which, in their synchronicity, can be seen to behave very much like murmurations of starlings: those vast flocks of birds which gather in the autumnal twilight, darting one way and then the other, continually morphing into different shapes, before settling on a particular roost for the night. So instantaneously coordinated are these flocks, in fact, that one could easily imagine them to be organic, with a life and intelligence of their own. At the very least, they would appear to have the ability to communicate and make communal decisions. This effect, however, is thought to be caused by each bird responding to the birds closest to it – by some accounts through subtle changes in the air pressure on its wings as a result of its proximity to others. While each individual bird thus influences the birds around it, not only is this influence minuscule when compared to the influence of the flock as a whole on each individual bird, but no individual bird ever actually chooses where the flock is going to roost. Agency resides entirely within the flock itself.

And something very similar is true with respect to unstructured, non-hierarchical and hence undirected groups of human beings. Although each individual within the group will naturally affect those around them, without an accepted leader – which, of course, is an important proviso – it is the group collectively that exerts the stronger influence, as is particularly well illustrated in the case of tribal subcultures.

When I was growing up in England in the early 1960s, there were two prime examples of such subcultures: the ‘Mods’ and the ‘Rockers’. Mods rode Italian motor scooters, wore two-tone mohair suits and listened to The Who, while Rockers drove motorcycles, wore leather jackets and listened to John Mayall’s Bluesbreakers featuring a young Eric Clapton. As it was a long time ago and I was very young back then, it is possible, of course, that the three characteristics I have chosen to distinguish each group may not be entirely accurate. The more important point, however, is that it is also very unlikely that anyone who was actually a Mod or Rocker at the time actually chose any of the characteristics which did, as a matter of fact, characterise the subculture in question. Yes, there would have been those who influenced a particular subculture more than others – highly visible rock stars and leading fashion designers, for instance – but not only would it be impossible to identify any single guiding intelligence behind either of these two cultural phenomena, but I strongly doubt whether anyone can really say how they started, why they developed in the way they did or why they came to an end.

Of course, it will be argued that the reason why people didn’t choose the aesthetic characteristics of the subculture of which they were a part is because they actually chose their subculture because of its already existing aesthetic. Not only does this not explain how the subculture got started, however, or how its particular aesthetic developed – requiring it to be fully formed from the outset in order to attract adherents – but it assumes that people consciously made a choice with respect to the subculture with which they identified and could therefore have chosen differently. It is far more likely, however, that in the vast majority of cases the choice would have been almost entirely determined by the social group of which the potential Mod or Rocker was already a part.

To put a little more flesh on this bone, imagine, for instance, that you grew up in a household in which your elder brother played classic blues albums and was forever tinkering with his motorbike while you stood by handing him his tools. Suppose, too, that you lived in a working-class neighbourhood where very few people ever wore suits except to go to weddings, and where most of your friends were saving up to buy their first leather jacket. In such circumstances, the chances of you becoming a Mod, I suggest, would have been extremely remote. This is not because you might not have preferred listening to The Small Faces and riding around town on a Lambretta – though I rather doubt it – but because the way in which you dressed, the type of two-wheeled vehicle you rode and the kind of music to which you listened would not have been chosen for themselves alone. Indeed, they would not have been chosen for themselves at all, but as symbols of the tribe with which you, your brother and your mates all identified, such that were you to have abandoned these tribal colours or, worse still, chosen rival colours, this would have amounted to an act of apostasy that would have lost you your entire social world.

Thus, not only did very few people in the 1960s make any significant contribution to the aesthetic characteristics of their tribal subculture, but most of them didn’t even choose their tribal subculture in the first place. What’s more, one’s tastes in clothes and music were not the only aspects of one’s life that were effectively controlled by one’s tribe. For in addition to an aesthetic, most subcultures also have an ethos. In fact, the ethos of the Mods and Rockers was more or less the same; it was just expressed differently. One of its clearest articulations, however, is to be found in the lyrics of The Who’s song ‘My Generation’, in which the perceived chasm between the war-time and post-war generations, along with the scathing contempt in which the latter held the former, is most cogently expressed in the lines ‘Why don’t you all fade away | And don’t try to dig what we all say,’ the implication being that the two generations have become so alienated from each other by their respective values that communication is either impossible or pointless.

What this most tellingly reveals about the Mod culture, however, is that while The Who’s angry repudiation of the past is significantly more aggressive and hostile than the mere desire to break free of social norms and discover oneself with which we began, both of these expressions of our post-war culture essentially stem from the same collective desire for something better in the post-war world. Given the fact that we have a fair idea as to how the Mod subculture arose – not through anyone making choices, but simply as the result of a human form of murmuration – this may also therefore tell us something about that broader culture of self-discovery in which the Mod subculture, being part of the same milieu, was, of course, embedded.

Even more significantly, it also tells us something more about the social dynamics within any social group that is culturally defined. For one of the most important implications of the fact that the adherents of such a culture do not choose either its aesthetic or its ethos is that they cannot reject them either, or not without ceasing to be a member of the culture in question. This therefore places enormous social pressure on those who identify with a particular culture to conform to its norms, making the Mod subculture, for instance, along with the wider culture of self-discovery, essentially no different from what they were explicitly rejecting.

2.    The Tyranny of Ideas

This idea that one can be compelled by social pressure to conform to a rebellion against conformity may, of course, seem a little contradictory. It is, however, one of the most salient and significant characteristics of social groups based on a cultural identity that they are able to control the attitudes and beliefs of their adherents no matter how mad or bad these attitudes and beliefs may be. In fact, if they did not, they would simply pull apart or would never have existed in the first place. While it is essential to the existence of these groups that they are able to exert such control, therefore, throughout most of our history this ability has had consequences far more dire and destructive than anything with which the subcultures of the 1960s were associated.

This is largely because the latter were primarily based upon their aesthetic rather than their ethos. Indeed, it could even be said that their ethos was simply part of their aesthetic – something that was certainly the case with respect to the Punk movement of the 1970s, which made transgressions against both the aesthetics and social norms of the prevailing culture a central element of its own aesthetic.

The situation is very different, however, in the case of subcultures in which the ethos is the main driving force. These are most commonly religions or political ideologies in which the systems of belief on which they are based are regarded as far more important than any mere fashion in music or dress. This also tends to make them far more enduring, not only because of their greater substance, but because, being based on ideas, they are subject to discussion and reasoned argument which can then lead to division.

If this sounds somewhat contradictory, it is because we generally think of divisions within a society as inimical to that society’s long term survival, as indeed they usually are, not least because the sects or splinter groups which result from the vast majority of schisms usually tend to then splinter even further. There are, however, two conditions which, either separately or in combination, can enable a breakaway group to survive. The first of these is the emergence of a powerful leader or leadership group which authoritatively establishes a fixed or settled set of doctrines which it then rigorously enforces, thereby preventing, if not apostasy – in that anyone can always leave – then certainly further fragmentation. While less common, it is the second possibility, however, that is the more insidious, in that it occurs when the ideas which make up a belief system actually trap adherents within that system such that they cannot break free of it: a situation which, in itself, therefore, tends towards authoritarianism.

Prime examples of the latter can be found among the various Congregational, Reformed and Presbyterian churches which emerged during the 16th and 17th centuries based on the teachings of John Calvin, the most definitive and divisive of these teachings being the belief in predestination: an idea somewhat casuistically derived from the assumption of God’s omniscience. The argument was that if God knows everything, both past and future, He already knows whether any given individual is among the saved, and therefore going to heaven, or among the damned and headed for hell, with the further consequence that everyone’s fate is thus already determined regardless of what they do in life.

Not only is there something counterintuitive about this, however, but it is also suspiciously paradoxical. For if God already knows that you are headed for heaven, it is because He knows that He is going to admit you. Unless He is prone to acting gratuitously – which I strongly suspect Calvin actually thought He was – it follows, therefore, that God also knows that you are going to fulfil all the necessary conditions for gaining admission to heaven. The problem is that, while, on the one hand, we can therefore say that your fate is already determined, on the other, it surely cannot mean that you don’t have to do anything. For you still have to fulfil all the necessary conditions for heavenly admission. This would therefore suggest that everything is not yet fully determined. For just because God knows that you are going to do something doesn’t mean that you don’t have to put all your efforts into actually doing it.

This, however, is precisely how most Calvinists seem to have interpreted the doctrine, believing that you are either among the Elect or you are not and that there is nothing that you can do about it either way: an idea which totally shocked and appalled most other Christians, especially Catholics, not least because it is a central tenet of Catholicism that no one is beyond redemption and that the sinner who repents is as welcomed into the Kingdom of God as the saint who has always lived a blameless life. The sinner who repents may, of course, have to demonstrate his repentance by undergoing penance, which, by the 16th century, typically meant donating large amounts of money to the Church. The important point, however, was that there was still, nevertheless, something one could do oneself in life to ensure one’s salvation.

This was also the view of Martin Luther, whose main objection to the Catholic doctrine was that the acceptance of donations in lieu of penance led to corruption. In fact, the main reason he published his ‘Ninety-five Theses’ in October 1517, thereby beginning the Reformation, was because Albrecht von Brandenburg, Archbishop of Mainz, had recently commissioned a well-known ‘pardoner’, Johann Tetzel, to sell indulgences in his diocese, the proceeds of which were to be used both to help pay for the rebuilding of St. Peter's Basilica in Rome and to pay off his own debts. With the corruption thus clear for all to see, Luther’s answer to the problem was therefore to argue that salvation was not to be obtained through penance at all, but solely through faith, both in Christ’s teachings, which the faithful, being the faithful, would naturally follow, and in God’s mercy and forgiveness of our inevitable, if hopefully only occasional, lapses.

Whether through penance or faith, there was thus something which followers of both these doctrines could do themselves to earn salvation, making the idea that their fate was already decided not just the cause of some consternation but of revolt. After all, most people like to think that they still have time to put things right in their lives: to make amends for their mistakes and the hurt they have caused to others. To be told that they haven’t and that there is nothing they can do therefore seemed so unfair. It was as if the chance to put things right had been taken away from them, as if God, Himself, were denying them that chance, thereby rendering God, Himself, unjust.

If other Christians were morally revolted by the idea of predestination, however, it was the effect of this doctrine on Calvinists, themselves, that was the most pernicious. For if one truly believes that one’s eternal fate is already determined, there are only four possible ways in which one can respond to this fait accompli, only one of which – resigned acceptance, combined, perhaps, with a pinch of hope and a great deal of courage – does not lead to, or is not, in itself, a form of insanity.

That there could be a response to predestination that is, in itself, a form of insanity may, of course, seem somewhat odd, especially when I tell you that the response in question is that of those who believe, without doubt, that they are among the Elect and therefore headed for heaven, in that this group would appear to be in the most fortunate position. The trouble is that this belief entails another unequivocal belief: that one is actually worthy. After all, unless one believes that God acts gratuitously, one cannot believe that one is headed for heaven unless one also believes, with absolute certainty, in one’s own righteousness, a conviction which, in most people, requires what can only be described as a quite remarkable lack of self-awareness.

I say this because it is an extremely rare man or woman who can honestly say that they have never hurt anyone. Being human and more selfish than we usually care to admit, we have all said and done things which, to our everlasting shame and regret, we know have caused others pain. And while we try not to think too much about the wrongs we have done, especially to those we love, they still haunt us, constantly reminding us of our weaknesses and failings and making it a very rare person indeed who can therefore view the prospect of standing before God without some degree of trepidation.

Given that genuine saints are probably quite rare, it is reasonable to assume, therefore, that the vast majority of Calvinists who went to meet their maker in the absolute certainty that they were saved did so on the basis of an entirely deluded view of themselves, one that would seem mildly comic if one of the corollaries of such poor self-knowledge were not a troubling lack of awareness of others as sentient beings: what Martin Heidegger called Dasein, meaning ‘being there’. I say this because, as I explained in ‘The Eyes of Another, Self-consciousness & Morality’, our awareness of others as Dasein is critical to the development of our moral consciousness. For it is only when we see the look of hurt or disappointment on another’s face, or the disapproval in their eyes when we have done something wrong, that we are forced to reflect upon our behaviour and not just confront our failings but commit ourselves to amending them. Someone who is unaware of their failings, therefore, is very unlikely to have ever been forced to look at themselves in this way through another’s eyes – making it equally unlikely, therefore, that they actually see others as seeing them.

Nor is this a trivial failing. For if one does not see others as Dasein, it follows that one can only see them as objects. And if one only sees others as objects, it follows that one is under no moral constraint with respect to how one treats them. If one does not see others as people, one will have no qualms about killing them, for instance. Thus, while this particular form of personality disorder causes no distress or discomfort to those suffering from it, it makes them very dangerous and, once recognised, rather frightening.

The good news is that they are also probably quite rare. This is not so, however, with respect to those who suffer from the second form of insanity to which a belief in predestination can give rise. This is the response of someone who has doubts about their chances of being admitted into heaven but tries to convince themselves that they are saved nonetheless, usually by disciplining themselves to be more strict in their ‘Godly’ behaviour than most people would deem reasonable, which, of course, it is not. For if one truly believes that one’s fate is already decided and that no amount of austere self-discipline is going to change it, the only reason one can have for acting in this way is to try to reassure oneself by implicitly holding on to the belief that if one is sufficiently righteous then one surely cannot be damned.

The problem with this argument, however, is that it is clearly specious. For while righteousness may be a necessary condition for admission into heaven, in that no one who is not righteous is admitted, it is not a sufficient condition, in that not everyone who is righteous is given admission. Indeed, if this were not so, if everyone who was righteous were automatically admitted, this would take the decision out of God’s hands, something which Calvin explicitly rejected and which all educated Calvinists would have known. No matter how hard someone tried to convince themselves that their righteousness was a sign that they were saved, therefore, they would have known, deep down, that they were deceiving themselves.

Nor does the problem end there. For it is a distinctive characteristic of self-deception that it is always multi-layered, the primary layer in this case being the sufferer’s reluctance to contemplate his doubts about being saved for the simple reason that this would entail him also thinking about the fires of hell and the prospect of being burnt alive for all eternity: something he certainly doesn’t want to think about. So he develops this strict regime of righteous behaviour which allows him to convince himself that he is saved even though he knows that the belief upon which this conviction is based is entirely false. Not only does this add yet another item to the list of things he cannot allow himself to think about, however, but it also makes him liable to react violently if anyone tries to make him do so, typically by trying to reason him out of his increasingly irrational behaviour, thereby threatening to bring down the entire edifice of his carefully constructed defences.

Needless to say, this also makes him rather difficult to live with, particularly as he will almost certainly have imposed his own obsessively strict regime on those around him, especially his family, whose every act of indiscipline, irreverence and impiety he will see as reflecting upon himself, making him doubt once again the assuredness of his salvation. For how can one be saved if one’s family is damned – especially one’s children, who are, after all, an extension of oneself, and whose every expression of natural spontaneity and childish exuberance will therefore be seen as wickedness that has to be beaten out of them?

Apart from the sheer cruelty of it, the problem with this, however, is that, if a child is told from a very early age that he or she is wicked and is going to hell, there is a very good chance that eventually they will come to believe it, especially as they enter puberty and start experiencing sexual desire, putting their sinfulness beyond all doubt and very possibly causing them to fall victim to the third form of insanity to which the doctrine of predestination can give rise.

According to 16th century visitors to the hospital of St. Mary of Bethlehem in London, better known as Bedlam, this third form of doctrinally induced mental illness typically proceeded in three phases. The first phase was often regarded as merely an instance of adolescent religious fervour, the only sign that the sufferer was experiencing any kind of psychological distress being that they started spending an inordinate amount of time in prayer, often actually gaining praise for their piety as a result. Gradually, however, they would become increasingly withdrawn, answering questions monosyllabically or not at all, neither eating nor washing, and becoming more and more listless. In some cases, they would also give way to outbursts of violence, although this was nearly always directed against themselves. Eventually, however, they would decline into a kind of catatonic torpor in which they just sat staring into space, not even seeming to notice when they soiled themselves. It was at this stage that, if their families could afford it, they were committed to the hospital, where the attendants would force-feed them and occasionally sluice them down with buckets of cold water.

If they survived this, the end then usually came as a result of further outbursts of violence, again always directed against themselves, either in the form of self-mutilation, which frequently led to infection and death, or actual suicide, which for many, trapped in this living hell, may have seemed like the only way to escape their suffering, even though they would have believed, of course, that death would lead immediately to that which, in their imaginations, so tormented them.

The real problem, however, was the doctrine of predestination, itself, which, once people came to believe in it, left them no way out. They could, of course, have just walked away. But if one believed that only the Elect were saved, such apostasy, in itself, would have confirmed one’s damnation. Thus, once one had internalised this insidious belief system, there was no way to ever break free of it.

3.    Identity and Ideology

Given the pernicious nature of any system of beliefs that actually traps people inside it, one might be tempted to assume, therefore, that an ideology simply imposed on a population by an authoritarian regime would be infinitely preferable. One’s behaviour might be controlled but one’s mind would still be free.

Unfortunately, things are not quite that simple, not least because cultures based on systems of belief – even ones as insidious as Calvinism – are not actually that strong. This is because it is only those who are closely involved with the belief system – a Presbyterian minister and his family, for instance – who tend to be the real zealots. A Scottish farm labourer in the 16th century may have identified with his Kirk, but more as a cornerstone of his culture and community than as the embodiment of a particular theology. Indeed, the mere fact that Kirk ‘sessions’, which acted as both local parish councils and courts, were regularly forced to mete out punishments for breaches of the Presbyterian moral code suggests that the religion of many Kirk members didn’t run much deeper than the dark, sombre clothes they were required to wear as part of the Presbyterian aesthetic.

In fact, it is an almost universal truth about human beings that we care far less about ideas than the practical concerns of our daily lives. This, however, is a serious problem for anyone trying to establish a group identity and allegiance based on an ideology. For if, for the vast majority of the population, their belief in this ideology is only skin deep, then unless one can ally the ideology with something to which the population bears a greater allegiance, one will quickly discover that the only way to ensure ideological conformity is through coercion and force, as, indeed, many of those who attempted to establish various forms of socialism in Europe at the beginning of the 20th century eventually found out, much to their own surprise and the great distress of the populations upon which they conducted their ideological experiments.

This was partly due to the fact that, being intellectually committed to their ideology, the zealots who drove these movements seem to have known very little about how ordinary people actually thought and behaved. At the beginning of the first world war, for instance, many socialists believed that working class soldiers on both sides would quickly realise that they had more in common with each other than with those leading them and that this would eventually result in an internationalist socialist uprising. What they failed to understand, of course, was that ordinary soldiers identified far more with those who shared the same cultural background and spoke the same language than with any abstract construct, especially that which intellectuals referred to as the ‘proletariat’.

This failure to understand ordinary human beings was also responsible for a lot of the divisions within the early Soviet Union, with idealists such as Leon Trotsky advocating not just permanent revolution but permanent revolution on an international scale, while realists like Joseph Stalin recognised that, if a communist state was ever going to be securely established, it had to happen in a country which had already undergone a communist revolution, i.e. Russia. What this also meant, unfortunately, was that it would have to happen in the context of some distinctly Russian problems. Not the least of these was the fact that Marx had envisaged that the revolution would take place within an already industrialised economy, such as that of Britain or Germany, and that all that would be needed for the transition, therefore, would be the transfer of the ownership of the means of production from the former capitalist owners to the socialist state – something which he could not imagine the workers not welcoming, let alone resisting. Before the first world war, however, Russia was one of the least industrialised countries in Europe, with a still largely agrarian economy, which meant not only that Russia’s communist revolution also had to be an industrial revolution but that it would have to deal with some specifically Russian rural issues.

The first of these was the fact that, in addition to a tier of subsistence farmers, who, while they were able to feed themselves, contributed very little to the economy at large, Russia’s pre-war agricultural industry consisted of two main sectors. The first comprised the traditional large estates of the landed aristocracy described by Tolstoy, which were populated and worked by people who were still little more than serfs. As such, this sector did not constitute a problem for the new communist regime in that, once brought under state ownership, all that was necessary was a change of management. After the attempted revolution of 1905, however, the then Prime Minister, Pyotr Stolypin, had instituted a series of land reforms aimed at reducing disaffection amongst the peasantry while creating more incentivised and productive farmers. This he did by awarding small farms of up to 8 acres or 3.2 hectares to those deemed capable of running them profitably.

Ambitious, hardworking and only recently emancipated, these small, independent farmers, known as Kulaks, meaning ‘tight-fisted’, quickly established themselves as the most efficient part of Russia’s agricultural economy, producing far more per hectare than the large estates. By that very same token, however, they were never going to subscribe to the communist ideology, especially when, as a condition of keeping their farms, they were forced to sell their surpluses to the state at prices which were well below previous market levels. The result was that their productivity immediately fell, causing Soviet officials to accuse them of holding back food or selling it on the black market, which, of course, was perfectly possible. On the other hand, it is just as likely that they had simply become demotivated. The upshot was, however, that by the beginning of Stalin’s first five-year plan, in 1928, any patience which the Soviet regime had previously had with the Kulaks had finally run out, and the decision was made to ‘dekulakize’ the agrarian economy by confiscating the Kulaks’ land and herding them into new collective farms which were little more than slave labour camps.

The destruction of what had been the most productive part of Soviet agriculture, however, caused production to fall once again, and to continue falling in what turned out to be one of the most murderous and devastating vicious circles in world history. For Stalin’s priority, of course, was industrialisation. This meant that whatever food was produced went first to urban areas to feed industrial workers, leading to shortages and eventually famine in the countryside. Between 1930 and 1933, it is estimated that between 5.7 million and 8.7 million people died of starvation in different parts of the Soviet Union, including the Northern Caucasus, the Volga Region, Kazakhstan, the South Urals, Western Siberia and, of course, Ukraine. What makes this whole episode so appalling, however, is that no one seems to have worked out – or dared tell Stalin – that the more people who died of hunger on collective farms, the fewer people there were to work the land and the less food was consequently produced.

Eventually, of course, the devastation caused, not just by this idiotic policy of depriving farmers of food, but by Stalin’s whole first five-year plan, led to plots in Moscow to remove him. All this did, however, was precipitate the great purge of 1936 to 1938, in which around 700,000 people were killed in one way or another, as Stalin fought to stay in power and keep his particular vision of communism alive. For it is quite possible that the Soviet Union could have collapsed or torn itself apart at this point had it not been for the sheer scale of the terror Stalin unleashed and the massive favour done to him by Adolf Hitler. For ultimately, of course, it was not the communist ideology that held the Soviet Union together and cemented Stalin’s power, but the German invasion, which resulted in the Great Patriotic War, in which another 20 million Russians were to die but which not only accelerated Soviet industrialisation, turning the Soviet Union into the superpower it was to become, but united its people in defence of Mother Russia.

4.    The Power of Tribalism

That it was therefore nationalism rather than communism that really made the Soviet Union a success – albeit for a very limited period – should not, of course, come as a surprise. After all, as pack animals, it is our tribal allegiance, whether this be to our family, our clan, our village, our region or indeed our country, which, throughout most of our history, has constituted our strongest identity. Nor should it come as a surprise, therefore, that both of the other two attempts to establish socialist states in Europe during the first half of the 20th century sought to tie socialism to some form of nationalism from the outset, the first of these attempts taking place in 1922 when Benito Mussolini seized power in Italy.

That I describe this as an attempt to establish a socialist state will, of course, surprise many people. After all, Mussolini was a fascist – indeed, he invented the term – and fascists are generally thought to be ‘right wing’, whereas socialists are regarded as ‘left wing’. Not only are these terms somewhat empty, however, with no precise definition, but they are also a little misleading when applied within modern political debate. This is because they actually originated in the summer of 1789, during the French Revolution, when the French National Assembly sat to decide on a new constitution, with some members, those with more conservative leanings, wanting something like the constitutional monarchy already operating in Britain, while others, those with more radical leanings, wanted a republic. As the days and weeks of the debate went by, it then just so happened, for no particular reason, that those with the more conservative views started to sit together on the right-hand side of the aisle in front of the speaker’s chair, while those with the more radical views started to gather together on the left-hand side. There was nothing in this arrangement to suggest, however, that only the conservatives on the right were or could be nationalists, or that reformers on the left, like Mussolini, could not be.

In fact, before the first world war, Mussolini had been a member of the Italian socialist party and had believed, like many others, that the war would precipitate a universal socialist revolution. When this did not happen, he was not just shocked, however, but forced into the painful realisation that a new socialist world order could not be brought about on the basis of Marxist ideology alone. It had to be augmented with something else. The problem was that Italy had only been fully consolidated into a nation state in 1871 and was thus only fifty years old, which meant that most Italians still identified more with their regions than with the nation state as such, thinking of themselves as Milanese or Genovese, for instance, rather than as Italians. As a result, Mussolini could not augment Italian socialism with nationalism in the straightforward way in which the two were eventually bound together in the Soviet Union.

In searching for a solution, however, he found an extremely helpful collaborator in the Marxist Hegelian philosopher, Giovanni Gentile, whom he appointed as Minister of Education in his first government and who was the actual author of ‘The Doctrine of Fascism’, nominally attributed to Mussolini. In it, Gentile describes a self-sufficient corporate state ruled by ‘philosopher kings’ for the benefit of ordinary people: a bit like the Rome of Marcus Aurelius. Indeed, the invocation of ancient Rome runs all the way through Gentile’s political philosophy, the word ‘fascist’ itself being derived from the Latin word ‘fasces’, which refers to the bundles of wooden rods, usually surrounding an axe, which were carried by the guards or lictors who accompanied Roman magistrates in procession.

Similarly, Mussolini’s self-styled title of ‘Il Duce’ was derived from the Roman title ‘Dux’, which was awarded to Roman generals defending significant stretches of the empire’s borders. It was then taken up by the Holy Roman Empire, where it became ‘Duc’ in French, after which it was brought to England by the Normans, where it became ‘Duke’ in English. The purpose of all this pageantry, however, was not just to satisfy Mussolini’s vanity. It was rather an attempt to create an aesthetic based on Rome’s imperial past with which to augment the ideology of National Socialism in the hope that people would identify with it. The problem was that, while it was all very theatrical and entertaining, it did not gain very much traction outside Rome itself. The Milanese in particular were not impressed.

As a consequence, Mussolini then sought to align himself with two institutions with which all Italians identified: the family and the Roman Catholic Church, of which the family was by far the easier to woo, in that supporting ordinary working people was already central to his socialist programme. All he had to do, therefore, was emphasise this by making well publicised awards to mothers with especially large numbers of children. The Church, however, was a far more difficult proposition, not least because there was something in particular the papacy wanted in return for its support, something which, up until then, the Italian state had been very loath to grant it.

It concerned an issue that had arisen as a result of the unification of Italy in 1871, when the papacy had not only lost sovereignty over the Papal States, along with their tax revenues, but had found itself at the heart of a capital city it no longer controlled. Now that it found itself in possession of a significant bargaining chip, therefore, it was determined to obtain as many concessions from the new regime as possible, starting with the sum of 3,250,000 lire per year in compensation for its lost tax revenues. Even more significantly, it demanded independence for the Vatican as a separate nation state, which thus made it exempt from Italian taxes. In some very tough and protracted negotiations, which lasted until 1929, it even managed to include a clause in the treaty also making all businesses owned by the papacy and registered within the Vatican exempt from Italian taxation. What’s more, this applied no matter where the businesses operated. Over the next few decades, as a result, the Church made an absolute fortune buying up corporations all over Italy and registering them in the Vatican so that it could operate them tax free.

As a result, it is almost certainly the case that the Catholic Church gained far more from the Lateran Treaty than Mussolini ever gained from the Church’s grudging support for his government, especially when the second world war started to go against him and the Italian population was faced with a choice between his National Socialist government and a Roman Catholic priesthood that was far closer to the people than any government bureaucrat ever could be, with the result that, in the end, it was simply no contest: the people chose the Church.

In fact, apart from Stalin, who only belatedly stumbled upon the solution, the only other politician to successfully combine a political ideology – again socialism – with an identity to which his population already owed a far greater allegiance was Adolf Hitler. Hitler, however, had far more factors in his favour than either Mussolini or Stalin. For not only had Germany lost the first world war and a huge amount of national pride with it but the punitive terms of the Versailles Treaty had forced the Weimar Republic to print enormous amounts of money, both to pay reparations and to support a German population that was suffering from mass unemployment. In an economy which, without investment, was intractably stagnant, all this money printing did, however, was precipitate hyperinflation which wiped out whatever savings people had, thereby making the situation even worse.

Angry, bitter and short of even the most basic necessities, the German people were thus in a state in which they were highly receptive to any leader who promised to restore German pride, rebuild the country and exact vengeance on those who had betrayed it, both during the war itself and during the financial crisis which followed, enriching themselves at the expense of ordinary Germans. It didn’t even matter that those chosen to be the scapegoats weren’t actually responsible for any of Germany’s woes. In order to address the grievances of the ordinary German, it was enough that they came from a different ethnic and cultural background and were thus of a different tribe at a time when the new regime was creating a whole new German identity based on an aesthetic which combined an ideal of Aryan physical perfection with heroic German mythology, torchlight rallies set against the natural beauty of the Bavarian Alps, and endless parades of high-stepping soldiers dressed in immaculate black set against a backdrop of monumental buildings draped in equally massive red, white and black flags. If Mussolini had been theatrical, this was pure Hollywood, with a script written by Joseph Goebbels, cinematography by Fritz Lang and a score by Richard Wagner.

And it worked. Not only did it enable Germans to feel better about themselves than they had done in years, but it made it possible for them to accept the brutality with which, for the common good, all criticism of the new regime was suppressed, thus enabling those responsible to project their vision of a new Germany onto a world beyond its borders, thereby plunging that world into yet another war.

5.    A Murmuration of Righteousness

The real tragedy in all this, however, is that we learnt nothing from it, or learnt all the wrong things. Because the Soviet Union only discovered the usefulness of nationalism once it was engulfed in its Great Patriotic War and didn’t have the word ‘nationalist’ in any of its self-descriptions, we thought, for instance, that there was a fundamental difference between Communist Russia and Nazi Germany, when the only real difference was the choice of aesthetic, the Soviet Union having decided to depict Mother Russia as a Workers’ Paradise by producing posters and sculptures of heroic workers in overalls, while Nazi Germany went with the Ring Cycle.

Worse still, having decided that there was a fundamental difference between Soviet communism and Germany’s National Socialism, and that this difference lay in Nazi Germany’s overt nationalism, we then came to the conclusion that, instead of being a natural expression of the universal human instinct to ally ourselves with those with whom we identify, nationalism was itself a political ideology. What’s more, we also fell into the trap of assuming that because the Soviet Union and Nazi Germany were on opposite sides during the war, their ideologies, communism and nationalism, must also be in fundamental opposition. To cap it all, we then assumed that because communism was an ideology of the left – whatever that meant – and because we thought that all political ideologies had to sit somewhere on a one-dimensional spectrum from left to right, this meant that nationalism had to be an ideology of the right, thereby linking it to conservatism.

So caught up in this taxonomically incoherent way of classifying political ideologies were we that we didn’t even stop to consider that Britain and France were probably just as nationalistic as Germany and that the difference between the three countries could not therefore have anything to do with this particular aspect of their political makeup. Having overlooked this fairly obvious point, we were then precluded from looking at other aspects of the three countries’ political constitutions, such as the fact that in Britain, for instance, we took great national pride in the fact that we lived in a ‘free country’, in which people were free to pursue their own goals, speak their own minds and live their lives free of coercion and intimidation. In fact, the national pride we took in these hard-won freedoms and defining characteristics was one of the main things that kept us free, in that we were willing to come together to fight for them.

In both the Soviet Union and Nazi Germany, in contrast, where so many aspects of daily life were controlled by an all-pervasive state, national pride and patriotism almost always had to be instilled by a combination of indoctrination and fear. The indoctrination took the form of teaching everyone from an early age that they owed a duty to the Motherland or Fatherland and that their worth as a person and their place in society depended solely on how zealously they fulfilled this duty. This was then reinforced by equating the Motherland or Fatherland with the state, the party or, indeed, the ‘leader’, such that any betrayal of the state, party or leader was a betrayal of the Motherland or Fatherland and hence treason, which was punishable by death, as frequent exemplary demonstrations made clear.

Because we believed that nationalism was, in itself, an ideology, however, we completely failed to understand this fundamental difference in the way in which nationalism arises in an authoritarian state as opposed to a free country. Worse still, our failure to understand this difference then led to a belief that it was nationalism rather than statism that was the ideological threat to civilization. Despite the fact that the Soviet Union and, subsequently, Communist China continued to murder tens of millions of people on a systematic basis, it was nationalism we still therefore believed we had to eliminate to prevent another world war, with the result that in Europe, in particular, we even started the process of rolling back the nation state which eventually resulted in the creation of the European Union: a borderless confederation which was not only intended to facilitate trade and wealth creation, but to put an end to conflict in Europe by creating a European identity to which all its peoples could subscribe.

As Mussolini discovered to his cost, however, tribal allegiances do not come about in this way. The mere creation of a unified Italy didn’t mean that the Milanese stopped being Milanese and suddenly became Italians. And the problem was even greater for the EU. For while many people half-heartedly point towards Europe’s common cultural heritage in Greco-Roman antiquity and Christianity as a unifying force, not only is Christianity in steep decline in many European states but few people today read classical literature or philosophy. In fact, the only thing most Europeans have in common is that we like visiting each other’s countries because we actually enjoy our cultural and culinary diversity and don’t much care for the EU’s bureaucratic attempts to homogenise us.

This, however, has not been the only unfortunate consequence of this obsession we have with eliminating nationalism. Even more pernicious is our tendency to view our history through an anti-nationalist lens. This is particularly true in the case of those countries, like Britain and France, which have a colonial past. After all, when Britain ruled in India, didn’t we British think of ourselves as the Master Race? It is a question which, in itself, leads us to view all nationalism as essentially racist and concerned with dominating others, which, in turn, leads us to feel guilty, even though the reality of British rule in India was far more prosaic, being based on trade and the making of money, which was actually beneficial to both sides.

From 1853 onwards, for instance, British companies in India built the fourth largest railway network in the world, comprising around 42,000 miles or 68,000 km of track, which massively increased Indian agricultural and commodity production, much of which was then exported to Britain in exchange for other British-engineered goods such as the Royal Enfield motorcycle, the manufacture of which was subsequently transferred to India. Although some British officers, administrators and engineers who served in India may well have been racists, therefore, the fact that the relationship was so mutually beneficial and generated so much wealth meant that, for the most part, relations between the two sides had to be cordial and respectful or the flow of wealth would have simply dried up. And so, for the most part, that is what they were: cordial and respectful.

By viewing all history through the distorting lens of attitudes towards nationalism fostered by the second world war, however, we have effectively reduced history to a parody of itself in which all Europeans who set out across the oceans of the world to make their fortunes were unequivocally bad, while all those who suffered at their hands were unequivocally good. Worse still, this in turn has given rise to a bizarre ideology in which suffering oppression and having sympathy for those who have suffered oppression are the only virtues.

Nor is this confined to those who suffered beyond our borders. This new ideology also extends to those who have experienced oppression at home, most notably women, homosexuals and other minorities.

One of the effects of this, of course, is that, to some extent, it empowers the groups involved, which was not the case, for instance, with respect to women of previous generations who sought to break free of the shackles society imposed on them. I say this because, for the most part, women who strove to have careers prior to the second world war did so largely for themselves, not for all women. These were individual acts based on individual choices which implied nothing about the rights or wrongs of the constraints under which most women lived. Once one begins to see these constraints as a kind of oppression, however, not only does it make the breaking of them a moral imperative but it makes each success a moral victory to be celebrated by all members of the group.

Indeed, it’s one of the main reasons why so many people subscribe to this ideology: because such moral victories make us feel good. It’s that ‘punch the air’ while shouting ‘Yesss!’ feeling that footballers experience when scoring a goal, which not only makes us want to experience the feeling again and again, but makes us want to share it with others. In fact, sharing the feeling with those with whom we identify actually amplifies it, in the same way that a team celebration of a goal amplifies the jubilation of the scorer. This then results in the group continually wanting to score more such victories.

There are, however, two problems with this. The first is that the wrongs done to an oppressed group cannot be merely historical. For if one is going to score more moral victories over the oppressor, his offences must be ongoing. Maintaining this perpetual antagonism, however, comes at a considerable cost, both in terms of the emotional energy it requires and in terms of the social divisions it can cause.

This is particularly the case if the identity of the oppressed group is defined, not by gender, but by race. For while a woman may be offended by a man’s sexist attitudes, assumptions or behaviour, the two most common emotional responses to this are anger – which is usually short-lived, burning out quite quickly – and contempt: a downward-looking emotion which often actually causes us to laugh at those we find contemptible. This is not the case, however, if someone is offended by attitudes, assumptions or behaviour they deem to be based on racial prejudice. This is because the most fundamental cognitive element defining racism is the belief that someone of a different race is inferior in some way, thereby causing the person who holds this belief to look down on them with contempt or disdain. As I have explained more fully elsewhere, however, being looked down on in this way is one of the three necessary conditions required to incite hatred, a far more enduring emotion which can actually take over our lives, gnawing at our souls and preying on our minds as we brood on the slights we have received and the wrongs that have been done to us.

That is not to say, of course, that everyone who is regarded with contempt will automatically feel hatred towards those regarding them so. For as I also explained in ‘The Phenomenology of Hatred’, not only are there two further conditions that have to be fulfilled in order for hatred to be triggered but, even if all three conditions are met, we are still capable of suppressing the emotion, walking away and refusing to let our mistreatment get to us. The problem is that, if our identity is based on our membership of an oppressed group, and if this identity is maintained, even in part, because of the moral superiority it bestows on us, then ‘walking away’ is precisely what we can’t do. For in order to maintain this identity, we must continually experience further instances of our oppression, which, in this case, means further instances of being looked down on with contempt or disdain.

In fact, in order to keep this going, someone in this position may even need to seek out such instances, perceiving them in attitudes, assumptions and forms of behaviour in which the racism is so slight, trivial and unintended that most people wouldn’t even regard it as such. Even this, however, may be to the advantage of the person suffering from this disorder. For in order to maintain his or her perpetual antagonism, he or she may actually prefer it if the perceived instances of racism are so slight, trivial and unintended that most people wouldn’t even notice them, because this, in itself, reveals how unconscious racism within our culture is, thereby making the ongoing fight against it even more righteous.

While this may afford the sufferer some satisfaction, however, what it also demonstrates is that maintaining an identity based on membership of an oppressed group can actually condemn those trapped inside its labyrinth of specious logic to a lifetime of opposition and conflict, making the ideology which creates this tortuous state almost as insidious as Calvinism. Worse still, it has a similarly insidious effect upon those who are drawn into its web even though they do not gain from being able to feel morally superior as a result. For these, of course, are the oppressors, without whom an ideology based on the virtuousness of the oppressed could not exist. And who are these oppressors? Well, given the oppressed groups so far identified – women, homosexuals and racial minorities – they are, of course, white, heterosexual males who, within the ideology, are universally demonised as racist, sexist homophobes.

That’s not to say, of course, that those who adhere to this ideology necessarily characterise all white, heterosexual males in this way, especially not those they know personally. For when we know someone personally, we tend to see them as an individual rather than as a member of a class or group. What’s more, by seeing people as individuals we tend to notice that they’re all different and that while very few of them are unequivocally good, even fewer are unequivocally bad. For while we all have our shortcomings and failings, most of us have at least some redeeming qualities. It is only when we abstract away from this complexity and start to think of people, not as individuals, but as instances of types that we reduce them to a few defining characteristics about which we can then have unequivocal moral views which extend to all members of the group.

It will, of course, be pointed out that this is precisely what racists, sexists and homophobes do. Which is precisely why racism, sexism and homophobia are wrong. But by this same token, it is no less wrong to think of all white, heterosexual males as racist, sexist homophobes. More to the point, if racist, sexist homophobes exist, as I suspect they do, I also suspect that they are very rare, while the ideology that requires their existence – in that, without them, it could not itself exist – is now the dominant ideology in the western world, especially in government, education and the media, with the result that most white, heterosexual males cannot help but feel its effects.

In my own case, of course, I am lucky in that I’m old, as are most of the people with whom I mix. More to the point, our generation was never taught to think in this way. For younger white, heterosexual males, however, especially those who are active online, some sort of strategy for dealing with this new cultural reality is more or less essential.

One option, of course, is to defend oneself by arguing that no one actually chooses their skin colour, gender or sexual orientation and that, personally, one is scrupulous in ensuring that one never judges or treats anyone differently on the basis of their race, gender or sexual leaning. The chances are, however, that this argument will be met by the claim that it demonstrates, in itself, just how little self-awareness one has in not recognising that one’s white privilege, sense of male entitlement and toxic masculinity are, in themselves, the causes of oppression. What’s more, the fact that one has to work so scrupulously not to give offence merely reveals just how inveterate one’s racism, sexism and homophobia are.

An alternative strategy, therefore – much favoured by politicians, celebrities and others in the public eye – is not just to go along with the ideology, but to be an ardent advocate of it, not just saying all the right things, but strongly condemning anyone who strays from doctrinal orthodoxy. This, however, has two main drawbacks. The first is that, if one is only adhering to an ideology for the sake of appearances, one is essentially saying something that one doesn’t think or believe, which is another way of saying that one is lying. Of course, it may be that one doesn’t actually know what one thinks or believes on a particular issue because one hasn’t actually thought about it. Indeed, one suspects that a good many politicians fall into this category, espousing particular ideas not because they have thought deeply about the subject and have come to these convictions, but because their advisers or instincts tell them that this is what the public wants to hear. In this sense, therefore, one could say that they are not actually lying. By holding one’s opinions at this level of superficiality, however, not only does one risk becoming an empty shell, without any real opinions at all, but one also risks forgetting oneself, letting something slip and finding it splashed all over the internet, with the result that one is deplatformed and cancelled anyway.

Almost as bad, by not internalising the ideology one espouses, one can find oneself holding inconsistent beliefs with respect to it without realising that one is doing so until someone else points this out. What’s more, this is particularly true with respect to the ideology in question. For while it is generally thought to have had its origins in the Frankfurt School, with such writers as Erich Fromm and Herbert Marcuse, it has clearly come a long way since then and is still continually evolving, as the recent dispute between certain feminists and activist supporters of transgenderism quite clearly demonstrates. What is also clear from this dispute is that there is no single guiding intelligence directing it. For while there are millions of documents produced each year on related issues, mostly on the internet, not only are there no canonical texts, but having largely been developed collectively, it is neither curated nor fully defined. For it is, of course, a murmuration. And like all murmurations, developed by millions of contributors, each one influencing it but mostly being influenced by it, it is unregulated except by itself, with the result that inconsistencies and contradictions are more or less inevitable.

This then leads to the most serious problem of all. For unlike the subcultures of the early 60s, which, being based on various aesthetics, didn’t have to be consistent, this is an ideology which people take seriously. So seriously, in fact, that those who deviate from what is currently deemed to be politically correct are excommunicated. This means that strict adherence to the ideology is demanded while actually being impossible, resulting in a form of cognitive dissonance in which the adherent is forced to simultaneously hold contradictory beliefs, often tying themselves in knots trying to defend a manifestly untenable position, as Nicola Sturgeon, the former First Minister of Scotland, recently discovered with respect to the transgender debate.

While the spectacle of watching a senior politician trying to defend both sides of a contradiction has a certain comedic value, however, the fact that she either (a) didn’t recognise that she was contradicting herself or (b) didn’t understand that two contradictory propositions cannot both be true, or (c) regarded political correctness as more important than truth is a fairly damning indictment of the state of our culture and civilization, especially as such a large proportion of the general public seem to be in the same position. It also puts us in a very perilous state. For any society which becomes so divorced from reality that it cannot allow people to state what is manifestly obvious – that the males and females of its own sexually reproductive species have some fundamental biological differences – surely cannot survive for long.

Indeed, one is tempted to describe it, again, as a form of insanity which, as in the case of Calvinism, is most injurious to those who are most impressionable: the very young. For how do we imagine a young male is going to feel about himself if he is constantly being told that he is irredeemably evil simply for being a boy? Do we really imagine that he is going to be happy about himself, that he won’t try to be more feminine, or that, in his despair at ever being fully accepted, he won’t try to harm himself or be driven to the ultimate act of self-mutilation by having his genitals removed and declaring himself a girl? Indeed, isn’t this entirely predictable? And isn’t it precisely what is happening?

6.    Identity & The First Person Singular ‘I’

The most astonishing thing about all this, however, is not that we believe reality to be so malleable that we can be whatever we want and that we therefore have the right to choose – totally fanciful though these beliefs are – but that we see nothing wrong with a society in which so many people seem to be so unhappy about who or what they are, especially as, over the last decade or so, politicians from all parties have been simultaneously lamenting the increase in mental illness, particularly among the young. Instead of asking whether these two issues might not be related, however, and looking for possible common causes, all we have done is demand more public money for mental health, as if this were going to fix the problem.

What makes this even more astonishing, however, is the fact that possible common causes are not that difficult to spot. All one has to do is think about what changes have taken place in our society over the last two decades, especially among the young, with the emergence of two almost ubiquitous technologies quite clearly standing out. The first of these, of course, is the iconic smart phone with its integrated camera, specimens of which are hardly ever out of their teenage owners’ hands, allowing them to continually document their lives in words and pictures. The second and closely related phenomenon is the development of social media platforms on which these autobiographical chronicles are then posted, allowing their authors’ lives to be scrutinised by the entire world, though far more importantly, of course, by their own social group, with whom they identify and from whom they therefore need approval or approbation.

As with all the subcultures we have looked at to date, this approval or approbation is again delivered along two axes: the aesthetic and the ethical. Unlike the subcultures of the 1960s, however, the aesthetic of an online group has far less to do with its tribal colours – the clothes its members wear, the music to which they listen and the mode of transport they ride – than with personal attractiveness. This is partly because appearances are so important to both the online culture and the medium upon which it is based, and partly because today’s group identity is founded far less on a tribal aesthetic than on status, which personal attractiveness, and hence general popularity, bestows on those who then tend to group together in order to bask in each other’s reflected glory.

This then has an effect upon the ethical dimension of a group’s identity. For even though an individual’s position in a high-status group may be primarily based on personal attractiveness, most people like to think that their popularity is at least partly due to their being not just nice but good, in the sense of having all the right values. And while these values may be partly determined by the internal murmuration of the group itself, the group is inevitably influenced by the prevailing ideology of the wider culture. In order for an individual member to retain the approval and approbation of the group, therefore, it is essential that he or she espouse all the views central to this ideology, not just on race and gender, for instance, but on such issues as climate change, about which the young, in particular, take such an impassioned stance.

In fact, it is the need for people to continually signal their virtuousness on such issues that largely keeps alive the murmuration on which climate change alarmism is based. It is also what can make social media such a nasty environment at times. For while members of high-status groups need to continually evince all the correct attitudes and beliefs, there are no such constraints on members of lower-status groups, who therefore have the freedom to say more or less whatever they like, which can sometimes be excessive, gratuitous and unpleasant. This, in turn, gives rise to counterattacks from those who have been offended and who naturally regard the offenders as low-lifes, racists, misogynists, etc., thereby prompting ever greater calls for censorship and the curtailment of free speech.

Somewhat ironically, having a low status online can thus be said to set one free, while being part of a high-status group can actually be damaging to one’s health, in that it forces one to be ever vigilant with regard to what one posts in case one inadvertently says something to which someone else takes offence, the ever-present possibility of which is further exacerbated by the fact that one never really knows what someone else might find objectionable. Even without the further complications to which this can then give rise – which I shall explain shortly – this constant pressure, on its own, can generate the symptoms of what used to be called neurosis but is now more generally referred to as simple anxiety, the standard treatment for which, of course, is a wide range of drugs, including Selective Serotonin Reuptake Inhibitors (SSRIs), Monoamine Oxidase Inhibitors (MAOIs), benzodiazepines, tricyclics and buspirone, which collectively constitute some of the most commonly prescribed drugs in the UK today, especially to young people, thereby giving us some indication as to where all the extra money for mental health is currently being spent and who has been principally gaining from it.

What makes this fraud even more reprehensible, however, is the fact that, once one understands the problem, it is relatively easy to solve rather than merely treat. For as should be obvious, by continually chronicling our lives in words and pictures, we turn ourselves into objects for both ourselves and others to scrutinise, which has two consequences. The first is that we are inevitably dissatisfied with ourselves. For no one can stand this level of scrutiny. The second is that we therefore create an idealised version of ourselves for public consumption which we know to be a lie and which we therefore have to cover up with multiple layers of self-deception which, in turn, creates more anxiety, as we constantly fear that we are going to be found out.

The answer, however, is not to take drugs, but to stop taking ‘selfies’. Or rather, it is to stop being an object and start being a subject. Instead of being the one who is looked at, one has to become the one who is doing the looking. Instead of being the object ‘me’, one has to become the subject ‘I’, which, as Kant explained in ‘The Critique of Pure Reason’, is simply that which accompanies all my representations, in that it is ‘I who sees this’, ‘I who thinks this’, ‘I who remembers this’, while the ‘I’ itself has no attributes. It is merely a perspective, a logical point in space and time, a window onto the world: my window onto the world.

Moreover, the liberating effects of deobjectifying ourselves in this way and becoming pure spirit – or as pure a spirit as it is possible to be while retaining our corporeal existence – have been understood for millennia. Buddhist monks, for instance, shave their heads and wear identical saffron robes in order to make themselves indistinguishable from an objective perspective. Franciscan monks, or friars, did something similar, wearing brown habits and shaving the crowns of their heads while retaining fringes at the front, back and sides, thereby creating the very unflattering ‘tonsure’ as an antidote to vanity, the point being that they could not then take pride in either their fine clothes or their handsome looks.

Like Buddhist monks, Franciscans also took oaths of poverty, forcing them to live on charity, which some might argue made them both parasitic and selfish in that not everyone could live in this way, it being necessary for other people to grow the food which the monks ate. Not only did having to live on charity mean that the monks had to approach their fellow man with a greater degree of humility, however, but having no money, they were also forced to give something of themselves for their supper, taking the time to listen to their benefactors’ joys and woes, for instance, thereby learning more about the human condition. Even more crucially, society at large also got something out of this dialogue. For not only were people taught to be more charitable, sharing what they had, but they were reminded that our material existence is the least of what we are and that, whether or not one believes in life after death, ultimately our corporeal existence comes to an end, making our spiritual life the only thing that really matters.

Thus both the monks themselves and society as a whole gained from the existence of these institutions. They also both gained from the fact that, when people cease to view their objective attributes as what is important, they start to look more closely at their subjective attributes, especially the quality of their thought. For if all you really are is the ‘I’ that thinks, then what you think is of the greatest import, not in the sense that you should choose to adopt attitudes and beliefs that will endear you to others, which will only make you an object in your own eyes once again, but in the sense that your thoughts should be well thought out and your own.

In fact, of all the detrimental consequences of our self-objectification, a decline in the quality of our thought is probably the most pernicious. For when we do not think independently and critically about the world, when all our views are based on what others think or what others want to hear, not only do truth and honesty disappear from the world, but the very foundations of our existence are put at risk. For one cannot build a house on political correctness or grow a crop on wishful thinking. One has to know what one is doing. And for that, one has to be able to think clearly and critically. For any of this to be possible, however, we first have to become, once again, what we most truly are: the ‘I’ that thinks rather than the ‘me’ that is thought about.