Monthly Archives: January 2018

B is for belief

Belief n. 1 an acceptance that something exists or is true, especially one without proof
– a firmly held opinion
– a religious conviction
2 trust, faith or confidence in (someone or something)

It’s a funny old thing, belief. It’s very personal, obviously. At least it should be, in a free society. It should derive from one’s own worldview, one’s own take on things, not be imposed from outside by any sort of authoritarian, brainwashing body.

It can – ideally – be based on reason and rationality or simply be blind unreasoning faith with no concession to common sense, logic or scientific knowledge. It can be a passionately held point of view as a general outlook on things, or on a specific issue, such as laboratory animal testing or abortion. It can be (sometimes uncompromising and absolutist) religious faith. Or it can manifest as trust, such as trusting a politician’s promises, or confidence that friends or loved ones will help if you need it.

Let’s take a look at each of these categories in a little more detail:

Taking an extreme (and laughable) example of definition 1, there are still a few deluded people who are firmly convinced that the earth is flat just because, on the face of it, seen from ground level, it appears so. Never mind the indisputable observation that if you leave earth in a high-flying aircraft or space rocket, the higher you go, the more a curved, convex horizon becomes apparent – which would only happen, obviously, if earth were an orb, and not at all if it were a disc. And not to mention, of course, the pictures sent back from the moon or deep space; or the awkward point, from such a believer’s point of view, that a disc would necessarily have an edge over which some hapless earthlings might fall to their doom.

Or the unassailable fact that you don’t fall off any such edge: if you set off travelling westwards from, say, New York, you cross America, then the Pacific ocean, then reach and cross Asia, then Europe and finally the Atlantic ocean until you end up back in New York. That could only happen if the earth were an orb, and therefore proves it. It’s called empirical evidence. But hard-core flat-earthers will have none of it. No, they say, all the hard evidence blowing their ‘belief’ out of the water is explained by conspiracy theory. Their stubborn pre-Enlightenment worldview (literally) beggars – as it were – belief.

Or another only slightly less extreme example: climate change denial. Yes, I know, that’s believing a negative really. A scientist, an expert in his or her subject, one who has probably spent years studying, researching and thinking about it, will modestly present peer-reviewed findings for consideration. For them it isn’t a matter of unquestioning faith but of facts; of best current knowledge. Whereas a denier will ‘believe’ that it’s all a lot of nonsense and again, probably, all a sinister conspiracy by academics just to win funds and keep them in cushy jobs. How can the earth be warming, they say, when the winter of 2017/18 in the north-east United States was so bitterly cold? They cannot (or simply will not) accept the scientific consensus that there’s indisputably a warming trend if you take the planet as a whole and also observe weather patterns over time – particularly recent time, when the trend is exponential.

Now let’s consider, for want of a better expression, moral belief: one’s established attitudes. Or, more accurately perhaps, simply values: principles or standards of behaviour or outlook, or judgement of what you consider important in life, acquired over the years by experience or social conditioning. An obvious example of that might be whether you consider intrinsic things like ‘quality of life’ (happiness, fulfilment; that sort of thing) more important than materialism – having lots of stuff and far more than you actually need.

Or your personal political inclination, be it left-leaning, conservative (or somewhere on that spectrum) or environmentalist. And strong personal attitudes about specific issues like, say, blood sports or presumed organ donation consent, or alleviating world poverty, where it isn’t really a matter of conventional political inclination. On issues such as these you might take an unswerving position from which you’re unlikely to be dissuaded. You could call that, fair enough, deep-seated belief.

On the other hand, if you are a thoughtful and liberal-minded person, you might over time, in the light of new factors or considerations or changes in societal attitudes, modify or even change your view. It’s allowed. It happens. And it’s okay. But if you are of a more conservative mindset, you might be less inclined to change and advancing age will tend to harden your certainty of your position.

And then there’s belief in one of its commonest definitions: religious faith. In my humble view, belief in a god, any sort of god, perfectly illustrates the faith-versus-scientific-evidence question. It seems oxymoronic to me that you can simultaneously have both. It’s surely a dichotomy: a choice between two mutually exclusive concepts. I fail to see how you can both believe in what is, by any reasonable definition, a supernatural force (albeit one seemingly for good) which has no evidential basis whatsoever, and also accept science. Or to put it another way: if you need a deity, for which there’s absolutely no evidence, in your life, you have to fall back on unquestioning faith. You have to say: well, He just is.

Yes, it’s a beautiful and alluring idea that somewhere, on some mysterious astral plane which mere humans, even scientists, can’t begin to comprehend (which is why there’s no observable evidence), there exists an omniscient, omnipresent, caring being who somehow created earth (only earth, or the rest of the universe too?) and all the people on it.

And that somewhere there’s a wonderful Elysian place called Heaven, or Paradise if you’re Muslim, to which, when we die, our ‘souls’ will be transported to live in eternal bliss: all the countless billions of us who currently inhabit earth, have previously done so and will do so in the future, until our sun (did God make that too, as it’s essential life-support for His earth?), as it gets hotter and hotter, strips earth of its water and renders it uninhabitable (unless Homo sapiens does that first, all by itself).

Clearly, human beings in all cultures have felt the need to ascribe nature and a moral sense to a god-like force for thousands of years, but to still accept, or need, the existence of such a being in this day and age, when so many of the wonders of nature can be satisfactorily explained by science, you do indeed need a hefty dose of religious faith. That’s fine if you want to or need to though. But there’s a perfectly acceptable alternative moral code by which to live that doesn’t require belief in a supernatural deity: humanism. It’s based on compassion, kindness and evidence.

Finally, in definition 2, there’s belief as in trusting or having confidence in someone, or in something that will come to pass or deliver the promised goods. As I alluded to before, an obvious example here is election-campaigning politicians. They ask us to trust them; trust that they will in fact deliver on the mandate for government we give them. Of course, in the real world of pragmatic politics, this often doesn’t happen and we become very cynical, but that’s another story, for another post. It’s usually a better bet to trust good, true friends and loved ones to do the decent thing.

To sum up: some definitions of the word ‘belief’ do present the eternal conundrum. Should we operate on blind faith, believing what we want to believe, or try, thoughtfully, to base our attitudes on the weighing of empirical evidence or logic?

And take, for a moral code, care for the earth and humanity; the common good?

B is for beauty

Beauty n. [mass noun] a combination of qualities, such as shape, colour or form, that pleases the aesthetic senses, especially the sight.

Yes, perhaps especially sight. For nearly all of us, probably, when we think of the concept of beauty, lovely visual imagery springs automatically to mind.

Unless we’re particularly insensitive and beauty does nothing for us, we each have our own favourite images of course. Perhaps it’s a rainbow arching across a pewter sky after rain, framing a verdant, refreshed landscape. Or water tumbling down a wild mountainside from its source, impelled by gravity to its ultimate destination, the sea. Or wild flowers dancing, variegated, in a green Alpine meadow (or tame ones in a sun-spangled summer garden).

Or, in the world of art, beautiful painted images; or arrestingly, sensitively or compassionately photographed ones. Or in the exquisite three-dimensional form of (say) naturalistic Rodin or abstract Barbara Hepworth sculpture; or, on a smaller, handleable scale, ceramic or carved objets d’art that, if you close your eyes, or cannot see anyway, can give you tactile stimulus too.

Yes, I know, I know. I’m skimming very lightly over a vast subject, but to continue: then there’s beauty found in the human form – small, big-eyed children or, equally, the subtle, more profound, beauty in an old lady’s smile. And not forgetting beauty in the animal world either; not only cuddly baby ones but grown-ups too such as graceful antelopes or swans, or magnificent creatures like, say, elephants or eagles.

And then there’s the beauty of sound (as opposed to mere decibel level): the pure liquid tones of an (expertly played) violin or a gorgeously mellow cello; or an electric guitar or even a synthesizer. Or, by contrast again, the sounds of nature: wind soughing in trees; rain falling gently, pitter-patter, on a pavement; waves crashing languidly on a beach.

Or beauty can be evoked rather than directly experienced, through the mental pictures painted by well-chosen words in poetry, written down or recited. Or expertly, perhaps movingly, crafted prose; the reader’s pleasure enhanced when the words are printed in a visually beautiful book (or, okay, I’ll concede, displayed on screen on a well-designed web page).

Speaking of mutually complementary combinations, there’s another of which, as a former typographic designer, I’m particularly fond: that’s fine words executed in beautiful calligraphy or incised monumental lettering. Other combinations might be filmic images enhanced by sympathetic music or ‘still’ photography dovetailed with songs, either sung or rendered in written or typed lyrics.

Yes, beauty comes in many forms.

Is there a neurological aspect to the aesthetic sense in humans? Is there a part of the brain which if absent, under-developed or damaged would deny us the appreciation of beauty? If there is such a region, it seems to produce a highly-developed sensibility in some people, but in others, decidedly not so much. And is this variability a cultural thing or a matter of upbringing, or the particular brain physiology we’ve drawn in the genetic lottery? Is it a case of nature or nurture?

In a paper published in 2011, Canadian researchers Steven Brown and Xiaoqing Gao suggested that there’s no specific part of the brain devoted exclusively to aesthetic response. But what they showed, rather, was that there seemed to be a shared function with areas that have evolved to perceive important things for survival such as whether food is all right to eat (a foul taste would suggest it wasn’t) or the attractiveness and vitality of a potential mate, the coupling with whom would tend to result in vigorous offspring and thus ensure continuance of the species.

Looking at many neuro-imaging studies, they found that the most important area with this seemingly dual function was the anterior insula in the cerebral cortex. This rather surprised them, as that area is known mainly for its registering of disgusting (and therefore dangerous) food and the perception of pain, which likewise signals danger. So why would that area also be involved in appreciating beauty? It didn’t seem to make sense.

They suggested an interesting hypothesis: perhaps the anterior insula evolved first as a defensive mechanism for warning of dangerous or undesirable things and for choosing the desirable, but later evolved (perhaps – and this is just my amateur theory – from visual appraisal of the desirability or otherwise of potential mates) into a means of appreciating higher things like art.

Although, speaking of beauty as sexual attraction, of course, in our modern, mostly fairly liberal society (there are exceptions) most thoughtful, mature and well-adjusted people recognise that superficial beauty is by no means the be-all and end-all in the quest for love (and, generally, the procreation of humankind). That sort of beauty really is only skin deep. We like to think, as adults, that we’re at least as much attracted by good character and things like shared outlook and interests in potential partners. Some studies suggest that that’s particularly so of women choosing men.

Nevertheless, as any teenager knows only too painfully, physical attractiveness (and therefore positive self-image) is only too real an issue. It’s a deeply ingrained, instinctual and ancient concern. And there are a lot of pressures, from Hollywood, social media and peers generally, that insidiously suggest conventional standards of human beauty are necessary for acceptance and happiness. But, however much they trouble many young people nowadays, those pressures are facile. True, intrinsic beauty does of course reside deep down.

Leaving beauty associated with sexual attraction aside, though, just why a higher aesthetic sensibility, just for its own sake, should have developed in humans alone isn’t clear. But whatever the reason, it obviously did. Like religion, the human need for art evolved very early on. Think of the Palaeolithic period and the famous wall paintings of animals in the Lascaux caves in France, reckoned to be around 17,000 years old. They seem to have served no practical survival purpose whatsoever; as far as we can tell, they were done purely for the joy of creativity by the artist and (presumably) the pleasure of the viewer.

I started this post with the premise that beauty is mainly visual. Well, not from a blind person’s point of view, it isn’t. Here to finish are some definitions by blind people from the BuzzFeed website:

‘I have three kids and to me they are all beautiful. I don’t even know what they look like. They’re beautiful on the inside. They’re me.’

‘I think beauty is experience. The smell of warm, baked cookies. The warm breeze against your skin. The feeling of grass beneath your [bare] feet.’

‘I don’t need my eyes to see beauty . . . just imagining what the ocean looks like and what the sky looks like. That’s beauty for me.’

Yes, indeed.

B is for banker

Banker n. A person who manages or owns a bank or group of banks

Like prostitution and entrepreneurialism (although I’m not necessarily conflating the two), banking has been around for a very long time. As has trade. With the advent of agriculture, early hunter-gatherer humans moved on from a subsistence-level means of staying alive to a simple barter system of exchanging surplus agricultural produce or goods for other goods or services they couldn’t produce themselves. That led to more sophisticated means of trade using tokens to represent value, and then coinage.

Along with this there also evolved quick-brained characters with a sharp eye for profit ready to lend money to those currently lacking the wherewithal to buy what they desired or needed – for an appropriate fee, of course. These early capitalists quickly realised that there was good money to be made out of other people’s debt.

Simply dealing in money – banking – is thought to have begun around 2000 BC in Assyria and Sumeria; it was then practised in ancient Greece, followed by the Roman Empire, and subsequently found its way to China and India. But banking as we know it today really took off in medieval Italy, with institutions such as the Medici Bank being established in 1397. The oldest bank in the world still operating is the Banca Monte dei Paschi di Siena. It too was founded in Italy, in 1472, as a ‘mount of piety’: an altruistic sort of pawnbroker, a charity enabling the poor to get cheap credit.

Mind you, banking has sometimes been far from ethical. Remember the Bible story about Jesus, in an uncharacteristic show of violence, throwing the money changers out of the temple? If he were made human flesh today, I don’t think he would take too kindly to the ‘payday’ lenders and other loan sharks we have now. Except perhaps in times of economic calamity, making money out of money (as it were) has usually been highly lucrative (why else, after all, would finance companies buy debt from other finance companies?). A career in banking, money-market trading and financial services generally has oft been many an ambitious young man or woman’s dream.

That’s not to suggest that all banking is necessarily sleazy or corrupt, of course. There are still altruistically motivated banks and other institutions – like the Co-operative Bank (although it’s now owned by an American hedge fund firm), the Ecology Building Society, which lends for building and retrofitting sustainable houses, and non-profit credit unions – all of which avoid the worst excesses of capitalism and try to work for the common good.

But many if not all conventional retail and investment banks make no bones about it: they exist to make handsome profits, financing industry (which is fine if the terms are fair) but also feeding the desire of many people, many consumers (that’s a key word), to buy stuff now, today (and increasingly online, to swell the coffers of the Jeff Bozoses of this world) without deferring the gratification.

Of course, most of us have to go into debt to some extent, usually to buy a home as a better alternative to paying a large chunk of income for ever more to a private landlord. (Few people would be willing or able to do what I did to achieve mortgage-less house ownership: buy a ruin on a small mortgage, do it up, sell it much improved a few years later when the market had risen, thereby making enough to repay the mortgage and buy another ruin, this time outright, for another renovation).

But necessary debt to put a roof over one’s head is one thing, whereas debt (except for buying essentials when you really are too poor to buy outright) for other, relatively frivolous things, is another. According to the Money Charity, total personal debt (including mortgages) in the UK last year was a staggering £1.56 trillion. No wonder the lenders were laughing all the way to the, er, bank. That’s an awful lot of interest being charged and an awful lot of profit.

This little essay isn’t really about the rights or wrongs of debt, though, or whether consumer indebtedness is actually necessary to fuel a capitalist economy. Let’s stay with banks and bankers.

Since those far-off days in medieval and Renaissance Italy, and particularly since the Industrial Revolution, banking has prospered. Because its very raison d’être is to deal in money, mountains of it, bankers only need to skim a very slender layer of cream off the top to make vast corporate and personal profits and fortunes. And, inevitably, with all that money involved, the temptations of shady or reckless practice – not to mention outright corruption – have sometimes been too great.

Of course, greedy bankers aren’t the only villains. There are the speculators – gamblers with other people’s money – of the stock market too, and there have been huge peaks and deep troughs of economic boom and bust, sometimes caused by their selfish and greedy shenanigans. The most devastating failure of the stock market, in its effect on millions of people, was the Wall Street Crash of 1929, which triggered the Great Depression of the 1930s. Stock prices and general wealth would not recover until 1954.

But the next major upheaval could be blamed fairly and squarely on the banking sector, with the financial crisis which began in 2007. It too began in America, with reckless lending of ‘subprime’ mortgages to people who simply couldn’t afford them, followed by excessive risk-taking by banks such as Lehman Brothers, which paid the ultimate price. And we all know what happened next: a global financial collapse and the deepest recession since the Great Depression. For a few anxious hours it looked as though there might be complete global financial breakdown. Because they were considered too important for the economy to fail, many banks were bailed out with eye-wateringly huge amounts of taxpayer money and quantitative easing.

Few of the big beasts of banking, apart from Fred Goodwin, the boss of Royal Bank of Scotland, paid for their recklessness with their jobs (although even he got an extremely generous redundancy package) but many of the minions did.

But the real losers, the ones who always pay the highest price for bungles at the top of industry and high finance, were of course the ordinary folk. In Britain, Labour lost the 2010 general election, having been blamed rather unfairly by the electorate and hypocritically by the Conservatives (who were fully supportive of the Chancellor of the Exchequer’s ‘light touch’ banking regulation when in opposition) for a worldwide economic crash that had been precipitated in America.

And when the resultant Conservative/Liberal Democrat coalition government came in, what did it do? It imposed swingeing austerity measures in an effort to rebalance the books after the enormous cost of the bankers’ folly. In the tenth year since that near-catastrophe, austerity is still with us. Public services, not least the NHS, have been cut to the bone with no end to this ‘belt tightening’ in sight. And the people who most rely on help from the welfare state have suffered disproportionately.

But have the bankers, whose greed and incompetence caused the crisis of 2007/8, felt the pinch? No, of course not. Have their wages stagnated since then? Au contraire. Figures for 2015 showed that more than 4,000 City and senior bank employees in Europe (the vast majority of whom worked in London) were paid, including their lavish bonuses, more than one million euros each – including one fund manager who got a cool 35 million.

So, no matter how much havoc, how much hardship their ‘misjudgements’ (to put it kindly) cause to ordinary people, bankers continue to do very nicely, thank you.

A is for awe

Awe n. [mass noun] a feeling of reverential respect mixed with fear or wonder
v. inspire with wonder
Also:
Awesome adj. Extremely impressive or daunting; inspiring awe
(informal, chiefly North American): extremely good; excellent

Picture this. You’ve stopped at Lipan viewpoint on the south rim of that enormous gash cut by the Colorado River through the vast arid emptiness of Arizona: the Grand Canyon. What’s your overwhelming emotional response? Unless you’re singularly insensitive, it will be one of utter awe, right? You’ll most likely be completely blown away.

There are many places on our beautiful blue planet where Mother Nature so bombards our senses. Others include the world’s highest waterfall, the 979-metre-high Angel Falls in Venezuela, and, towering in the truly spectacular Himalaya range, the tallest mountain on earth: Everest.

Another one, not so much a geographical feature as an extraordinary natural phenomenon, is the polar aurorae: the aurora borealis (Northern Lights) and aurora australis (Southern Lights).

If you like lists, here’s the Touropia website’s top ten: 1) the Sahara Desert, 2) Ha Long Bay (Vietnam), 3) Mount Everest, 4) Antarctica, 5) the Great Barrier Reef, 6) the Grand Canyon, 7) Iguacu Falls (Brazil/Argentina), 8) the Amazon, 9) the Galapagos Islands and 10) the Serengeti. This is just Touropia’s take, of course. Lists by others are also available.

Or what about the animal kingdom of the natural world, and the fear factor? Imagine getting close up and personal with some fearsome creature. Could your potentially dangerous situation be, for example, something like being stranded out on the African veldt without the means of escape in a vehicle, or of self-defence, eyeball to eyeball with a hungry lion? Or finding yourself in hot water, as it were, with a glinty-eyed, jaw-snapping crocodile?

Or, even more terrifyingly, assuming you could time travel, going back seventy million years or so to the Cretaceous period, long before the time of Homo sapiens or indeed pre-human apes, to be dropped – as a cat would offer a hapless captured mouse – at the tiny front feet, and humungous, bone-shattering jaws, of that distant relative of the crocodiles, Tyrannosaurus rex? That would certainly be awe-inspiring (but not in a good way), I’m sure you’ll agree.

And what about humans? Not fearsome ones but those who command (or used to) considerable respect? Those in whose presence, should you find yourself in that fortunate position – or again, able to time travel – you would be rendered almost speechless with reverence?

Of course, this would be a very personal choice. Religious people (and I don’t only mean Christians) might go for a supernatural-being-made-human-flesh such as Jesus Christ; people of other faiths might go for a prophet such as the Buddha or Muhammad. In present times they might choose the Pope. If a politician, it might be Barack Obama if you’re American and a Democrat, or someone like Mrs Thatcher if you’re British and Tory. Or it could be a revered and hugely influential scientist like Albert Einstein, or a world-changing innovator like Johannes Gutenberg or Bill Gates.

I feel another list coming on. Michael H Hart, in his popular book The 100: A Ranking of the Most Influential Persons in History, chose this top ten: 1) Muhammad, 2) Isaac Newton, 3) Jesus Christ, 4) the Buddha, 5) Confucius, 6) St Paul, 7) Ts’ai Lun (the Chinese inventor of paper), 8) Gutenberg (inventor of printing from moveable type), 9) Christopher Columbus, and 10) Einstein. That’s Hart’s ordering of not necessarily the morally ‘best’ people but those who have most influenced the destinies of, and changed the lives of, millions – which is a sort of awesomeness, after all.

And then what about person-made wonders? I’ll spare you yet another list, but here are a few examples. From the ancient world there’s Rome’s colossal Colosseum, a stupendous structure considering that it was built (albeit by unwilling slaves) beginning in around AD 72. With its 80 entrances, it’s still the planet’s largest amphitheatre. Or the achingly beautiful Masjid-e Imam mosque in Isfahan, with its huge, delicately blue dome and exquisite decoration. Or the unbelievable Great Pyramid in Egypt. Not to mention the soaring, awe-inspiring, seemingly Heaven-touching interiors of great Christian churches like Westminster Abbey or Notre-Dame.

Or in modern times, back in the Middle East, the staggering – and staggeringly expensive – sail-shaped Burj Al Arab hotel in Dubai, with its vertigo-inducing (if you stand at the highest level and dare to look down), amazing atrium. It’s the second-tallest hotel in Dubai and third in the world. Or the Skytree broadcasting, restaurant and observation tower in Tokyo. At 634 metres high, it’s the tallest structure in Japan, the second-tallest in the world (only the 829.8-metre-high Burj Khalifa in Dubai beats it) and the world’s tallest tower.

These are all wondrous and inspiring facets of nature and creations of humankind – which is itself sometimes pretty impressive too – but nowadays the adjective ‘awesome’ has been purloined, especially (and, as so often, originally) by the American young, to describe anything or anyone even slightly remarkable. It’s conferred generously and automatically, as if the user can’t think of a more fitting adjective, without any real degree of impressiveness being required.

It’s used almost oxymoronically, no matter how comparatively slight the impressiveness. For example: ‘Yeah, I think the band’s lead singer is awesome.’ Or, an illustration typical of social media: ‘Hey, your hair is awesome, Cindy’. Well, no, not really. ‘Nice’, perhaps. Even ‘beautiful’ or ‘handsome’. But not ‘awesome’!

Or it can be, and often is, used as a substitute for the quaint, old-fashioned response ‘Yes please’, as when the young recipient of a birthday gift, asked whether they like it, replies: ‘Yeah, awesome’.

To some extent it’s supplanting the idiom ‘cool’ – rendering it, in effect, un-cool. Which to a geriatric like me who was brought up in the 1950s on Elvis, Buddy Holly, the Beatles etc, is amusingly ironic, as ‘cool’ is far from being a modern term; it was in use then and goes back even further to the American jazz circles of the thirties.

And furthermore, ‘awesome’ is used so universally, with such catholic abandon, encompassing and qualifying all manner of subjects – even cutesy (but not ‘awesome’!) baby animal pictures on Facebook – that it has become stripped of all force of meaning. So if strong nouns or descriptors like ‘awe’ and ‘awesome’ are applied so indiscriminately and lazily to even the most unimpressive objects, what is there left to use to describe places, creatures, people or person-made environments that really do warrant a superlative?

I expect I’m just being a grumbling old fogey, but sometimes I despair for the inexorable degradation and decline of the English language!

A is for austerity

Austerity n. 1 sternness or severity of manner or attitude
– plainness and simplicity in appearance
2 difficult economic conditions created by government measures to reduce
public expenditure

Ever since the financial crash of 2008, the word ‘austerity’ has really only meant one thing: Hard Times. For the poorest and most disadvantaged members of society, those who have to rely most on help from the state, anyway.

But before I get onto that, let’s consider the Oxford Dictionary’s first definition: sternness or severity of manner or attitude. You might say it’s the antonym of gentleness, warmth and generosity of character: all the qualities most of us consider desirable. We probably know austere, uncaring people in real life and there are many examples of the cold-hearted in the fictional world of literature too.

Dickens’s Ebenezer Scrooge probably springs to mind, or Miss Havisham in Great Expectations. In the latter case, the embittered, man-hating, jilted-on-her-wedding-day lady (and who can blame her, really?) spreads the contagion of her bleak soul to the person nearest to her: her adopted daughter Estella. Miss H is a wonderfully drawn, eccentric monster who gives the young protagonist Pip a really hard time and encourages Estella to do so as well.

Although that hadn’t been her original intention, we’re told. She’d initially wanted to have and love a surrogate daughter in place of the natural one she could no longer have, as she’d given up any hope of ever being loved by a man again. Her intention was to protect Estella from the fate she’d suffered herself. But instead she engenders in the girl the same hatred of anything in trousers that she now harbours herself. As she bitterly confesses years later to the adult Pip, who’s become hopelessly smitten by the beautiful, cruel, manipulative but unattainable Estella: ‘I stole her heart away and put ice in its place.’

There’s a different sort of coolness in the other aspect of austerity in the dictionary’s first definition: visual plainness and simplicity. Or as we say nowadays: minimalism. This sort of plainness evokes all things Scandinavian: plain yet elegant functionality echoing the cold austere northern climate and reflecting also the Puritanism of northern lands. Or it brings to mind the simple homespun orderliness of North American Shakers, with their tidily-hung-on-the-wall chairs – ‘a place for everything and everything in its place.’

Or again, going to the extreme, that ultimate expression of austere living: the cloistered existence of monks and nuns; a seemingly bleak life of total devotion to faith and surrendered individuality; one’s personal physical space no more than a simple cell with probably no adornment other than a crucifix on the plain wall.

Contrast that with the other religious extreme: the glitzy opulence of Catholic southern climes. The sensory extravagance of the Vatican City, or Antoni Gaudí’s part-gothic, part-art nouveau but completely over-the-top confection: the basilica of the Sagrada Família in Barcelona. Begun in 1882, it’s so complex and elaborate that it’s still unfinished – but it’s optimistically hoped to be completed by 2026, the centenary of his death. What an extraordinarily protracted labour of love!

Now let’s return to the economic poverty of definition number two: government-decreed austerity. If you’re in the bottom stratum of society, working for, or even less than, the minimum wage (as opposed to a decent living wage), or for whatever reason not working at all but relying on benefits, you won’t be concerned about dictionary definitions of austerity in art, literature, architecture, Scandinavian design or religion. You’ll simply, in the British context, be trying (to quote from a certain post-general election Downing Street speech) to ‘just about manage.’

For many people in the last nine years, that hasn’t been easy. The reckless banker-caused crash of 2008 was far from being a short-lived recession, a normal trough in the economic cycle, from which the economy would bounce back. Many people, those at the very bottom of the economic pile, have suffered a real cut in an already low income. People are driven desperately to food banks for good reason, however the government might try to whitewash the situation.

Other workers slightly less badly-off, in socially important fields like healthcare, social care and other emergency services (and public sector workers generally), have had a near-cap on wage rises, which translates into an actual drop when adjusted for inflation. The government’s Office for National Statistics says that incomes generally are about £1000 higher now than in 2008, but proportionally speaking that’s a very modest rise over nine years. And over the last year, we’re told, median incomes have risen by 2.1%.

That statistic (given by the government, remember) might sound reasonably okay, but it’s an average. Some people will have done better than that; others will have seen little or no increase – or even a drop – when it’s set against inflation which, since the loss in value of the pound following Brexit, is now running at 3%. Some economists contend that most people, particularly the very poorest, have yet to regain in real terms the disposable income they had in 2008.

Pensioners who, like me, rely on the state pension for at least half of their income and (also like me) own their property and pay no rent have done comparatively well. Thanks to the ‘triple lock’ guarantee their incomes have outstripped inflation, so they’ve seen a modest but real rise, although from a very low base.

The only real winners amongst ordinary folk have been those on relatively higher incomes, because they’re less affected by things like inflation of the cost of food or rising rents; and the upwardly mobile: those who can still ‘better themselves’ by moving to more lucrative employment. And then of course there are the very richest, like television presenters and top sportspeople, who can often find ways of dodging tax, not to mention the biggest winners of all, whose reckless greed caused the problem in the first place: bankers. They have no shame; having been bailed out by the taxpayer, they’re now back to paying themselves obscenely high bonuses and ‘remuneration’ (nothing so vulgar or plebeian as ‘wage’ or ‘salary’, you notice). Equally shameless are the CEOs of top companies whose outrageous incomes, gifted them by their mutually-back-scratching friends on ‘remuneration committees’, have also rocketed since the crash.

But for most people, the oft-repeated government mantra and justification that the country can’t live beyond its means and that only by enduring the pain of austerity can ‘we’ balance the books and stimulate growth to increase the tax take to pay for better public services, particularly the crisis-ridden NHS, must ring pretty hollow. The poorest, anyway (and quite rightly), don’t buy that we must have ever more swingeing cuts to the welfare state to help us attain those promised sunny uplands.

Yes, it’s true that no country can have decent public services by going unsustainably ever deeper into debt; if we want these desirable things we have to pay for them. The alternative, usually espoused by the right, is simply to pay less tax and keep cutting services to the bone, so that the worst-off, those who most rely on them, suffer. There’s no free lunch. It’s about time Britain (and America and many other countries too) had a serious and honest debate about a fair, genuinely progressive taxation system, with the burden of paying for a civilised society falling mainly on those with the broadest shoulders.

My income isn’t all that great, but I for one would be perfectly willing to pay a little more tax rather than have the burden constantly fall on the poor.

A is for arguing

Argue v. 1 give reasons or cite evidence in support of an idea, or theory, typically with the aim of persuading others to share one’s view
2 exchange or express diverging or opposite views, typically in a heated or angry way

It all comes down to personality, I suppose. Some people hold very strong opinions and instinctively and vigorously express them, whether face-to-face or online, in the company of others who just as firmly disagree.

Some people argue passionately for their beliefs and get het up; others simply enjoy a verbal duel and can argue the toss, trying to convince the opponent of the rightness of their view without ever losing their cool.

Others again, those who generally fall into the ‘passionate’ rather than ‘calm’ debater category, prefer to keep their views to themselves – or at least only discuss things with other like-minded people with whom they generally agree – rather than get into a heated argument with someone with diametrically opposed opinions. I have to confess I’m one of those.

The original past masters of arguing-as-in-debating – offering calm and intelligent reasoning and evidence in order to persuade others of their view – were of course philosophers such as Socrates and his pupil Plato in ancient Greece. In fact, one form of argumentative discussion inspired by Socrates is known, amongst other terms, as the Socratic method, or Socratic debate. Essentially, it’s based on posing and answering questions to stimulate thinking in a critical manner – as opposed to receiving your views spoon-fed by, for example, the populist tabloid press.

It draws out creative ideas, and questions ingrained presumptions and prejudices. It’s a method of weak hypothesis elimination, by robust reasoning and scrutiny, with the aim of reaching better ‘truth’, if you like. It’s a way of questing using logic and indisputable facts as opposed to blind entrenched belief, which is what present-day politics is supposed to be about but frequently isn’t.

You only have (if you can face it) to sit through fifteen minutes of BBC Parliament on telly to realise how far away from the Athenian ideal today’s politics has drifted. Yes, admittedly, many MPs do speak with passion, logic and insight, but parliamentary sessions are rarely discussions or debates in the proper sense of the word, with issues actually being discussed in an interactive way, but simply a series of prepared speeches often delivered to a near-empty chamber.

You have to wonder how many MPs are actually persuaded away from their original view by the strength of a better argument, especially if it comes from the other side, and even more especially if they’re being whipped to vote loyally with their party. Free voting based on conscience, persuasion or reason seems pretty rare. And as for the disgraceful filibustering tactics sometimes used to thwart private members’ bills the government wants to strangle at birth: they’re a complete perversion of democracy.

The first major debating fora, as far as access by the general public (as opposed to parliament or the rarefied intellectual heights of Oxbridge Union debates) was concerned, came with radio and television. Think of discussion shows like the BBC’s Question Time. It’s a strange cocktail really: a mix of sometimes serious intellectual debate; politicians usually trotting out their predictable party line; and often heated argument, augmented by sometimes thoughtful but sometimes populist or simplistic contributions from the audience. Perhaps it’s a valid exercise in democracy; perhaps not so much.

But now, with the emergence of platforms such as Facebook and Twitter, traditional broadcast media have been overtaken as arenas of debate by social media. This ‘people’s media’ is certainly democratic, at least in that literally anyone with a computer or smartphone can put in their twopenn’orth daily, on any current issue, and they do. I certainly do. Again, there’s the spectrum between calm reasoned debate on one end and sometimes not-very-well articulated, or indeed spelled, capitalised or punctuated (but then good education shouldn’t be a qualification for the right to express a view, of course) angry commentary on the other.

Well, that’s not the other end. Beyond, there’s downright abusiveness, sometimes violently so, as in vile threats of killing, rape or generally wishing for death, like medievally invoking a curse, which is totally unacceptable not to say illegal.

Personally, although an avid user, I hate getting into arguments on Facebook and avoid them like the plague. It’s just a character thing; I’m just not jaw-juttingly confrontational, more the seeing-both-sides compromising peacemaker. Far from enjoying a good argument or trying to forcefully win others with different opinions around to my point of view, I’d rather avoid the ill-feeling that can so easily arise.

I prefer to put the world to rights in generally agreeing with people. Yes, I know, that’s the echo chamber effect. It’s a bit of a cop-out, but it’s psychologically more comfortable to agree than violently disagree. But then we all tend to gravitate towards others who are like-minded rather than of a different persuasion. If I read a comment by a Facebook friend with which I strongly disagree, I simply keep quiet. The same goes for non-friends. I might make a comment criticising a general opinion being expressed on a thread if, for example, it’s showing xenophobia or intolerance, but I won’t have a go at any specific person. And I won’t get involved, troll-like, and butt into threads the drift of which I strongly disagree with. I won’t get into verbal sparring or insulting likely to cause offence or produce an angry reaction; life’s too short. Particularly at my age.

And then there’s arguing with family. I can’t conclude without a mention of the subject that’s been exercising and dividing – sometimes bitterly – many British people since June 2016: Brexit. It’s been such an opinion-polarising matter as far as most people (the ones who bothered to turn out and vote anyway) are concerned that it must have caused many bitter arguments in families where not everyone voted the same way. In my case I was for remaining – and found myself feeling surprisingly strongly about the result. I’d assumed that, because we generally agree politically in my family, all members would vote as I did. But one didn’t, voting with equal conviction for leaving.

I found that actually quite distressing, and after some exchanges on Facebook that for me at least weren’t entirely good-tempered, I simply had to block any more input from the other person. It wasn’t worth risking a good relationship, and now the subject is a no-go area. It could have been worse though. I can’t imagine how difficult it would be if Brexit were likely to have a significant personal impact and you had a spouse or partner, or lived with someone else, with a differing view. Life then could be extremely fraught.

That’s the thing about arguing – at least, if you feel passionately about something and are unlikely to change or even modify your view. Like a person of religious faith, you feel you’re enlightened; privy to obvious and absolute truth; that anyone with a different view is a deluded fool. It can be difficult to acknowledge that a verbal opponent feels exactly the same way, feels just as strongly.

And therein lies the conundrum of liberalism. Although probably holding strong opinions, a liberal thinker by definition isn’t absolutist; he or she tries not to be dogmatic; tries to be civilised and respectfully see the other point of view, the bigger picture.

It can be hard sometimes, though!

A is for ambition

Ambition [mass noun] desire and determination to achieve success.

‘Getting on’, my old mum used to call it when referring in admiring, awed tones to anyone from the working class who had risen above their station in life and, if not become dazzlingly rich, had at any rate moved a few notches up the social scale to the middle class.

Ambitiousness is generally synonymous with acquiring a much larger income than those souls in the general population who have less get-up-and-go, less drive, and are usually content with their lot. The driven people we usually think of in these terms, who have ‘succeeded’ beyond most people’s wildest dreams, are often entrepreneurs or industrial magnates, and most often American.

Think of the world’s two wealthiest men jostling for the top spot: Bill Gates and, currently numero uno, Jeff Bezos. Not to mention that other thrusting young whippersnapper, Mark Zuckerberg. Although even these unimaginably wealthy men will probably, in this increasingly unequal world, soon be surpassed by the first trillionaires – and they won’t necessarily come from America.

For Gates, Bezos and Zuckerberg it wasn’t exactly a rags-to-riches story; coming from middle class backgrounds, none of them had to claw their way up out of poverty or other disadvantage. Neither did the offspring of one Donald Trump. You might say that the truly ambitious are those who do.

Like, for example, John D Rockefeller. He certainly wasn’t born with the proverbial silver spoon in his mouth. Far from it; born in 1839, he grew up in a humble little house in New York state, one of six children of a snake-oil-selling con-artist vagabond and a devout Baptist mother. When he was a boy his family moved to Ohio. A studious youngster, he did a ten-week bookkeeping course before taking his first job as an assistant bookkeeper, which he threw himself into with enthusiasm. He’s reported to have said that his two ambitions were to live to a hundred and to make one hundred thousand dollars (equivalent to over 2.5 million today).

He didn’t quite achieve his first ambition – he died aged 97 – but he certainly did the second, many times over. By the age of twenty he had formed a business partnership, before buying his partners out and astutely entering the burgeoning oil refining business with others. He and his new partners were canny; they could see the growing importance of oil and the products for transport and industry that could be made from it, and they set about buying up other local oil refineries.

But he was still restlessly ambitious. In 1870 he dissolved the partnership to create Standard Oil of Ohio, and now he went from strength to strength. Ruthlessly competitive, he saw his wealth balloon, and by 1882 the virtually monopolistic company was the richest and most powerful in the world; Rockefeller would go on to become America’s first billionaire. At its height, in 1913, his worth was estimated at $392 billion in 2013 terms, equivalent to nearly 2% of the entire American economy. Good going, for one who started out a humble bookkeeper.

He didn’t hug all his wealth to himself though; a huge chunk of it was put to good philanthropic use. He created foundations to support medicine, scientific research and education, including the University of Chicago and the Rockefeller University.

Not all ambitious people necessarily start from the desire to amass shed-loads of money. Many in the creative arts, entertainment or sport have become hugely rich through a combination of intrinsic talent, hard work and in some cases, a fair amount of good luck. Here’s just one example from this category, another John: John Lennon.

Lennon wasn’t given a helping hand by privilege either. Born in Liverpool in 1940 into what would become a broken home after his parents split, he was raised by his Aunt Mimi, although his mother Julia did maintain contact. She encouraged his love of music, teaching him banjo and buying him his first guitar; his uncle had earlier given him his first instrument, a mouth organ.

So young John Lennon was on his way to fame and fortune, in spite of doing badly at school (apart from art), one school report opining that he was ‘Certainly on the road to failure.’ Whoever wrote that must have later eaten their words. His first music-making, as a fifteen-year-old, was in skiffle, and in 1957 he formed his first group, the Quarrymen, which would morph into the Beatles in 1960. Meanwhile he went to Liverpool College of Art but wasn’t successful there, leaving in some disgrace in his final year.

But he wasn’t bothered; music was his thing. The Beatles, now consisting of Lennon, Paul McCartney, George Harrison, Stuart Sutcliffe and drummer Pete Best, had their first outing in Hamburg in the early sixties, after which they returned to England, leaving Sutcliffe behind and replacing Best with Ringo Starr. And the rest, as they say, is history. Their success was phenomenal throughout the sixties, until Lennon left in 1969 and the band folded acrimoniously.

Lennon (and indeed the others) was far from finished though. Having divorced his first wife Cynthia, he married Yoko Ono and moved to New York in 1971, to have a further nine years of huge success, making eleven more post-Beatles albums, five of them with Yoko, as well as becoming involved in protest and radical politics. Tragically, aged forty, his life was cut brutally short by Mark Chapman in 1980. Today his estate is worth some 800 million dollars. Again, not bad, for a rebellious, musically-inclined young lad from Liverpool.

Then again, ambitiousness is often a quieter, more modest affair, with success being simply a matter of achieving a fulfilling and happy life doing something you love. So here’s an example of that, the story of a third John: Yours Truly. Now I don’t for a moment compare myself with my foregoing namesakes; I simply contrast myself with them.

I, too, certainly wasn’t born into any sort of privilege. My parents were working class; few people in the family rose higher than that back in the post-WW2 days when the class system in Britain was much more rigid. My dad was a factory worker turned house painter and my mum a housewife and charwoman (house cleaner). My education was, well, basic, at a non-academic school where the boys were not expected to be anything grander than some sort of artisan and the girls a shop assistant or office minion.

Consequently, aged fifteen, I found myself apprenticed to a printer, with a pedestrian career as a compositor (typesetter) mapped out ahead. But I wanted more than that and asked to be allowed to do ‘day release’: one-day-a-week technical education at Leicester College of Art. This led to a full-time course in graphic design at the same establishment and a career in that for twenty-six years. I hadn’t become fabulously successful in terms of earning power, but I’d risen above the early minimal expectation. I’d become (lower) Middle Class.

And then in 1991 I abruptly changed course, leaving the drawing board behind and moving to Wales to do the thing I really loved: renovating old houses. I did that for eleven years, not earning a great deal of money but loving every minute of it, and then in the final five years of working life devoting my building skills to landscape gardening. After retiring I took another plunge into the unknown, into writing, and produced six novels. None were particularly successful and I earned little, but I really enjoyed writing them.

So although I never earned big bucks, I realised three ambitions. Well, four if you count achieving life-satisfaction, fulfilment and happiness.

In my terms at least, I ‘got on’.