
Monthly Archives: February 2013

If you find yourself in Amsterdam at some point before April and need a break from the freezing wind or gliding bicycles, be sure to visit the Stedelijk Museum. A three-minute walk from the much-celebrated Rijksmuseum, the Stedelijk is the curious setting that houses the late American artist Mike Kelley’s retrospective exhibition. Contained within the price of a general museum admission ticket (around 7 euros) is access to several rooms and galleries devoted to Kelley’s life lived through art.

Mike Kelley’s exhibition presents an audio and visual experience accessible to many, with his unique and bizarre take on America and its values, culture, ideology, ‘constructions in all their messy contradictions’ and crude representations of pop. Nothing is covered up from Kelley’s inspecting eye, his dark and drastic sense of humour. An artist who freely interplayed with a vast array of forms, the Kelley legacy on show here features both the age-old practice of nude drawing and modern sculpture and light projection, as well as television constructions broadcasting a loop of unsettling YouTube clips, itself interspersed with minimalist dots and clinical beeps. It seems Kelley was a master of portraying 20th-century popular culture in art. Who else would burst the mythic bubble surrounding the question of suppressed memory in his work (‘People tend to think about these works in a very generic way as, somehow, being about childhood. That was not my intent’[1]) but Kelley himself? Critics have made comparisons to the 1960s avant-garde, but if Mike Kelley was the Frank Zappa of the art world, then their similarities end where deep emotion and personal anguish are concerned; the serious decorum of authenticity, that of the oblique creator, is present throughout the show.

Sadly, following a bout of depression, Kelley committed suicide in January 2012 at the age of 57. This has enhanced his status as a troubled figure (as it undoubtedly will continue to do), but the work Kelley left us was always slightly alarming and at the very least interesting (I didn’t know about the nature of his tragic demise at the time of visiting). Whilst security will have to hear the sounds of feminine shrieks and indefinable circular clangs on a daily basis for the next couple of months, it is slightly irksome to know that this distinctive collection will never be added to, enhanced or shown in a new capacity. When cruising around the gallery in the Stedelijk, I noticed that some of the pieces on parade consisted solely of words on a page, affirming any suspicion that writing can be a form of art too. Where, though, is the line drawn, so to speak, between literature, or simply empty words, and art itself?

Words and the practice of using them can only ever be conservative, in that one is restricted by the ‘box’ of a word’s meaning and the limitations of actual speech. Art by its nature rejects this limitation, so where words fail, art can often suffice, permeating those depths of the soul that letters can never reach. But I’m convinced these magisteria can and do overlap, as I’m sure Kelley was; the written piece may not be simply a craft. Who, taking an interest in the practice, has failed to be affected by a high-calibre essay or a timeless work of fiction in the same way that they are by music or painting? Kelley himself trod the fine line between the two positions, and between the roles of spectator and partaker.

As I was backtracking through Kelley’s life in words, I came upon an interesting quote about critics in a Guardian interview: “Artist and critic are dependent on each other but have fundamentally different social positions and world views. As the story goes, the artist is uneducated but has a kind of innate gift for visual expression, which the educated and socialised critic must decode for the general population.”[2]

Kelley himself attained the unique position of being both artist and critic, releasing a book of art criticism, ‘Foul Perfection: Essays and Criticism’, with John Welchman in 2003. What’s more, he could remove himself further from both positions to gain perspective on the whole ordeal itself (hence the above quote); Kelley undermines his own words through his actions. Historically, the two positions are not as separate as is made out. The poet and writer John Betjeman divided his life’s work between both occupations: a profound lover of the English countryside and of Anglican churches, you would never be sure whether he was about to write a verse of accolade for the country in its days of old, or a structured critique, albeit one full of adoration. It also has a strange effect on the observer.

In his general idiolect Betjeman spoke rather in the manner of the poet, an odd halfway position between general conversation and metaphorical interjection; videos of him speaking show him gazing up in slow candour, almost as though he were picking the words from some noble encyclopaedia in the sky. In an interview conducted by Betjeman for Down Cemetery Road[3] (1964), Philip Larkin makes the confession that “really one agrees with them that what one writes is based so much on the kind of person one is”. Larkin too was a critic, particularly of early jazz records and modernist literature, and his polemical prose is collected in the fine publication ‘Required Writing’, though he asserts that he didn’t much enjoy this hack work.

Is this split of roles, this cognitive dissonance, at all praiseworthy? It is certain that one is not required to be detached from the world of creativity in order to assess it. It can produce some odd or even negative results; apparently Martin Amis’s ‘Koba the Dread’, which deals with Stalinism, fails boorishly precisely where the writer neglects his familiar output of fiction. If the writer is painting with words, then Kelley’s own words must be wrong. The counter-examples show that even if the results are disingenuous or quirky (or brilliant), the two worlds do collide and often produce a lovely light in return.


We tried to find an obsession

Blue sky walls, rocking cots,

Men of good conversation

Or stories of the tunnels that run under town

 

I was busy

Writing Bukowski’s dreams by lamplight

You are here

Evenings turn to beery bar, authentic night

That word shouldn’t exist

Fart chamber bus, methane, Co2, Suffolk Orchestra journey, goo

I grow up and brood, like most men halfway to sensitive

 

Give me your obsession

Whilst I’m forcing in and out,

Growing and brooding.

I picture yolk-coloured flags

And watch the guy follow me through the door,

Which for some reason I do not hold.

Fifty years have passed almost to the day since Sylvia Plath was writing the last of her journals. Thoughts, feelings, stories and personal confessions were committed to paper in the last weeks of her life, only to be destroyed by her late husband Ted Hughes, in keeping with his view of forgetfulness as ‘an essential part of survival’[1]. It will bring no surprise to the reader to learn that he himself was a poet, the artist lovers famously entwined in books and separated by their passion.

Plath’s suicide at the age of thirty in 1963 secured her reputation as the archetypal modern ‘suicide writer’, a hero of deep prose for the younger generation. So what is it of the enigma surrounding Plath which remains? A quick glance at the history of literature shows that she was hardly the first ‘troubled’ female author to gain such a reputation. Virginia Woolf had preceded Plath by about 35 years and was twice her age when she finally killed herself in 1941, though not before declaring that “A woman must have money and a room of her own if she is to write fiction.” Plath built upon this affirmation with her tale of the conflicting desires inherent in the young writer in The Bell Jar, published a month before her death in 1963, sometime ‘Between the end of the Chatterley ban and the Beatles’ first LP’ (if we are to take Larkin’s account of the year).

The Bell Jar, arguably Plath’s most acclaimed work and the only novel she actually completed, is constructed in remarkably readable prose, with each turn of the page seeming more lucid than the last. Plath drew on her own youthful experiences for the novel, and the main protagonist is undeniably her own; Plath, like her doppelganger Esther Greenwood, lost her father at the tender age of eight and was something of a wandering nomad, securing a guest editorship at Mademoiselle in New York City during her college years. Plath moved around different parts of Massachusetts throughout her childhood. Born to an Austrian mother and a German father, she seemed to carry a sense of Holocaust guilt and a minor preoccupation with Jewishness, mentioned several times in the novel and in some of her seminal poems in Ariel (1965). In fact Plath’s father, Otto, shared his first name with the tragic Anne Frank’s father, a name that might ring odd in the ears of the modern-day observer. Plath moved to Cambridge, England to study, then to Devon, finally settling in London and burying herself in the English winters.

One factor which makes The Bell Jar and Plath’s work in general widely accessible is her ability to relay both the obscure aspects of a life lived in capitalism and the more general, the universal. An example of this realism is contained on page 53, where Esther describes the predicament of having wished she’d said something different in reply to a comment or jibe: ‘These conversations I had in my mind usually repeated the beginnings of conversations I’d had with Buddy, only they finished with me answering him back quite sharply, instead of just sitting around and saying “I guess so”.’ The metaphors employed by Plath for casual observations are succinct and mentally pleasing: ‘My secret hope of spending the afternoon alone in Central Park died in the glass egg-beater of Ladies’ Day’s revolving doors’, raising the question of why Larkin (a contemporary poet, though slightly older) never conjured such an image, considering his inevitable daily entry into Hull University’s Brynmor Jones Library.

Plath’s prose reads well on the page and gives one the impression that she barely had to try once pen had hit paper, an achievement most writers will understand to be harder than it looks. Her writing contains mountains of clarity in the manner of George Orwell, though she leaves Orwell behind in her sometimes naive worldly observations and attitudes surrounding mental illness, the charming nihilism for which she is often famous. Plath is rejected for this too often by those who feel fit to be critics; troubled Manic Street Preacher Richey Edwards even had the audacity to proclaim ‘I spat out Plath and Pinter’ in 1994, though he was undoubtedly influenced by her literary clout. One must keep in mind that at the time of writing, manic-depressive heroes were in short supply and the approach of emptying the contents of the mind, unjudged, was relatively new in the pre-sexual-revolution world of the early 1960s, with its dark split between the social norm and personal reality.

A useful but humorous comparison may be drawn between The Bell Jar and Jean-Paul Sartre’s Nausea (1938), in that both contain characters dealing with the oppressive sensation of nausea. Sartre’s existential ‘masterpiece’ draws its strength from the sense of displacement one feels through the character of the lone wanderer, the individual who is incurably sick at the comprehension of nature’s reality and the movement away from a solipsism previously held (an idea which had historically plagued so much philosophical thought). Esther too becomes sick in the social arena of hotels and cinemas. One presumes the reason is a similar sort of existential crisis, but we soon learn that flamboyant Esther had simply consumed too much bad crabmeat at someone else’s expense, as had many other girls in her company. The resulting impression (one that is welcomed) is that she too is a material being who is not to be singled out as any special exception, which relates in a curious way to the reader, who might have noticed such a distinction between fiction and reality of their own accord.

Plath’s poetry, for which she was most well known in her lifetime, is notable too for its fractured imagery and the ability to incorporate life experience into a simple turn of phrase. She returns to her upbringing and family in what is arguably her most famous poem (published posthumously) ‘Daddy’, with the displacing opener:

‘You do not do, you do not do,

Any more, black shoe’.

Reading the stanzas for a BBC programme in October 1962, following a creative spurt in which she produced around 50 poems in just a few months, Plath has a remarkable tone of voice, one which recalls Judy Garland’s Dorothy in its directness and sweetness. This clashes irresistibly with the ghoulish content that permeates her work, establishing her as the first mentally ill housewife who wrote poetry for the masses. Plath fell forever in love with Ted Hughes and, following their fateful split, was consigned to a miserable last few months looking after two children on her own in a cold London flat. And what was left of Plath’s riddled conscience?

‘And I

Am the arrow,

The dew that flies

Suicidal, at one with the drive

Into the red

Eye, the cauldron of morning.’[2]

If there is any social statement which oversees Plath’s work, it is surely a critique of the American dream, the Western dream: the fact that even pretty suburban girls like herself can fall far from grace. This is the real face of depression, a girl with the need to die. Plath wanted badly to cast out the monster of mental illness and portray it in the English language.

Half a century since her death, Plath is remembered for achieving this above all things, and The Bell Jar is a novel I’d recommend to anybody in pursuit of either education or pleasure. This type of writing had been done before and has been done since; only Plath did it with such a distinctive style.


The excitement

Irresponsible

Of the closing ambulance door

Made me view the leaf-scattered pavement

And the cemented semi-detached,

The youth with a football

In a whole other shade.

Days are an endless horizon for most,

A curtain between acts

Where we never see its final fall

 

Now, I’m a stagehand

Aren’t we all?

Sweeping up the clutter,

To make the floor

Shine once more

Shine once more

For the kid with the ball

Shrouded by monarchy

Like us all.

 

That gin took the vigour out of me

All I have left is shoddy freedom

Driven to the land

Of those bound to fail

As they shave me

And the bridge takes the battering of my weight, yet again

Girls rouse from the bed-drawers,

Consultants laugh

At the futility of the crushed pigeon

Shrouded by fossil-dew,

They pick me up.

 

And we’re taking the stairs

Whilst they’re sweeping the clutter

The leaves they scatter,

As the shadow grows long.

In recent years the push-back against those who argue against religious faith in public arenas (people commonly classed as the ‘new atheists’) has become clouded by what I would call a pseudo-intellectual way of thinking, whereby the person arguing on behalf of faith all too often turns the tables on the sceptic and equates their rational, scientific beliefs with their own faith in the gods and the heavens. It is not uncommon to hear such people say things like ‘trust in science involves just as much faith and susceptibility to dogma as religion’; such statements are not made only by undergraduates. In the last couple of months, two respectably written articles based around the topic of religion, faith, evidence and reason have appeared in the New Statesman, both of which, I argue, are essentially guilty of what I’ve just described: one is titled ‘Giant Leaps for Mankind’[1] by John Gray; the other is ‘The Goebbels of the English Language’[2] by Alan Moore.

In his review of Brian Leiter’s book ‘Why Tolerate Religion?’, John Gray discusses the difficulty of defining religious belief: ‘there is nothing particularly irrational or otherwise lacking in religious belief. After all, what counts as a religious belief?’. Defining a nuanced idea of religious belief may be no easy task, but we can at least form an idea of some of its necessary conditions if we are to get anywhere in the English language: religion must involve some belief in a supernatural creator of the world and/or universe. If this is not so, the belief does not accord with any recognisable or traditional interpretation of the original three monotheisms, the ones with which I’m sure Gray is primarily preoccupied. Gray then goes on, rather strangely and irrelevantly, to conflate the motivations behind certain acts and events in history with acts committed out of religious motivation. For instance, he says that the horrors of Soviet Russia imply that ‘faith’ claims about the workings of communism are flawed, and that the 2003 American intervention in Iraq was a secular, ‘faith’-driven adventure, while also invoking the ‘hunger for oil’ argument. But surely there either was an evidential reason to go into Iraq or there wasn’t, regardless of whether it was the right moral decision; Gray wants to affirm both at once, and in addition seems greatly confused about what we might term the ‘a-religious’ faith that supposedly motivated it. These arguments have nothing to do with what secularism means in the philosophical or intellectual sense, and Gray is determined not to acknowledge that some ‘faith’ is more justified than others. This may be because he doesn’t believe it to be true. But the point is elementary: the faith I have that I shall be nourished by my lunch today carries far more merit than the faith that an overseeing, all-powerful spaghetti monster awaits my death so that I can pass into heaven (just for example). So there are different kinds of faith, and they can be judged on their weight and merits on a case-by-case basis.

Is Gray seriously claiming that belief in a God who created the world and everything in it, who observes our earthly movements and who judges us upon our death (for sins which were brought upon us without our having any say in the matter), involves the same level of rationality or faith as the study of empirical, observable evidence to make judgements and decisions in the here and now? Gray, to me, rather condescends to the layman in placing what are often absurd religious claims on a par with complex but reliable scientific ones (that is to say, claims arrived at through a reliable method). Aside from annoying this ‘militant’, ‘new’ atheist, mainly by employing the facile oxymoron in the first place (how can one be ‘militant’ in one’s unbelief of something? What counts as religious militancy and what counts as atheistic militancy are considerably different things in public terminology), Gray never actually explains how and why ‘most of our beliefs are always going to be unwarranted’, one of the main failures of the article.

This leads me on to the second article I mentioned, by Alan Moore. The subtitle of Moore’s piece is ‘We cannot state conclusively that anything is true’, which is a fairly accurate summary of its theme and intentions. His main beef with the concept of evidence seems to be that its validity relies on, well, evidence. This appears at first to be true – such a proclamation is indeed self-evident and in a sense grants itself – but in pragmatic, day-to-day terms the concept is not so circular. We could not live without evidence. We need it to help solve crimes, create life-saving medicines and conduct scientific experiments. And yet Moore seems to define the concept of evidence in strange, anthropomorphic terms, as though it were an individual event or a quantifiable foe: ‘A glance at evidence’s back-story reveals a seemingly impeccable and spotless record sheet…’. What? Anything in the world can be evidence; literally anything. Precisely what is he pointing to when he says ‘evidence’s back-story’? Is it evidence for things he doesn’t like?

Moore is within his rights to make the distinction between ‘evidence’ and ‘proof’ (though the former often constitutes the latter), because proof can be had without evidence. But when Moore invokes the philosopher Karl Popper’s theory of falsifiability, he commits a category error. It is certainly true that nothing can conclusively be proven to be true, for we would need infinite time and ability to assess it all, and that the principle of falsifiability – the idea that we can at most demonstrate that a hypothesis has not yet been falsified – is the best way to go about conducting scientific enquiry. However, religion is primarily a scientific question, for it makes bold empirical and, perhaps eventually, testable claims; one should not take the jump of making a truth claim about God’s existence simply because it hasn’t been proved that he does not exist. This is the principle of falsifiability in action: the burden of proof is not to show that things do not exist but that they do. Doubly so with grandiose claims about the nature of the universe and about what happens to us when we die. Again, there is nothing new about what is being said here. Evidence is crucial, and it is absolutely right to ask that it be considered, especially when so much is at stake, as it surely is with religion.

The verification principle is useful for questions of scientific enquiry, but it cannot really be put into practice with regard to supernatural claims. Such questions are certainly not meaningless (there has to be a truth-function to these claims, unless you are a relativist), but by ignoring the distinction between ‘faith’ and beliefs based on reason, Moore falls into the same trap as Gray. Do these two gents not trust science, or the concept of evidence? If not, they are kindly invited to climb to the top of a ladder and jump off to test their ‘unreasonable’ faith in gravity – ah, but of course neither would want to do any such thing. The basis for their attempts to create a level playing field for reason and faith, for scepticism and credulity, is flawed, and they really ought not to be so disingenuous. If they want to be relativists about truth, they should be consistent and come out with what they really mean.

As I have learned over the course of studying philosophy to some degree in the past two years, each generation of thinkers (philosophers, scientists, artists) has contained a community inclined to reject the supernatural. Socrates, whose trial included an indictment for impiety, Spinoza, who was cast out of his community on a similar charge, Nietzsche’s maxim that ‘God is dead’ and Russell’s ‘Why I Am Not a Christian’ are all textbook examples. In 1920s Europe, a new fashion in philosophy emerged, whose exponents became known collectively as the ‘Vienna Circle’. Influenced by writers in the analytic tradition such as Frege and Russell (that is, philosophical enquiry concerned with logical and mathematical substance rather than metaphysics), a group centred on Moritz Schlick and including notables such as Carnap and Neurath, with Wittgenstein and the young A.J. Ayer closely associated, would hold regular meetings to discuss the metaphysical and logical issues of the time.

Following the publication of Ludwig Wittgenstein’s ‘Tractatus Logico-Philosophicus’ in 1921, in which he claimed to have solved the fundamental problems of philosophy by showing the relationship between language and the world, and the limits of both, a newly assembled band of thinkers known as the ‘logical positivists’ began to assert their dominance in the philosophical dialectic. The intentions of the logical positivists and the Vienna Circle are perhaps more obvious now than they were at the time. Beginning with the study of language, Wittgenstein held that only statements (propositions) which could be broken down into elementary propositions reflecting the reality of the world contained any meaning. The logical structure of language reflected the logical structure of the ‘states of affairs’ present in the world. Any discussion of ethics, metaphysics, aesthetics or religion was therefore rendered meaningless. Those of us who study philosophy might have felt this all along, but it was a significant shift in thought amid the hangover of idealism (the theological noise of the likes of Bishop Berkeley), which had been so dominant since the 18th century.

The logical positivists took Wittgenstein’s thesis (and perhaps slightly misinterpreted it), turning it into a condensed version known as the verification principle, which states that ‘a sentence has literal meaning if and only if the proposition it expressed was either analytic or empirically verifiable’[1] (i.e. it is a mathematical or logical truth, or a tautology, or a statement which could be tested by analysing the external world). The logical positivists were well aware of the weight of public consideration placed in religion and the supernatural in post-Victorian Britain, and it seems they were determined to put an intellectual, scientific muzzle on such talk. (Although the position of the logical positivists could at most be described as ‘agnostic’: just as one cannot utter a meaningful statement about the existence of a creator, the assertion that there is no creator is equally nonsensical, ‘since it is only a significant proposition that can be significantly contradicted’.) A.J. Ayer, whom I have just quoted, and Bertrand Russell, who took the religious question further than Ayer in claiming that the logical step was to positively affirm God’s non-existence, were both atheists, and their work radiates a clear notion of common sense. This is partly why they are so well known outside of philosophical circles. Religious claims are not a priori or analytic, nor can the God hypothesis be empirically verified (though one must be careful in declaring this to be impossible in the future, owing to the problem of induction); ergo, the topic is meaningless even to discuss. This was an interesting attempt to approach the religious question in the context of language, regardless of whether its merits remain substantial today.

The so-called ‘new atheists’ are making similar claims in their work, with the inclusion of modern discussions regarding the actual consequences of religion in the world at present. Whilst metaphysical and scientific questions on the topic are not disregarded – Richard Dawkins explicates nicely the scientific claims of religion in The God Delusion (2006), partly by invoking Russell’s ‘teapot’ analogy – the issues of globalisation, modern warfare and the gradual evolution of technology are brought helpfully into the debate, creating an up-to-date, substantial account of the way in which religions operate today. I have claimed before that this is a false tag, because there is nothing remotely ‘new’ about repudiating the supernatural; the claim is supported by the existence of the thinkers I named at the beginning of this essay, and by the claims of the logical positivists.

Were they the ‘new atheists’ of their era? Perhaps so, but they approached the issue from a radically different perspective to the current era of sceptics and free-thinkers. The zeitgeist of the logical positivists was possibly a reaction to the culturally conservative values of Christianity in northern Europe at the time (Nietzsche had certainly been fully aware of this, especially regarding his own doctrines); indeed, I am reminded of the story told by Richard Dawkins about ‘Freddie’ Ayer, who, though non-religious, would say grace at the dinner table; challenged on this, he replied, ‘I won’t utter falsehoods, but I have no objection to uttering meaningless statements’. This, in a nutshell, sums up the attitude of the logical positivists.

The existence of such periods of unbelief in the intellectual discourse of each generation shows how false, shady and lazy the tag really is; those who invoke it usually have a particular, peculiar way of thinking, which I am inclined to address in the next section.

As a side note, below is a fascinating interview with A.J. Ayer conducted by Bryan Magee. They don’t make them like this anymore.


[1] Ayer, A.J., Language, Truth and Logic (first published 1936).