A Rare Linky Post

Usually I think of my blog as the place where I put down my thoughts. But things have been so hectic of late that I haven’t really had any that are worth noting. Instead, I’m going to link to three posts that have lately caused me to stop and think.

 

Andrew Blackman: The Future of Books: Reactive?

This fascinating post reports on advances in ‘reactive media’ technology, in which we get to be hooked up to a machine that stimulates the storyline we’re reading if we get bored, or dials it back if we’re overreacting.

‘I guess that whether your reaction to all this is “Wow, that sounds cool” or “Please shoot me now” depends on what you want from your media,’ Andrew writes.

No prizes for guessing which camp I’m in.

 

Dutch Courage (written by my friend Ingrid): Proving Yourself

This is a beautifully written post in which Ingrid considers the subtle difference between ‘justifying yourself’ and ‘proving yourself’, a distinction linked to gender identity that she becomes aware of while supporting her young son as he grows. Masculinity, she learns, consists in part of:

‘The unshakeable drive to prove oneself worthy of a higher and nobler calling (love), the need to have one’s actions approved by a band of brothers, that all-in-allness that men establish between each other through competition and the fair fight is absolutely hardwired into them. They could no more let go of it than they could drop down and walk on all fours. To laugh at this drive is to wound a man profoundly.’

 

The Guardian: Top Five Regrets of the Dying

This is an old post that Mr Litlove alerted me to a while back and which I return to every now and then to check in with and check myself against. It arose out of a book written by an Australian nurse who spent several years working in palliative care. The regrets are:

1. I wish I’d had the courage to live a life true to myself, not the life others expected of me.

2. I wish I hadn’t worked so hard (apparently every single man said this).

3. I wish I’d had the courage to express my feelings (many felt that buried resentment and bitterness had played a part in their illnesses).

4. I wish I had stayed in touch with my friends (ironically, while typing this my neighbour came to the door for a chat and after catching up with the headlines I had to shoo her away because I had so much work to do).

5. I wish that I had let myself be happier.

What a salutary lesson those five regrets encompass. I find myself particularly drawn to the last one, although I think that, taken wrongly, it can be made into an excuse for suppressing problems that really need to be dealt with. I’d probably change it into ‘I wish I’d let myself fully recognise what emotions were appropriate to any given situation, and let myself experience them.’

On that note, I will just say that I think my son is beginning to find more emotional equilibrium, and my back is a great deal better. Thanks to the splendid heated band-aid, I did make the event in Heffers last week with Jill Dawson (who turns out to be absolutely lovely). I was not what you’d call comfortable, but I was there. One less thing to regret. :)  Thank you all for your amazing, invaluable support; I certainly couldn’t manage without my virtual friends.

How Not To Learn The Hard Way

Erickson, looking worryingly like Robbie Williams’ granddad

In the first half of the twentieth century, a psychotherapist called Milton Erickson had a gift for teaching people in strange and unusual ways. All Erickson’s patients wanted to do was something supposedly quite normal – lose weight, make love, travel without fear, or develop a new skill – but it was as if some kind of enchantment held them hopelessly in place. Bewitched by fear or insecurity, they lived lives of confinement, until Erickson and his bizarre methods succeeded again and again in releasing them from their spell. His therapies often looked contentious, but what he did have was insight into the obstacles we like to erect in the path of the learning process.

Erickson knew all about being stuck. As a teenager he had nearly died from an attack of polio that left him paralysed and mute. Using body memories and an unfeasible amount of determination, he re-learned how to access his muscles and eventually regained control of his speech and his arms. Dissatisfied still that he could not use his legs, he decided to embark, alone, on a thousand-mile canoe trip, taking with him only a few dollars. He returned home able to walk with the help of a cane, the ordeal having taught him how to push himself beyond what he believed to be his physical, mental and emotional capacities. These experiences restored his body to him, but they also gave him much insight into the complicated process of getting people to learn things to which they have an inbuilt resistance. He knew that minds are bewitched by the magician’s sleight of hand and powerfully affected by the experience of an ordeal, and he made use of these different mental triggers in his therapeutic process with great cunning and invention.

He was particularly successful at treating sportsmen who were struggling to reach new levels of achievement. One of his case histories concerns a young American high school boy who won a gold medal at the Olympics under his tutelage. When Erickson first met Donald Lawrence, he had been practicing the shot put for a year and theoretically had everything going for him. He was six foot six, 260 pounds of pure muscle and trained by an ambitious coach. But he was still significantly short of attaining a national high school record. Erickson told him the story of how Roger Bannister found the right frame of mind to break the four-minute mile by recognizing that he only needed to shave a tenth of a second off the previous record. He said to Lawrence, ‘You have already thrown the shot fifty-eight feet. And Donald, tell me honestly, do you think you know the difference between fifty-eight feet and fifty-eight feet and one-sixteenth of an inch?’

Over the next few sessions, Erickson would repeat this technique, lingering over the hard-to-conceptualise difference between fifty-eight and fifty-nine feet on an athletic field, or, as he put it, enlarging the possibility for the young man. Two weeks later, Lawrence set a national high-school record.

Lawrence wins his Olympic bronze medal

Having proven himself to be a magician, Erickson had the boy in the palm of his hand. A few months later, Lawrence came to Erickson for advice about the Olympics. ‘You are just an eighteen-year-old kid,’ Erickson told him. ‘It would be all right if you bring home the bronze medal.’ Which Lawrence promptly did. Four years later, Erickson advised him that it would be fine for him now to win gold. By the time Erickson stopped working with him, Lawrence was throwing the shot put sixty-eight feet and ten inches, all on the basis of a potent cocktail of numerical confusion, self-belief and a dogged devotion to the magic of authority.

Erickson’s success was based on the recognition that the conscious mind has really very little say in what we actually end up doing. All those motivational talks, all that pumping oneself up, all that pleading and scolding that goes on inside our heads is so much white noise. What’s actually in control is a small, piggy part of the self, stubborn, well-defended and unwilling to budge. Erickson’s methods depended on implementing change by tiny, tiny increments. The natural inclination is to rush towards change, attempting far too much in one go and ensuring failure. Instead, he encouraged his patients to consider how to make a two percent change to their situation. It had to be something negligible, something almost ludicrous, in order to evade all those internal censors, hell-bent on assuring continuity. For once a little change has been made, change itself becomes a more acceptable concept, and another step in the right direction is much easier to undertake.

But aligned with this insight was Erickson’s covert use of authority. Authority is generally what most of us appeal to in order to get the piggy part moving. ‘Do it, or else’ is the classic default setting for action. But Erickson’s authority was benign when he worked with Lawrence. Erickson was known as a shrewd judge of character, quick to exploit a patient’s foibles, and when he saw the docile, hard-working Lawrence steered into his consulting rooms by a determined coach with his eye on high school glory, he must have recognized a personality that would readily and willingly submit.

A relationship to authority resides at the heart of any learning process. The fear of the teacher’s wrath, the fear of the exam, the fear of public humiliation are undoubtedly motivating factors. But the stick isn’t enough on its own – there must be a carrot too. And the flip side of authority, its gentle alter ego, is the act of belonging. We submit to education in the first place in order to belong to our world, to a particular culture or society and its ways of thought. Belonging is a hidden, stealthy part of the things we learn, but it is all the more powerful for being understated. The young shot-putter belonged entirely to Erickson, as his faithful and loyal disciple. The sheer power of that belonging gave him the confidence to do whatever it was that Erickson said he could do.

For most of us, the point of thinking is to reach a point where we don’t have to think any more. A point where our ideas are organised, fixed and justified. And that point is usually one that is terrifically satisfying in relation to belonging – our ideas please our parents or our teachers, they seem in line with the famous figures we admire, the class we aspire to, the religion or political party that impresses us. It’s why intellectual arguments, no matter how brilliant they are, rarely persuade people to think otherwise, even in situations where objective, rational arguments might be recognized as extremely valuable. We have already thought ourselves into a position that feels secure and correct. To have to move on from it, to undermine all we have learnt to master, to face challenges, new ordeals, opposing thoughts, well, it’s no wonder that it’s a ghastly, unnerving prospect for anyone.

Erickson showed how knowledge is not just an acquisition based on logic, but one fraught with emotion and the need for security. We become emotionally attached to what we think we know, and so the greater the change in our knowledge, the more emotionally challenging it feels.

This post is a sort of indirect response to two fantastic articles: Eleanor Catton’s article on literature and perceived elitism, and Laura Miller’s brilliant continuation of it (after another twitter storm over the use of the word ‘crepuscular’ in the Paris Review).

The Riddle of the Labyrinth

Much as I love words and enjoy arranging them in pretty patterns, I am completely hopeless when it comes to crossword puzzles and anagrams. I just don’t have the kind of mind that can crack a code. So I wondered how I’d get on with The Riddle of the Labyrinth by Margalit Fox, the story of an academic relay race to solve the mystery posed by Linear B, the oldest discovered language on earth. Learning it had made the New York Times Notable Books list spurred me to pick it up (I am so shallow), and I was so glad I did. It’s a fascinating account of three eccentric and gifted individuals who shared a violent obsession and who, between them, proved that the impossible just takes a little longer.

The story begins in 1900 when the Victorian archaeologist, Arthur Evans, broke ground at Knossos in the wild northern reaches of Crete. He came looking for a bronze age settlement, unpersuaded by current thinking that the magnificent race of the Ancient Greeks had sprung into being as fully formed as one of their own Gods. And he was amazingly lucky. Before the week was out, his workmen had hit upon huge building blocks of gypsum, the walls of a vast prehistoric building that predated the earliest recorded Greek settlement by a thousand years. Evans believed he had found the palace of Minos, famous for its labyrinth and the Minotaur that lurked in its depths. The building had contained hundreds of rooms, linked by a complex mass of passages; surely the literal foundation on which the legend would grow.

What was certain was that the excavation had hit the administrative centre of a sophisticated civilization. The fire that had destroyed it had served to bake hard and preserve its records, over two thousand clay tablets inscribed in an unknown language. The find was of immense proportions, the kind of discovery that would see an archaeologist through to the end of his life. Arthur Evans then did what any ambitious academic would do – he sat on his finds, not allowing anyone else to view them, determined to crack the language as the prime achievement of his career. But he was a busy man, and an unknown language in an unknown script poses a fairly daunting obstacle. He published a little, revealing pictures of two hundred or so of the tablets, and made very little progress in decipherment before his death at the ripe old age of 90.

Now I would have absolutely no idea where you would begin with such a task, but it turns out that writing systems come in a limited range of sizes and varieties. Essentially they are either logographic (little pictures), syllabic (symbols for each different syllable) or alphabet-based, as English is. You can tell quite quickly which you are dealing with from the number of symbols encountered. If every word requires its own picture, then you end up with thousands of symbols. When it comes to syllables, you’ll need 80 or so, and an alphabet is the most economical with symbols, our own a mere 26. Linear B, as the script was called, was syllabic, with a few hieroglyphs thrown in for good measure. These were pretty helpful in recognising basic words like man and woman, horse, goblet, cauldron, your average Bronze Age necessities. Arthur Evans also figured out that a mark like a straight comma was used to separate words, and that the script read from left to right.

The meagre publications Arthur Evans made unleashed a coterie of impassioned linguists and classicists onto the problem. But this was the middle of the twentieth century: telecommunications networks were in their infancy and two world wars had left nations poor in resources. Scholars were forced to work in isolation. This was one of the reasons why the middlewoman in the chain of decipherment, the woman who did all the hard graft, had received no recognition before Margalit Fox’s book, having been somewhat lost to history. Alice Kober, a hardworking classics teacher in Brooklyn, painstakingly wrote out every word from the 200 or so tablets in the public domain on the few scraps of paper she could get her hands on – the backs of greeting cards, hymn sheets, checkout receipts (wartime rationing left everyone in this era short of paper) – then filed them in boxes made out of the cardboard from cigarette cartons. She noted all the patterns she could find and punched holes in her index cards accordingly, so that when the cards were lined up, matches could be found. It was, in essence, an early database. Kober’s approach was rigorous and scientific – no guessing at what the language might be. She would work purely with its form alone, teasing grammatical rules out of it, and eventually plotting syllables on a grid, rather like an enormous sudoku puzzle.

It was this painstaking work that left the way open for a British architect, Michael Ventris, who was something of a linguistic genius on the side, to eventually crack the code. Kober and Ventris were both doomed individuals, people who did amazing things and who seemed to have to pay a price for that. Polite and helpful Alice Kober, out of a pure love for scholarship and an absence of competitive spirit, ended up appallingly abused by the male scholars she admired (that bit made my blood boil), and the reader is reminded, once again, that even those who should know better mistake fervent belief for knowledge. This was a surprisingly compelling book, though I shouldn’t have been surprised, given that when humans overreach themselves the results are always hypnotic. And it’s fairly mindblowing to consider a literate civilization in existence some three thousand years ago. If you know a fan of cryptic crosswords, thrust this one into their hands; but even for the most anagrammatically challenged (like myself) it’s a wonderful story.

Critical Theory: A Life

Early in October 1988, I rocked up to the inaugural lecture of the modern critical theory paper, a module I’d signed up for because it sounded new and exciting. Cambridge agreed. The lecture hall was packed out, with most of the English faculty crowded into the front rows and, quite shockingly, my own lecturers and supervisors hogging all the seats at the back. I had never seen the grown-ups, as it were, attending undergrad lectures before. The handful of modern linguists who were actually going to sit the paper, myself amongst them, were submerged by a sea of interested parties. Cambridge had toyed with theory for a while, famously inviting the French Daddy of deconstruction, Jacques Derrida, to give a guest lecture, in which he infamously spent the hour discussing the white space between the title of a work and its first lines. But this was the first time that the university had decided to create a syllabus, teach the theory and examine it. For a place that in its Tudor infancy spent a couple of hundred years dedicated to the works of Aristotle before moving on to anything else, this represented swift progress.

It was the Modern Languages faculty that sponsored the paper because theory, as we were about to learn it, had exploded out of the Left Bank of Paris at the end of the 50s. In 1958 the literary journal Tel Quel was founded, and over the next 24 years it attracted a swarm of cultural and literary theorists. Postmodernism, post-structuralism, psychoanalytic theory, feminism, postcolonial theory, reader response theory, these were the ideas setting the intellectual world alight.

At almost the same time in Cambridge (1959 in fact), the biggest ever fight between the sciences and the arts was taking place. In the red corner was C. P. Snow, who criticized the ‘snobbish’ culture of intellectuals for holding back the progress of science and technology, which he believed were about to change the world. In the blue corner was literary critic F. R. Leavis, who laced up his gloves and declared that literature was the place where everyone got to discuss what was actually happening in the world, unlike the sciences which belonged exclusively to those with advanced degrees. Everyone could read and have an opinion on the new books by Graham Greene and Kingsley Amis, but only a handful of people could understand the latest developments in quantum electrodynamics.

There was no clear winner to the debate, but over the next 25 years science and technology gained the upper hand in the cultural imagination. Scientists were increasingly seen as the saviors and pioneers of Western society, literature a leisure pursuit for a minority. Hardly surprising, then, that theory, the closest literature would come to a science of its own, should look so enticing as a way of perking up any flagging interest in the arts.

But theory was exciting, too. I loved the ideas in it, and how audacious and challenging they were. I enjoyed the process by which those ideas went from being ludicrous at first glance to naggingly plausible. Psychoanalytic and feminist theory were the areas that interested me the most. I was intrigued by the challenge the feminists faced to represent a group of people who wanted above all else to be seen as individuals. After centuries of an imposed identity as sweet, nurturing, charming, useless creatures, women longed to be different, but not instantly shoved into another set of adjectives: strong, competitive, dynamic, resilient, whatever. It’s an issue that, as far as I can see, has never yet been resolved. Women still get trapped into a ‘story’ by their cultures and forbidden from diverging from, or subverting, the party line. In my psychoanalytic studies, I was fascinated by the notion that a book, emerging from the mind of a writer, had the same characteristics as that mind: there was an evident surface meaning to it, but also an unconscious one, hidden in the shadows and ambiguities of the writing. Just that idea alone put paid to the belief that authorial intentions were the most important way to view a story. The author had as much chance of seeing his intentions come to fruition in narrative as he did making them come good in real life.

There were so many ideas thrown at me in that course, and I found it fun to play with them. I learned that theory was at its best when being applied to a book. Theory and practice struck sparks, and I grew adept at hunting down the places where they contradicted one another, or created a strange paradox. This was the point of theory for me – if it fit perfectly over literature and life, then we would be robots and our stories nothing more than a vast instruction manual. It was the very places where theory and practice buckled and fought one another that showed up what it was to be human, and how slippery and strange and surprising art could be.

My career at the university lasted about as long as the modern critical theory paper did. It was retired a year or so before I stopped teaching, though it continues to this day to be part of the graduate syllabus. A couple of years after that, I noticed the tide turning and a surprising amount of hostility being directed against theory, as if it were in some way responsible for spoiling the field of literary criticism. The anger seemed to arise from the way some theory texts were written, essentially those heavily influenced by the discourse of philosophy. This was a bit unfair, given just how much theory there was available, and how much of it – including all my chosen areas of psychoanalysis, feminism and reader response theory – was perfectly accessible. Books by the likes of Jacques Derrida and Julia Kristeva suffered from being read in translation; I always found them much better in French. And then I think in the States, theory was taught in a vacuum, outside its historical context and away from its natural interaction with literature, which can’t have helped.

But it was hard to get away from the feeling that people were upset with theory because it made them feel stupid. Which says more about the stranglehold of insecurity than it does about theory (and more about the stranglehold of the grade over the notion of an education). I mean, I loathed algebra, which certainly made me feel stupid, but I didn’t believe it wasn’t useful to someone, somewhere. Without those decades of academics working on literary theory, we wouldn’t have the canon of women’s writing we do now, nor literature written by oppressed people of colour, both championed by intellectuals, studied in universities and finally merged with the mainstream. Political correctness wouldn’t exist, and our understanding of history would be infinitely poorer. Hundreds of novels and films and buildings and pieces of music and adverts wouldn’t have been inspired or influenced by theory.

But I wonder whether the ultimate reason for the anger against theory lay back in that debate between Snow and Leavis. Leavis had argued that literature was for everyone in a way science was not. Literature has the power to bring us together to discuss what is happening in society, and maybe we are wired up to want that. We don’t seem to mind the inaccessibility of science, but we do mind if stories get talked about in ways that seem exclusive. If that’s the case, then it’s up to the general reader to keep the discussion going.