How Not To Learn The Hard Way

Erickson, looking worryingly like Robbie Williams’ granddad

In the first half of the twentieth century, a psychotherapist called Milton Erickson had a gift for teaching people in strange and unusual ways. All Erickson’s patients wanted to do was something supposedly quite normal – lose weight, make love, travel without fear, or develop a new skill – but it was as if some kind of enchantment held them hopelessly in place. Bewitched by fear or insecurity, they lived lives of confinement, until Erickson and his bizarre methods succeeded again and again in releasing them from their spell. His therapies often looked contentious, but what he did have was insight into the obstacles we like to erect in the path of the learning process.

Erickson knew all about being stuck. As a teenager he had nearly died from an attack of polio that left him paralysed and mute. Using body memories and an unfeasible amount of determination, he re-learned how to access his muscles and eventually regained control of his speech and his arms. Dissatisfied still that he could not use his legs, he decided to embark, alone, on a thousand-mile canoe trip, taking with him only a few dollars. He returned home able to walk with the help of a cane, the ordeal having taught him how to push himself beyond what he believed to be his physical, mental and emotional capacities. These experiences restored his body to him, but they also gave him much insight into the complicated process of getting people to learn things to which they have an inbuilt resistance. He knew that minds are bewitched by the magician’s sleight of hand and powerfully affected by the experience of an ordeal, and he made use of these different mental triggers in his therapeutic process with great cunning and invention.

He was particularly successful at treating sportsmen who were struggling to reach new levels of achievement. One of his case histories concerns a young American high school boy who won a gold medal at the Olympics under his tutelage. When Erickson first met Donald Lawrence, the boy had been practising the shot put for a year and theoretically had everything going for him. He was six foot six, 260 pounds of pure muscle and trained by an ambitious coach. But he was still significantly short of attaining a national high-school record. Erickson told him the story of how Roger Bannister found the right frame of mind to break the four-minute mile by recognizing that he only needed to shave a tenth of a second off the previous record. He said to Lawrence, ‘You have already thrown the shot fifty-eight feet. And Donald, tell me honestly, do you think you know the difference between fifty-eight feet and fifty-eight feet and one-sixteenth of an inch?’

Over the next few sessions, Erickson would repeat this technique, lingering over the hard-to-conceptualise difference between fifty-eight and fifty-nine feet on an athletic field, or, as he put it, enlarging the possibility for the young man. Two weeks later, Lawrence set a national high-school record.

Lawrence wins his Olympic bronze medal

Having proven himself to be a magician, Erickson had the boy in the palm of his hand. A few months later, Lawrence came to him for advice about the Olympics. ‘You are just an eighteen-year-old kid,’ Erickson told him. ‘It would be all right if you bring home the bronze medal.’ Which Lawrence promptly did. Four years later, Erickson advised him that it would be fine for him now to win gold. By the time Erickson stopped working with him, Lawrence was throwing the shot put sixty-eight feet and ten inches, all on the basis of a potent cocktail of numerical confusion, self-belief and a dogged devotion to the magic of authority.

Erickson’s success was based on the recognition that the conscious mind has really very little say in what we actually end up doing. All those motivational talks, all that pumping oneself up, all that pleading and scolding that goes on inside our heads is so much white noise. What’s actually in control is a small, piggy part of the self, stubborn, well-defended and unwilling to budge. Erickson’s methods depended on implementing change by tiny, tiny increments. The natural inclination is to rush towards change, trying to attempt far too much in one go and ensuring failure. Instead, he encouraged his patients to consider how to make a two percent change to their situation. It had to be something negligible, something almost ludicrous, in order to evade all those internal censors, hell-bent on assuring continuity. For once a little change has been made, change itself becomes a more acceptable concept, and another step in the right direction is much easier to undertake.

But aligned with this insight was Erickson’s covert use of authority. Authority is generally what most of us appeal to in order to get the piggy part moving. Do it, or else, is the classic default setting for action. But Erickson’s authority was benign when he worked with Lawrence. Erickson was known as a shrewd judge of character, quick to exploit a patient’s foibles, and when he saw the docile, hard-working Lawrence steered into his consulting rooms by a determined coach with his eye on high school glory, he must have recognized a personality that would readily and willingly submit.

A relationship to authority resides at the heart of any learning process. The fear of the teacher’s wrath, the fear of the exam, the fear of public humiliation are undoubtedly motivating factors. But the stick isn’t enough on its own – there must be a carrot too. And the flip side of authority, its gentle alter ego, is the act of belonging. We submit to education in the first place in order to belong to our world, to a particular culture or society and its ways of thought. Belonging is a hidden, stealthy part of the things we learn, but it is all the more powerful for being understated. The young shot-putter belonged entirely to Erickson, as his faithful and loyal disciple. The sheer power of that belonging gave him the confidence to do whatever it was that Erickson said he could do.

For most of us, the point of thinking is to reach a point where we don’t have to think any more. A point where our ideas are organised, fixed and justified. And that point is usually one that is terrifically satisfying in relation to belonging – our ideas please our parents or our teachers, they seem in line with the famous figures we admire, the class we aspire to, the religion or political party that impresses us. It’s why intellectual arguments, no matter how brilliant they are, rarely persuade people to think otherwise, even in situations where objective, rational arguments might be recognized as extremely valuable. We have already thought ourselves into a position that feels secure and correct. To have to move on from it, to undermine all we have learnt to master, to face challenges, new ordeals, opposing thoughts, well, it’s no wonder that it’s a ghastly, unnerving prospect for anyone.

Erickson showed how knowledge is not just an acquisition based on logic, but one fraught with emotion and the need for security. We become emotionally attached to what we think we know, and so the greater the change in our knowledge, the more emotionally challenging it feels.

This post is a sort of indirect response to two fantastic articles: Laura Miller’s brilliant continuation of Eleanor Catton’s article on literature and perceived elitism (after another twitter storm over the use of the word ‘crepuscular’ in the Paris Review).

Critical Theory: A Life

Early in October 1988, I rocked up to the inaugural lecture of the modern critical theory paper, a module I’d signed up for because it sounded new and exciting. Cambridge agreed. The lecture hall was packed out, with most of the English faculty crowded into the front rows and, quite shockingly, my own lecturers and supervisors hogging all the seats at the back. I had never seen the grown-ups, as it were, attending undergrad lectures before. The handful of modern linguists who were actually going to sit the paper, myself amongst them, were submerged by a sea of interested parties. Cambridge had toyed with theory for a while, famously inviting the French Daddy of deconstruction, Jacques Derrida, to give a guest lecture, in which he infamously spent the hour discussing the white space between the title of a work and its first lines. But this was the first time that the university had decided to create a syllabus, teach the theory and examine it. For a place that in its Tudor infancy spent a couple of hundred years dedicated to the works of Aristotle before moving on to anything else, this represented swift progress.

It was the Modern Languages faculty that sponsored the paper because theory, as we were about to learn it, had exploded out of the Left Bank of Paris at the end of the 50s. In 1960 the literary journal Tel Quel was founded, and over the next 22 years it attracted a swarm of cultural and literary theorists. Postmodernism, post-structuralism, psychoanalytic theory, feminism, postcolonial theory, reader response theory, these were the ideas setting the intellectual world alight.

At almost the same time in Cambridge (1959 in fact), the biggest ever fight between the sciences and the arts was taking place. In the red corner was C. P. Snow, who criticized the ‘snobbish’ culture of intellectuals for holding back the progress of science and technology, which he believed were about to change the world. In the blue corner was literary critic F. R. Leavis, who laced up his gloves and declared that literature was the place where everyone got to discuss what was actually happening in the world, unlike the sciences, which belonged exclusively to those with advanced degrees. Everyone could read and have an opinion on the new books by Graham Greene and Kingsley Amis, but only a handful of people could understand the latest developments in quantum electrodynamics.

There was no clear winner to the debate, but over the next 25 years science and technology gained the upper hand in the cultural imagination. Scientists were increasingly seen as the saviors and pioneers of Western society, literature a leisure pursuit for a minority. Hardly surprising, then, that theory, the closest literature would come to a science of its own, should look so enticing as a way of perking up any flagging interest in the arts.

But theory was exciting, too. I loved the ideas in it, and how audacious and challenging they were. I enjoyed the process by which those ideas went from being ludicrous at first glance to naggingly plausible. Psychoanalytic and feminist theory were the areas that interested me the most. I was intrigued by the challenge the feminists faced to represent a group of people who wanted above all else to be seen as individuals. After centuries of an imposed identity as sweet, nurturing, charming, useless creatures, women longed to be different, but not instantly shoved into another set of adjectives: strong, competitive, dynamic, resilient, whatever. It’s an issue that, as far as I can see, has never yet been resolved. Women still get trapped into a ‘story’ by their cultures and forbidden from diverging from, or subverting, the party line. In my psychoanalytic studies, I was fascinated by the notion that a book, emerging from the mind of a writer, had the same characteristics as that mind: there was an evident surface meaning to it, but also an unconscious one, hidden in the shadows and ambiguities of the writing. Just that idea alone put paid to the belief that authorial intentions were the most important way to view a story. The author had as much chance of seeing his intentions come to fruition in narrative as he did making them come good in real life.

There were so many ideas thrown at me in that course, and I found it fun to play with them. I learned that theory was at its best when being applied to a book. Theory and practice struck sparks, and I grew adept at hunting down the places where they contradicted one another, or created a strange paradox. This was the point of theory for me – if it fitted perfectly over literature and life, then we would be robots and our stories nothing more than a vast instruction manual. It was the very places where theory and practice buckled and fought one another that showed up what it was to be human, and how slippery and strange and surprising art could be.

My career at the university lasted about as long as the modern critical theory paper did. It was retired a year or so before I stopped teaching, though it continues to this day to be part of the graduate syllabus. A couple of years after that, I noticed the tide turning and a surprising amount of hostility being directed against theory, as if it were in some way responsible for spoiling the field of literary criticism. The anger seemed to arise from the way some theory texts were written, essentially those heavily influenced by the discourse of philosophy. This was a bit unfair, given just how much theory there was available, and how much of it – including all my chosen areas of psychoanalysis, feminism and reader response theory – was perfectly accessible. Books by the likes of Jacques Derrida and Julia Kristeva suffered from being read in translation; I always found them much better in French. And then I think in the States, theory was taught in a vacuum, outside its historical context and away from its natural interaction with literature, which can’t have helped.

But it was hard to get away from the feeling that people were upset with theory because it made them feel stupid. Which says more about the stranglehold of insecurity than it does about theory (and more about the stranglehold of the grade over the notion of an education). I mean, I loathed algebra, which certainly made me feel stupid, but I didn’t believe it wasn’t useful to someone, somewhere. Without those decades of academics working on literary theory, we wouldn’t have the canon of women’s writing we do now, nor literature written by oppressed people of colour, both championed by intellectuals, studied in universities and finally merged with the mainstream. Political correctness wouldn’t exist, and our understanding of history would be infinitely poorer. Hundreds of novels and films and buildings and pieces of music and adverts wouldn’t have been inspired or influenced by theory.

But I wonder whether the ultimate reason for the anger against theory lay back in that debate between Snow and Leavis. Leavis had argued that literature was for everyone in a way science was not. Literature has the power to bring us together to discuss what is happening in society, and maybe we are wired up to want that. We don’t seem to mind the inaccessibility of science, but we do mind if stories get talked about in ways that seem exclusive. If that’s the case, then it’s up to the general reader to keep the discussion going.

Publishing: A Writer’s Biggest Headache

Unsurprisingly, the issues surrounding publishing and our sense of ourselves as writers have provoked the longest discussion of the whole writing course. It seems to me that at no point in the past has publishing ever been such a problem to writers. If you were mad – or educated – enough to want to write, then publication was something that eventually happened down the line. It strikes me (and I may be wrong) that writers were a much more self-selecting band, and publishers were a much more adventurous bunch.

Nowadays you could be forgiven for thinking that everyone in the world writes and harbours some secret dream of superstardom. And publishers seem (and this may be an illusion) to have become more and more cagey and restrictive about what they will put out. Rather than simply accept these barriers to self-expression, those Darwinian technology types gave us the digital world. And paradoxically, the more platforms that appear for writers to publish on, the more problematic it all becomes. There are people out there drawing flow charts now to account for all the different choices that can be made. And still the question remains: who will actually read us?

It seems to me that the basic problem is that publishing is way too emotive a subject for writers to be allowed near. You say the ‘P’ word in authorial company and suddenly everyone is rushing to deliver their practised speeches, composed during the small hours of the afternoon when writing is hard and recognition distant and somehow morale must be maintained. The other basic problem is that many writers talk about publishing before they have actually experienced it. In the same way that newly-formed partnerships fantasise romantically about having children, and university students imagine being rich, writers think about publication as a joyous event, and quite possibly one that will solve all their problems – financial, moral, existential. Whereas most of us who have published limp bloodied from the arena, humiliated by having failed to make the crowd go wild. My premise in this post is that – like so many modern phenomena – publishing is an awful experience and yet still we want it beyond all reason.

Most of my publishing experience has been academic. I’ve published four books and about 20 articles and chapters, and I did this over the space of about 12 years. Which strikes me as quite a lot in hindsight, throwing my son and chronic fatigue into the mix. I rarely got paid for it (£200 was the only advance I was ever given), and my writings disappeared with the tiniest splash into the great ocean of academic tracts. Academics only read for what they’re researching – you don’t go and read someone’s book about Rabelais just because it’s supposed to be good – which to my mind is why academic writing has become so insular and unattractive to general readers. Everyone writes to be clever; not everyone writes to be stylish. But the field of academic research, even in my small literary corner, is vast, and it makes no sense either to celebrate or condemn it. It has been formed by the forces of necessity and good intention.

So why did I take it upon myself to add to the trillions of words? It clearly wasn’t for fame or fortune. I didn’t do it ‘for myself’, whatever that really means. When the boxes of advance copies arrived I felt pleased for about two minutes, and then I found the immediate issues of the day more pressing. No, I did it because it was, for me, the root of my work as a literary critic. Everything grew out of that quiet moment when I thought about a book and shaped those thoughts into words. That was the beating heart of my discipline. How could I teach students about essay writing if I weren’t engaged with that process myself? How could I put good lectures together unless I had thought long and hard about the books I was discussing? I had to write about the book to get to the bottom of my reaction to it. And once I had written down to my satisfaction an interpretation of a book, I wanted to share it with my community, to contribute to the ongoing discussion and because (rightly or wrongly) I felt I had something to say.

It was always important for me to know whom I was writing for, not least because I could then be assured of saying something that might feasibly be useful. Essentially, the reason I write at all is to get my message across. I want to make people think about things they generally try to avoid thinking about. So I have to strategise a lot with regard to the audience if I have any hope of achieving that goal. What I learned pretty fast through teaching (and living) is that human beings make dreadful listeners. On the whole they hear only what they want to hear, or what they are afraid they will hear. It’s always seemed to me the most intriguing challenge to get readers to put their prejudices, their hopes, their anxieties aside and hear instead what I want them to.

So I have never been someone who writes ‘for myself’. I do not want, as one of my classmates brilliantly put it, to be engaged in nothing more than a monologue. I want to be talking to someone and not as a party bore who has importuned them in a corner, but as a voice saying something they might find illuminating to hear. Our course instructor said that he found the whole publishing malarkey much easier when he thought of his writing as a gift that he bestowed regularly on the world, without expectation of reward. This sounds to me like a convenient solution to a knotty problem, and one that makes us all look pretty. It’s a lovely image, isn’t it? This idea of the author writing in an isolated cell, then sending her work out into the world with never a glance to see where it lands, indifferent to its fate. Yeah, well, dream on. I’ve never met a writer who doesn’t long for praise and popularity. And I’ve never met anyone who wasn’t hurt by rejection. Writers do what they do because they feel close to the human predicament – we can’t suddenly turn around and become angels or saints. It’s better to face up to the large and ugly emotions that are part of the process.

I think the consolatory fantasy of self-fulfillment has risen in proportion to the difficulties encountered in actually getting published. Because so few of us are ever likely to have the bonus of an audience for our writing, we have to work our way around that emotionally. And whether we publish traditionally or independently, an audience remains the most elusive commodity. As I said, it’s only once you’ve published, and realised that it does not suddenly make your work desirable and praiseworthy, that you live in cold, hard clarity. I don’t pretend to have any solutions to this muddle and I’m not sure there are any. I think perhaps writers have to accept they are a kind of modern-day Sisyphus, condemned to roll the rock to the top of the mountain and watch it fall back down again, but enraged when it does so and longing every time for that rock finally to stay in place and become a monument to endeavour.

How To Survive Your University Course

It’s that time of year when the students go back, and partly because I am bio-rhythmically set up to expect students requiring help across the autumn, I thought I would extract the salient points from those years of study support and offer them here in a handy guide to the hopelessly at sea. If you are not yet accustomed to the life of the university, or can’t seem to get your expectations straight, I hope this will help.

 

1. Academics will not be like any other teachers you have met. Up until now, teachers have been in loco parentis – discipline, crowd control, your general behaviour and well-being have been part of their concern. Now, although the set-up seems more personal in some ways than it was before, it’s actually much more impersonal. Academics will rarely know who you are in any meaningful way. You will meet over the work, and that’s the place you interact with them. So, if they seem cold or distant, they do not ‘hate’ you, they are not focused on you in that way and are probably just thinking about other things entirely.

2. You will not get everything done that you are told to do. Again, academics are motivated by their subject rather than by the needs of the individual. They will tell you what you need to do/read/study in order to become what they consider proficient in their subject. You must bear in mind that for academics, the words anal, pedantic and obsessive are compliments. They will also be looking back from the vast heights of their experience and giving you the best of it, but compacted somehow. They will have forgotten how long it took them to read the books in question.

3. Do as much as you can, but don’t waste time and energy beating yourself up over what doesn’t get done. Life will take over, as it always does, and some things will happen and others won’t and some you will do well and some will be shoddy. Try to keep your eye on the big picture – what are you learning here? What do you need to master in order to make you feel comfortable with your subject? Your heroism lies not in scaling vast mountains of work, but in applying yourself to the things you do not understand. Understanding is all. You simply cannot know everything there is to know at this level.

4. Schedule free time for yourself and take it. Students head for the rocks as soon as they decide they have no time for anything but work. Then, the very real and necessary needs of the brain for downtime will start intervening every moment of the day. You will sit at your desk for hours at a stretch achieving very little out of a toxic mix of tiredness and resentment, entirely buried under the anxiety of not getting everything done. This is a disastrous state of affairs. Relax in your free time so that you can work when you want to be working.

5. The very biggest favour you can do yourself is to stop worrying about making mistakes. Mistakes are good. They are the way you actually learn something (rather than what we hope learning will be, which is a process of reassuring ourselves that there is nothing we need to learn). No one minds that you made a mistake apart from you. In fact, everyone who teaches you at university level will be expecting you to mess up, be confused, and completely fail to understand things. They will only be irritated if you do not speak up and say so. Speak up: this allows your teachers to do the work they are there for – helping you understand things they know to be complex.

6. Everyone struggles with the fact that, up until now, being ‘good’ at a subject has been a stable and reliable part of identity. And now you’ll have reached a level where you don’t feel ‘good’ at your subject any more. Psychologists have discovered that starting a course in higher education can provoke the same feelings in people as a major bereavement. Of course – you will have lost, to some extent, the person you used to be. To the best of your ability, let your old self go. Have confidence in the future and your own resources. You are in a transitional state, in the act of learning, and there’s a reason why university courses are at least three years long. You’ll need that time to move up to the next level, so be patient with yourself.

7. Up until now, learning has too often been about pleasing other people – pleasing your parents, pleasing your teacher, pleasing some abstract god of achievement. This is all going to have to stop. What you really need to find is intellectual curiosity. Whatever you are working on, try to find the part of it that actually appeals to you, that makes you curious to know more. Wherever you have choices, take time to make ones that correspond with what you are truly interested in, rather than what you think you ‘should’ do. Follow your gut instincts. It’s time to discover what you really like and what properly intrigues you.

8. Do not compare yourself to others. That way madness lies.

9. When people tell you that you are responsible for your learning now, what they mean is that you are responsible for finding the pleasure, the satisfaction and the reward in it. For some reason these are things we stubbornly sit back and expect others to provide for far too long (often in the form of good grades if all else fails). You will probably be taught by some of the best teachers you’ve ever had, but unless you engage with the work and find your curiosity, it won’t make the slightest bit of difference. You really do get back what you put in willingly.

10. There is only one thing you really do need to find time for, and it’s the thing that my students always used to insist they simply could not spare time for: you need time to think. Yes, think. Let a problem roll around in your mind, reflect back over what you’ve read, mull over what you’ve been asked to do and why, just let all the information you’ve squeezed in have a chance to digest a bit. Everything goes smoother and better if you spare time for thinking. You can think while doing the washing up, or going for a walk, or doing the laundry. You’ll find yourself thinking if you arrange to have coffee with a few friends from class and chat over what you’ve been doing. You’ll think when you are an arts student explaining something you are reading about to a science student. Or vice versa. Thinking: I cannot recommend it highly enough. Will you be insisting mid-term that you absolutely cannot find a moment to think? Undoubtedly, sigh. But trust me, it’s the answer.