Unraveling the Mysteries of the Writing Brain

The Creating Brain: Click on the image to watch a TED video about how the brain works.

In writing process theory’s recursive “Which comes first, the chicken or the egg?” discussion, cogitation occurs before the act of writing.

It might not be conscious cogitation, but your brain is most definitely engaged, and it’s precisely the cognitive demands involved in writing that make it not only the most complex skill humans possess but also, not coincidentally, the most glorified.

Society puts writers on pedestals, granting them what French philosopher Pierre Bourdieu calls ‘cultural capital,’ largely because we’re impressed when anyone can make something so seemingly difficult look so easy. We are also daunted by the mystery of writing; since we don’t know where inspiration comes from, we imbue it with magical properties.

When it comes to what you’ve been told is true about writing, you are both conscious and unconscious of the societal influences you were raised with. Your beliefs mirror the things you’ve been told. You aren’t going to believe or agree with everything you hear, but quite often you’re not consciously aware of all of the messages you received, most of which are reinforced by society, just as they might be reinforced by your own life experiences.

The question when it comes to the brain, though, is where do these beliefs reside? It turns out that memory and memory-retrieval play a large part in forming our unconscious thoughts. These unconscious thoughts influence us much more than was previously believed.

Long story short, you and I, everyone in Western society, as a matter of fact, were taught to believe quite a few things about self-expression and creativity. Some of those things aren’t true, and in fact, the things that aren’t true only serve to limit and restrict you when you want to express yourself. Nowhere is the stigma cast by society as strongly negative as it is on the writer.

Once you honestly believe that only some rare and lucky people are innately talented, you’ve closed the door to your own potential. Although it is clear from research on the brain that there is such a thing as innate ability, that isn’t the endpoint; it’s only the rawest of beginnings. Our brains are all innately programmed for language skills, for example; how will those language skills be developed? How many neural connections will be encouraged by the people around us as our brains form?

One of the myths we’re told to believe is that it’s a mystery how creative geniuses can be significantly more inspired than the average person, with prolific flow that somehow never falters until the day the person dies. One of the things wrong with this idea of how creativity functions is that the people who continue to write about “genius” not only sound star-struck, they typically write about the “genius” ahistorically and acontextually. This habit alone does everyone who is creative a disservice, and I’ll explain why.

A recent example of this fault in researchers’ perceptions about creativity is to be found in the otherwise extremely helpful book about neuroscience’s role in understanding creativity, The Creative Brain by Nancy C. Andreasen.

Andreasen provides a handful of examples of what genius looks like. Unfortunately, the creative people she chooses all come from the Romantic Era, which automatically means they describe their own creative abilities in the requisite dramatic, inexact, and emotionally laden metaphors of their era. (To rephrase in once-popular parlance, the Romantics were extremely emo.)

Andreasen, predictably, relies on Wolfgang Amadeus Mozart as an example of creative genius. It is not the first time in creativity research that Mozart has been used as a fertile example of what we mean when we use the word ‘genius’. Unfortunately, Andreasen, a medical doctor and Ph.D., but not a “Freudian or psychoanalyst,” as she points out, cannot expand her subject to include the environmental influences Mozart was surrounded by as he matured.

Although it’s less popular to assert, it is nonetheless accurate to say that competition and collaboration, the kind found in a family of musicians, spur creation. In addition, no one has ever been able to analyse the myriad emotions that led to Mozart’s renowned desire and interest in learning everything he could about music.

It’s rare for these environmental factors to be taken into account when individual genius is assessed, because, for one thing, it’s less glamorous to re-envision the creator as anything but a demi-god. However, it’s also much more complicated to speculate that Mozart might have been more competitive than his siblings. He also clearly had one important personal characteristic that has more to do with creative output than any other: he was curious, with an insatiable desire to learn everything he could about music and composition.

Everyone starts somewhere, even musical prodigies. Mozart’s earliest compositions were vetted and criticized by other musicians, and the young composer was surrounded each day by music.

The surface clues to the ‘mystery of genius’ that Andreasen focuses on in Mozart are the ones we have all been taught to privilege when it comes to what ‘genius’ means: productivity and uniqueness.

It’s as though, when assessing Mozart’s creative force, it’s assumed he never left his room, never listened to anyone else’s music, never practiced, never had a bad day of composing when he threw everything away. In other words, his real life experiences are elided to perpetuate the myth of smooth, flowing, “pure” genius.

Later in the book, however, when she gets to the meat of her neurological research, Andreasen does begin to unravel at least some of the mystery surrounding how the mind creates that all-too-elusive moment of inspiration, a moment that has been imbued with metaphor, myth, and mystery for far too long. Andreasen tells us that

“most of the time we speak, we are producing a sequence of words that we have not produced before—in fact, that no one has produced before . . . we are producing language that is novel.  We make up coherent sentences “on the fly,” listening to ourselves speak while we are speaking, and planning what the next words will be as the words and sentences are produced” (Creative Brain 63-64, emphasis mine).

This one fact alone has profound consequences for writers. If we took our speech acts more seriously, we’d naturally do what I’ve suggested to writers for many years: read each other’s writing aloud, so that we can consciously hear what we’ve written and respond verbally, all the while taking notes. We’d also take collaboration more seriously, since speaking our thoughts aloud is a necessary part of the writing process, though one that is not privileged by the old paradigm of the writer writing in isolation, speaking only to herself.

Even though neuroscience is still unsure about exactly which functions each region of the brain controls, we all rely on various forms of memory-retrieval. The kind of memory-retrieval of particular interest to writers is called “episodic memory,” which is used for free association, and it may be the source of

“information that is stored deeply and is therefore sometimes less consciously accessible. It draws on those freely wandering and undirected associative thoughts that constitute primary process thinking. It is a resource not only for the creative process but also for meditational states, religious experiences, and dreams” (Creative Brain 71-72).

The implications of the above information for writers have to do with understanding ourselves as creators. Instead of treating the various acts of creation, especially the moment of inspiration, as the gift of a Muse sitting on our shoulder, the way we have since the Ancient Greeks, we can begin to free ourselves of our superstitions about writing and replace our doubts with wisdom. No longer is the ‘wandering mind’ a negative state; it is, instead, crucial for creativity.

No longer will we think that information we’d otherwise wait to passively receive exists outside of ourselves; instead, now we can take responsibility for the fact that although we don’t know precisely where the thought came from, it is, nonetheless, stored in a part of our mind. Inspiration no longer resides in the ‘divine moment,’ it doesn’t belong to some long-dead lyrical Ancient Greek ‘Muse’—it is ours, it was always ours. This knowledge gives us ‘agency,’ which, not coincidentally, means we have power to act, speak, write, and best of all, to be free from limiting myths about creativity.

Woven together by an internal, seamless socially-inscribed ‘logic’, the myths that control how we think about writing are nonetheless not transparent, not natural, not fact. They are cultural artifacts, tattered remnants of a tapestry woven long ago by people who attempted to explain a phenomenon that seems mystical because it is so poorly understood: how a human being learns language and then uses that language to reflect emotion, impart wisdom and acquired knowledge, entertain with humor, incite a populace to war or to tears. And the myths exist in your unconscious memory; they influence your beliefs even now, unless you consciously choose to erase them and reframe them.

Grid structure of the major pathways of the brain, created by using a scanner that’s part of the Human Connectome Project. Click on image for more information.

When Albert Einstein’s brain was autopsied, it was found that he had more neural connections between the two hemispheres than the average person. It is important to recognize that he wasn’t born with those neural connections—they developed over time, and with effort on his part to constantly learn new things. The reason his brain could develop in this way has to do with the innate plasticity of our brains, and in this plasticity lies hope for anyone who wants to unlearn what they were told when young.

For example, recent discoveries have demonstrated that “cortical maps are subject to constant modification based on the use of sensory pathways” (Kandel & Hawkins 86).

This means that learning how to do something new literally changes the architecture of the brain.

We not only grow more neurons in response to learning (and the creation of a memory); we create an entire neural network that facilitates future learning, changing the brain’s cortical ‘map,’ or network of neurons. This has ramifications for those who believe our neural paths are fixed or predetermined, for it indicates that the act of learning itself changes the brain’s functioning.

Once the brain has learned and has formed new neural networks, the possibility for interaction between hemispheres increases, adding to the potential for increased intelligence. Increased neural networks allow for increased categorization and subcategorization of conceptual linguistic material such as metaphor and abstraction.

What makes this learning possible in adolescents and adults is the brain’s neuroplasticity:

“[c]ontrary to the notion that the brain has fully matured by the age of eight or twelve . . . it turns out that the brain is an ongoing construction site . . . [m]aturation does not stop” (Schwartz and Begley, 128).

The tripartite interconnection between areas of the brain is facilitated, not by mystical intervention, but by learning, memorization, and experience, which creates “abilities that stick around if they’re used but wither if they’re not” (128).

Please contact me for details about the above references; since I have been researching this subject for more than 20 years, there’s a lot of data I can share with you, from the writer’s perspective.

To Write or Not To Write?—That Is The Question

I don’t want you to worry so much about procrastination, and I will explain why.

We can’t expect the brain to produce at the same rate, and with the same quality, every single day. We need downtime, and we need rest and distraction. Inspiration is not encouraged when we’re feeling pressured; studies indicate that, in fact, creativity requires an incubation period.

There are studies that make it clear that focusing on a highly complicated cognitive and emotional task like writing puts tremendous strain on the brain. Writing is not, after all, merely mechanical; it requires sorting, valuing, building hierarchies, ordering, and memory-retrieval, plus working through all the emotions that come up along the way.

And that’s on a good day when everything is working like clockwork and we experience ‘flow.’ On a bad day, there are deep wounds to recover from, as perhaps we stumble into yet another of what I call “subduction zones,” the memories that must be cleansed, but first must be accepted and worked with, if we are to write them out of our system.

Where is the simple need for calm in all of that? I don’t think we can be writing machines. Thinking it’s possible to write with no pause for rest and reflection denies our complicated humanity and psychology, and ignores how the brain actually works while we write.

Although we’ve been told to think of the brain as a kind of ‘computer,’ I prefer to use a non-mechanistic, and therefore, organic, metaphor. I start from the Shinto principle I learned when I was a kid, living in Japan, of waiting (with as much peace as I can muster); in this case, it’s waiting for a thought to build.

Japanese garden

In Japan you see rainwater collect at one end of a hollowed-out piece of bamboo until it’s heavy enough to spill down into a larger bowl. This is an organic process, and it’s not one that anyone labors over or agonizes about; it’s a simple and elegant approach to controlling and containing rainwater, which otherwise would spill out onto the ground and be wasted, or worse, flood a tiny garden.

The fundamental principle that forms my metaphor is that my brain is like that garden; I spend a lot of time pruning and working in the garden of my mind. When the “rainwater” has collected, I am ready to write; until it does, I can’t force the writing. This could take a while.

So, here’s the thing: the metaphors we usually use to define procrastination in relationship to writing stem from a masculinist rhetoric of agonism, which is anti-woman, anti-feminist, and anti-human being, if you ask me. It’s also Western, and “yang,” instead of being Eastern and “yin,” and therein lies what’s wrong with it. Rushing like a freight train toward its goal, it is mostly concerned with achievement and “getting there,” rather than slowing down to appreciate the ride.

Focusing on being finished means you don’t get to focus on craft. Focusing on achievement means you don’t get to appreciate the work as you perfect it. Focusing on product means you ignore the human demands of process, which are organic and complex. Where is procrastination in this equation? To procrastinate means we’re over-focused on the finish line. It’s the end-point of any task we haven’t yet begun that appears so daunting, so much of a challenge, we lack the courage to commit ourselves to any wrong word that might lead us astray.

Procrastination too often stems from a fear of what we can’t see: the endpoint. But ask yourself: how can you possibly see the finish line before you’ve even begun the race?

Procrastination implies (or perhaps it demands) that we think about “finishing.” It doesn’t take recursivity into account; it absolutely ignores process, and it denies the reality of being in the body, needing time to sleep, needing time to experience the totality of one’s emotions. I think the biggest problem with the concept of procrastination, as we frame it in the West, is that it relies on a metaphor of linearity, and it foments agonism (which is the foundation of everything we think of when we write, including the idea of creating a protagonist versus his antagonist—you wouldn’t have that terminology if it weren’t for agonism).

Agonism makes procrastination a given, because by definition, it means we are constantly “struggling” with our writing. Agonism underlies competitiveness; it mocks the possibility of collaboration; it idolizes the isolated genius as the sole author of his creation. 

Without agonism, we wouldn’t buy into the Western ideal of “progress,” which is always portrayed as linear, constantly in motion, and, needless to say, masculinist (if you don’t like this word, I understand, but throughout history, the Author was male, and this kind of rhetoric continues to underlie our beliefs about writing).

Agonism limits not only the way we think, but how we think about ourselves, especially if we have a hard time being that which the mainstream culture wants us to be. For too long, the accepted image of the writer was always masculine, which is not going to happen for me in this lifetime. Does that mean I don’t get to be thought of as a writer? I think you know the answer.

You see how little you should worry about procrastination? It’s not the highest concern on my list of concerns for writers. If you hold back from writing forever, and never give yourself permission to write, then, yes, I’d agree there’s a problem, but if you’re resting while you tend your mental garden, then you’re not alone.

Women and the Personal Pronoun

Having trouble with the letter “I”?

Much is written about the silencing of women, without understanding the ways in which we silence ourselves.

If someone asks you directly what you want from life, can you answer? Equally difficult might be the question, why do you want to be a writer? Sometimes these answers lie so deep in our inner being that reaching the reasons why requires gaining access to the “I” self we simply don’t talk to often enough, largely because we’ve been conditioned not to.

The problem with being conditioned not to ask ourselves direct questions (“Is this what I want?” “Am I happy?” “What is it I need from life?”) is that we become voiceless; we silence ourselves, and so we become complicit in our inability to be heard by a society that isn’t necessarily encouraging us to have specific needs focused on what’s going on in our inner landscape.

You might even feel lost in your inner landscape; I know I’ve been without a map often enough. Whereas you might know precisely what your kids or significant other wants, you might not be able to put your finger on your own wants and needs.

Even in this day and age, even in a society that promotes women’s issues, it’s rare to be asked a direct question about who you are, what you want, and what you need. If I ask you to tell me about yourself, your life, and why you want to write, what will you answer? You might be stymied. It might be the first time in your life someone has asked you directly to account for what’s going on inside of you—and that’s understandable, but it’s not acceptable for our society to ignore this about ourselves, and I’ll tell you why.

The reasons you or I might have some trouble with the use of the personal pronoun, so that we are effectively silenced when someone asks us about ourselves, are complicated—much more complicated, in my experience, than the feminist movement has ever been able to get at the root of.

The reasons we all have trouble from time to time with the personal pronoun have to do with how we’ve been socialized. This is less a gendered issue than a societal one—meaning that everyone, female and male, is affected by this problem to a greater or lesser extent.

Men are silenced, too, in somewhat different ways.

I noticed quite a while ago that in spite of numerous literary works and academic studies regarding women’s voices, being silenced, and the inherent differences between the ways the sexes communicate, the feminist movement—brave, bold, and daring at its best—has never been able to make substantive changes in the ways we communicate.

The problem? Our very language itself, as well as the nature of the culture we’re raised in. You’ll find that you can’t have one without the other; I’ll explain.

We are all raised in a culture that uses certain, very specific, metaphors to describe life experiences. Our culture privileges language use that is “straight as an arrow.” We like to “get right to the point.” If you live in a culture that doesn’t do this, it’s likely you’re not from the West, and, most likely, you’re not an American. In America, in particular, everyone, male or female, is raised with the same set of expectations. It’s a linear culture. We expect our answers to be simple and straightforward.

In fact, I’m having trouble writing this sentence without using the standard metaphors we usually rely on. How do I find another way to say “straightforward”? I’ll have to use my thesaurus. How do I find another way to say “stay on track”?

The key to our language use is the way in which our language choices are determined by cultural values. Although we aren’t usually consciously aware of the underlying “why” of the things we say, those sayings are, not-so-subtly, in my experience, shaping not only how we speak, but also how we think. We are never set free from the expectations of our culture as long as we use language unconsciously.

How does this affect you as an individual, especially one who is, perhaps, confused about what you want from life? If I had you sitting in front of me, would you beg me to “get to the point?” Even feminists say these things, which leads me directly to my point (for which my linear readers will, no doubt, thank me). Even those of us who promote humanist and feminist agendas are not free of our language use, because we’re never free of our culture and all its expectations.

Here’s the core of the problem: We might speak using the metaphor of linearity, but it isn’t how we think. We’re curvature-type people, we humans. We tell stories that go around and around. When asked to explain something that just happened to us, we might not start “from the beginning”. We’re maddening like that. It isn’t just women, either; men do it too. The human brain doesn’t work precisely the way our slapdash “time is money” culture would like.

And so those who need more time, more words, more creativity, are silenced in the face of the tapping foot of the impatient, narrow-minded linear metaphor. Another factor that contributes specifically to the silencing of women, however, is more pernicious and less easily spotted, and it feeds into the linear metaphor neatly. It is the culture we exist within, the culture of scientism, which is inherently distancing, mechanistic, and dehumanizing.

In case you don’t know what I mean, consider that until fairly recently, it was considered bad manners to speak in the first-person pronoun. One used the less personal pronoun, “one,” to describe one’s wants or needs. It was (and still is) considered grammatically correct, and although that’s useful when grammatically-correct writing or speaking is called for, it symbolizes the problem, which has to do with impersonalization, the distance between “I” and “one”.

Further, if you listen carefully enough, you’ll begin to notice that we not only live within a culture that privileges linearity; we also rely on a vocabulary of numbers, weights, measurement, and mathematics—the vocabulary of science. Although this part of my argument is too large to adequately address in one short piece, I will suggest that if you eliminate the vocabulary of science, as well as linear metaphors, from our language, there wouldn’t be much left to say.

Try it for a day; see if you can condense (a word borrowed from scientific experiments) your conversations into that which does not rely on science or linearity. It will prove (a science word!) to be a challenge. This is especially true in the land of academia, which is imbricated (my favorite academic word, which means layered and overlapping, like roof tiles or bricks in a wall) in scientific terminology, over-relying as academia does on proofs, hypotheses, and problems to solve.

So, if we are raised in a culture that uses two particular methods to express itself—one, the metaphor of linearity, and the other the vocabulary of science—then what should those who are caught in between do when they are voiceless in response to this impersonalized, mechanistic, linear methodology of thought? Feeling like you’ve been absorbed by the Borg yet? You should.

This societal silencing has been going on for a long, long time.

When a woman encounters a direct question about her inner landscape, therefore, everything she’s been taught to think is at war with the one metaphor that makes sense, the metaphor of organicism.

The metaphor of organicism includes the body, and does not exclusively privilege the mind. You will notice a few things about scientific vocabulary and the metaphor of linearity: they both came to social prominence during the rise of scientism, otherwise known as the Enlightenment, the era of Reason.

It was called the ‘Enlightenment’ because its role was to cast light into the darkness that came before it, including the darkness of superstition, paganism, and what was perceived as the ‘ignorance’ of faith. This included the lack of knowledge about how the body, and most particularly the mind, worked.

Many important ideas were swept away during the Enlightenment, however. When society emerged from what became known as ‘the dark ages,’ we were no longer allowed to think like Plato (considered a pagan after the rise of Christianity), who gave us the metaphor of organicism (an idea later appropriated by the Romantics, which only served to deepen the divide between that which is produced by nature and all that is ‘man-made’).

The Romantics rediscovered Plato, the body, and emotion. Without them, I doubt we’d be having this discussion.

Instead, Western society started thinking like Bacon, Locke, and Descartes, all of whom much preferred applying reason and, most importantly for my argument, logic, to life questions. In swept the rise of linearity and scientism.

Unfortunately for those of us who are not inherently scientific and linear, however, when we lost the organic metaphor, we also lost all that went with it, including metaphors relying on our bodies as a way of explaining reality. If you don’t buy into the metaphor of linearity (you don’t perceive the value in it) and you’re not inherently interested in the scientific way of approaching reality, where do you stand, especially if, now that logic is the dominant trope, we have no bodies, only brains?

I used to teach English composition at a research and development university. Frequently, my students were pursuing science-related degrees. Nothing about the training they’d received, or their life experiences, for that matter, prepared them for my style of teaching. I remember one day in particular when a student asked me why she had to use the personal pronoun “I” in her papers. Her question led to a 15-minute diatribe from me about the depersonalization in society brought about by the sciences and their perpetuation of emotional distancing.

For me, the reason scientism is such a problem is because it tells us that it’s okay, even desirable, not to use the personal pronoun—this means we forget to think in terms of our inner “I”. The prevalence of the metaphor of linearity reinforces the idea that we must ‘keep to the path,’ ‘keep to the straight and narrow,’ that we must not diverge from ‘the norm.’

Women’s speech has always been a concern. This tarot card draws on a folk tale from The Blue Fairy Book (1889). Tarot cards are an example of non-linear uses of metaphor, as are folk tales and “old wives’ tales.”

Who, under the influence of a society adhering, unconsciously, to these rigid, linear rules, would allow themselves to meander a little, to stray from the path, watch daisies grow, or imagine themselves in another, more colorful reality? To, heaven forfend, daydream aimlessly?

Finally, consider this: I think we’d agree that most, if not all, women in what we’ve come to think of as third-world countries lack what we’d call a ‘voice.’ We have no idea how an individual Pashtun woman, for example, thinks or feels about her life. We rely on educated men and women to tell these otherwise silenced women’s stories, just as the tribal woman herself relies on those from the West to tell her story, unless or until she becomes educated, and self-confident enough, to tell her own story.

One thing is certain: those in the West will tell her story their/our way, using the dominant vocabulary and metaphors we all rely on to convey meaning.

And yet, these isolated, tribal women, nameless and faceless to most of us, are no more or less silenced than a woman in the West, if that Western woman, with every privilege and every social advantage, feels voiceless, feels that she is, effectively, silenced by a culture that has given her a vocabulary she doesn’t identify with, and a set of metaphors she doesn’t believe in. Under those circumstances, you’re not using the language; the language is using you.

Feminism located one source of the problem for women: that we try to express ourselves while using the language of ‘the patriarchy.’ What feminism couldn’t accomplish, however, was to undo the prevailing beliefs and values that created that dominant language and vocabulary in the first place. Using a language unconsciously, we are stuck within it. Knowing that the metaphors and vocabulary bind you helps you break free of them, as well as of some of their more pernicious expectations. What do you replace them with? That’s the challenge we all face, in my opinion: we must come up with a more inclusive language, one that more accurately reflects human reality—mind and body.

Ask yourself what’s preventing you from being heard, being seen, being known. The answers might surprise you; but what shouldn’t surprise you is that, if you’re a woman, you’ve been taught to think in a way that prevents access to this complex inner world, for it’s a world that is recursive, not linear, and not necessarily bound by logic or language, either. Many of our deepest truths occur without ever attaching themselves to words. Much of what we know at the unconscious level consists of things we learned before we ever learned language. If you insist that it’s easy to give voice to places in your mind that are pre-verbal, therefore, you’re fighting an uphill battle.

The truth is, we’re not ‘straight as an arrow.’ In many ways, we’re curved. Only one of those ways is physical.

Creativity and individualism, and how education squelches it

Creativity is a huge subject for me; I think about how to inspire it in writers a lot of the time.

One of my ‘rules’ about inspiring writers is to encourage them and provide open doors, rather than closing doors by telling them what they cannot do, or criticising them.

I am rather adamantly against criticism for the sake of criticising. I can understand wanting to make something you’ve done better, but if all you’re really doing is telling someone they’re deficient according to your idea of perfection, could you keep that to yourself, please? Because all we do when we criticise someone is let them know about ourselves and our needs, rather than help them. This includes criticising yourself. Let yourself off that hook, okay?

I’ve worked with far too many aspiring writers who tell me some version of, “When I was a child, my teacher/parent/friend told me it would be pointless to continue writing, that I didn’t have what it takes.” In most cases, the person doing the criticising of their early efforts was caught up in the ancient paradigm of what I have learned to call the Divinely Inspired Author myth.

The individual who wants to write is too often challenged in this way, and therefore might never pick up her pen again, only to regret this choice later in life. Believing it to now be “too late,” she will give up on her dream of writing “someday.” If there’s one thing I don’t want to see people doing, it’s giving up on their dreams because one time, when they were twelve, an English teacher gave them a ‘C’ on a paper they thought was pretty great—until they got that ‘C’, of course. Many years ago, I worked with one man in his 70s who never forgot the ‘C’ he received in high school; that’s how powerful authority figures are in our young lives.

I’d like for you to watch the following video, because Sir Ken Robinson, an English creativity expert, discusses the ways in which education discourages children from holding on to their creativity.

Why don’t we get the best out of people? Sir Ken Robinson argues that it’s because we’ve been educated to become good workers, rather than creative thinkers. Students with restless minds and bodies—far from being cultivated for their energy and curiosity—are ignored or even stigmatized, with terrible consequences. “We are educating people out of their creativity,” Robinson says.

This loss of belief in one’s own creative ability is my primary concern as a writing coach, because my focus is on how to get adults to reconnect with the creativity they were once forced to abandon in favor of scholastic achievement.

In this talk, Sir Ken discusses the needed revolution in education; his perspective is that it’s time to reform educational practices so that people will learn to be themselves and do what they love, not what’s practical. We have to change our industrial model to an agricultural model, he says, and change the metaphor we use to create our concept of why we need an education from mechanistic, based on the needs of a bureaucratic society, to organic, based on the needs of the individual.

He thinks we’re obsessed with getting people to go to college, as though going to college is now the answer to everything, which isn’t true. I learned as an educational consultant and teacher that students too often attend college or university for someone else (usually their parents), and that it isn’t the right choice for them. Sir Ken agrees that college isn’t necessarily the best choice for everyone, and it isn’t something everyone has to do at any one given time (e.g., the moment you leave high school or secondary school, for those readers not in the States).