Ole, Ole, Ole!

April 1, 2012

I have listened to this TED talk by Elizabeth Gilbert more times than I can count:

Elizabeth Gilbert on Nurturing Creativity

At first, I listened because I felt a resonance with her statement that goes…

… I’m only about 40 years old. I still have maybe another four decades of work left in me. And it’s exceedingly likely that anything I write from this point forward is going to be judged by the world as the work that came after the freakish success of my last book, right? I should just put it bluntly, because we’re all sort of friends here now — it’s exceedingly likely that my greatest success is behind me.

I understand that kind of neurosis. Despite never having been mistaken for a genius, nor ever experiencing any sort of success that could be seen as being in the same vein, the same circulatory system even, as Gilbert with her memoir, Eat, Pray, Love, I have felt the anxiety, the fear, the feeling that I did something pretty well in one instance and will never be able to do it as well again. More often, though, I feel that the best of me is behind me. I feel that often.

In this talk, which I suggest you watch before reading on so that this post makes the most sense, Gilbert is speaking of the inexplicable “thing” that happens to people at times and that, in its happening, yields remarkable creativity. Her question is, can you ever capture this thing? Can you harness it? Can you establish some sort of routine or environment that brings it about? She doesn’t use the word, but her question is one that artists and creatives have shared as long as there have been artists and creatives, i.e. “Where is my muse?”

On the fourth or fifth time I listened to the Talk, I found myself focusing much more on its latter part. It’s during this part that she equates and/or compares the creative process, or more rightly the creative product, with the Divine. She speaks of artists or writers or anyone seeking to produce something born of a creative place as people on a mission to catch something. I have heard and read others use the same language to describe it. Rosanne Cash does so in her interview with Krista Tippett. Annie Dillard does in her writings on writing. I’ve written a few things in my life that I can look back upon and say to myself, “Did I really write that?” I don’t ask the question with a sense of incredulity, but more of wonder. “Did I really write that?”

The past few days I’ve been engaged in a discussion on a good friend’s Facebook page that grew from her simple “contest” to poke fun at something a relative of hers had said. My friend is a self-avowed, staunch atheist. Her relative is a self-avowed, staunch, fundamentalist Christian. I add those qualifiers on purpose, for I know many people who would call themselves atheists in the sense that they hold no belief in any divine being, but who would more rightly be described in the way Joyce Carol Oates describes herself, i.e. indifferent. Similarly, I know people with very assured convictions regarding their faith, but for them too, these beliefs are simply a given, woven into the fabric of their being, not anything they feel the need to ever talk much about, let alone defend outwardly in any dialogue or argument.

My friend and her relative cannot be described that way. They both hold strong beliefs and they have not a whit of reservation in sharing them. And thanks to these kinds of folks, we can get discussions on Facebook that last for days, with comments that number into the hundreds. Depending upon your own convictions, you might find such an activity funny or offensive, disturbing or intriguing, or disconcerting and disheartening. Personally, I found it a little bit of all of these, save offensive. I was never offended. And I also found it predictable: everyone’s comments, my own included.

And it was really funny.

And also inspiring.

I found myself drawn into it. A part of me that hadn’t engaged in any theological debate over the issues of women in ministry or the problems of unwavering dogmatism in many years was awakened. I trotted it out and took my stabs at trying to somehow say, somehow show, where the problems lie in the arguments people make to defend certain narrow understandings of God or god or anything related to the divine.

And I awoke this morning with the need to watch that TED Talk one more time. I wanted to hear the exact words Elizabeth Gilbert used. How again did she describe the poet Ruth Stone’s creative process? I needed to hear it. I cued it up and … oh yes, it goes like this:

I had this encounter recently where I met the extraordinary American poet Ruth Stone, who’s now in her 90s, but she’s been a poet her entire life and she told me that when she was growing up in rural Virginia, she would be out working in the fields, and she said she would feel and hear a poem coming at her from over the landscape. And she said it was like a thunderous train of air. And it would come barreling down at her over the landscape. And she felt it coming, because it would shake the earth under her feet. She knew that she had only one thing to do at that point, and that was to, in her words, “run like hell.” And she would run like hell to the house and she would be getting chased by this poem, and the whole deal was that she had to get to a piece of paper and a pencil fast enough so that when it thundered through her, she could collect it and grab it on the page. And other times she wouldn’t be fast enough, so she’d be running and running and running, and she wouldn’t get to the house and the poem would barrel through her and she would miss it and she said it would continue on across the landscape, looking, as she put it “for another poet.” And then there were these times — this is the piece I never forgot — she said that there were moments where she would almost miss it, right? So, she’s running to the house and she’s looking for the paper and the poem passes through her, and she grabs a pencil just as it’s going through her, and then she said, it was like she would reach out with her other hand and she would catch it. She would catch the poem by its tail, and she would pull it backwards into her body as she was transcribing on the page. And in these instances, the poem would come up on the page perfect and intact but backwards, from the last word to the first.

(Really, if you’ve gotten this far in reading this post and NOT listened to the Talk, stop now and do so.)

I listened to Elizabeth Gilbert again this morning. I also read Karen Armstrong again. Just a little bit. The introduction to her latest book, The Case for God. For me, Karen Armstrong is the clearest voice we have today for explaining the history of the monotheistic religious traditions. She likewise gives the surest argument, solid and grounded, for the existence of anything divine. She wrote this last book, in part, as a response to the modern voices of atheism: the late Christopher Hitchens, Richard Dawkins, and Sam Harris. They all dismiss her work, as expected, but I’m still waiting for one of them (the latter two, now) to counter her in the same manner, i.e. with the same level of scholarship and objectivity, with the same nod to the history of humanity.

The reason I’m drawn to Gilbert’s description of creativity and Armstrong’s description of the divine is that, for me, they strike upon the characteristic and quality of these “things” that I understand best and appreciate most. Armstrong traces humanity’s experience of both logos and mythos, and how they were once – and for a VERY long time – held in parallel, not as opposites but as complements. Both operated within and throughout our shared history:

In most premodern cultures, there were two recognized ways of thinking, speaking, and acquiring knowledge. The Greeks called them mythos and logos. Both were essential and neither was considered superior to the other; they were not in conflict but complementary. Each had its own sphere of competence, and it was considered unwise to mix the two. Logos (“reason”) was the pragmatic mode of thought that enabled people to function effectively in the world. It had, therefore, to correspond accurately to external reality. People have always needed logos to make an efficient weapon, organize their societies, or plan an expedition. Logos was forward-looking, continually on the lookout for new ways of controlling the environment, improving old insights, or inventing something fresh. Logos was essential to the survival of our species. But it had its limitations: it could not assuage human grief or find ultimate meaning in life’s struggles. For that people turned to mythos, or “myth.” (p. xi)

Then humanity entered the modern age. The entrance itself is pretty much defined by a rise of, and preference for, logic, rationality, and scientific understanding, and by the subsequent dismissal of myth. One tragic consequence of this, in Armstrong’s belief and my own, is that the monotheistic religious traditions chose to follow suit, and theologians of each began to argue for their faith within the parameters of logos. They, too, dismissed the biggest reason we had religious beliefs in the first place, i.e. to explain what couldn’t be explained. Theologians began to argue with words rather than experience, because words are the foundation of reason. They insisted upon describing, defending, and explaining the words of their traditions with more words.

But the experience of the divine is not captured in words. It’s the poem roaring at you across the field. It is the feeling of the song. It is the sound of joy. It is the silence of grief. It is the essence of love. It is the thing that cannot be described with words. It’s the experience of “words fail me.”

As I typed those last two paragraphs, the song “Valentine” by Ruth Moody shuffled across the MySpace player running in the background of my morning writing:

I must have been crazy,
Lost in your blood-shot eyes again,
But love she marches in,
And takes us like an army now and then.

I could dissect the words, do an exegesis of them as we’d say in theological circles, but doing so would not describe for you the feeling that I experience when I hear the song. There is Ruth’s voice, the guitar, the words, the mood, the context, the ritual; there is all of this and more. There is all that words fail.

For me, “all that words fail” is the divine. It is why I believe in something that cannot be explained. And I don’t particularly want it explained. I want it experienced.

This is what was missing in the Facebook discussion. It’s what is missing in all of the dialogue that surrounds the talk of religion, of whether or not God exists, of theology and atheism (which could never exist, one without the other). It is what leads us to arguments, to misunderstandings, to name-calling, and in the saddest circumstances, to violence. Our dependence upon logic, our necessity for ego, and our dogged determination to understand and explain everything fail us. Just like the words.

I love words. I am a fairly verbose person and I enjoy talking about this topic more than most, but at the end of it all, my 284th comment on my friend’s discussion thread ought to be no words at all. Because words fail us in this debate. Thank goodness.


What’s in a Hoodie?

March 26, 2012

When I was in the 5th grade, I lobbied my mom for two peer-pressure-induced things:

  • a pair of Levis blue jeans
  • to be able to watch the new television show, “Happy Days”

Both took a lot of persuasion, begging, groveling, bemoaning, and probably some crying, but eventually I made a strong enough argument (or else drove my mother to the breaking point) to succeed. I was told that I could watch “Happy Days” with my parents (evidently they felt the need to define things like “hickey” for me) and while I could not get a pair of blue jeans per se, I was allowed to pick out a pair of light blue Levis corduroys that I wore with pride all the way through, I think, the 8th grade. All the way until they wore through in the seat and were no longer fit even for cut-offs.

I was thinking about those Levis this morning as I walked my dog through the park wearing a hooded sweatshirt. Being a middle-aged white woman, I imagine I’m not very threatening in the latest demonized wardrobe. Sadly for Trayvon Martin, this rule – this stereotype – didn’t apply to him. He was a young black man in a hoodie and as such was a threat to the fearful, trigger-happy George Zimmerman. Trayvon is dead due to his hoodie. I was only warm. Bill Belichick is… well, he’s a slob, but that’s another story.

Or is it? Why did it take such persuading on my part to convince my mom to let me wear those Levis all those years ago? Why was she worried about “Happy Days”? Strangely enough, there was a connection. Levis, to her, represented something that young girls didn’t need to project, i.e. they didn’t need to dress like boys. You had to buy Levis in the boys’ department at that time. They didn’t make jeans for girls yet. You had to go to the boys’ department and pick out your size by waist and inseam, like boys and men do, not in the non-measured sizes made for girls and women. They were pants for boys, not girls.

They were also jeans and jeans represented a level of dress that was unacceptable for school. Mind you, I didn’t go to a Catholic school or a boarding school or any type of private school with a dress code. I went to the public school in the neighboring county. I went to school with kids from farms and suburbs; low- to middle-class, all of us. But my mom taught at the same school and she felt strongly that there should be a difference between what a person wears when s/he is going to school, compared to going in the backyard to play. School was time in public and as such, you needed to look presentable. She defined this as something better than jeans.

Over time, of course, I wore more jeans. I even wore blue jeans before I finished high school. Still, the lesson of how I was to look in public stuck with me. It sticks to this day. I iron my clothes in the morning, I rarely wear jeans to work (and only this year started doing so on Fridays), I won’t go anywhere other than the gym or the dog walk in sweatpants. Like my mom, I believe very much in the importance of the difference between public and private, and I believe in maintaining that differentiation as best I can.

But no doubt, I am in the minority.

I concede that one can make a strong argument that appearance is overrated (ironically, since we are a culture obsessed with fashion and looks). In an ideal world, we would see beyond the clothes one wears or the car one drives. In an ideal world, we would look beyond the color of a person’s skin, their sex or gender, their socioeconomic status. In an ideal world, we would look beyond all of these things to see only the person. In an ideal world, George Zimmerman would have seen a young man named Trayvon Martin. That’s all. He would have drawn no conclusions, nor made any assumptions about Trayvon based upon his jeans and his hoodie and his hands tucked in his waistband. But surely we all know how very far from any ideal world we are today.

That said, I wonder where our individual responsibility lies in terms of the messages we send to people by the way we present ourselves to the world. I was riding the light rail to the airport in St. Louis a couple of weeks ago. I was sitting next to an African-American woman about my age. A young man who looked a lot like the pictures of Trayvon I’ve seen on the news got on the train at one of the stops and, for the next 15 minutes or so, stood by the door, alternating scrolling on his iPod with pulling up his pants. This kid was not threatening to me in any way. He was a kid in his jeans and his hoodie and a jacket, riding to wherever he needed to be that morning. When his stop arrived, he got off, and the woman next to me said under her breath, “I’m so glad I don’t have any boys.” I shared that I was thinking the exact same thing and we laughed about the ridiculous fashion statement one’s pants falling off one’s ass makes.

We laughed. George Zimmerman fired a gun. Thus I return to my argument: when we live in a world where people will in fact shoot you because of the clothes you chose to wear that day, is it not maybe time to think a bit about the choices we’re making? I listened with interest to Tom Ashbrook’s March 20th episode of “On Point,” where his guests, Mychal Denzel Smith and James McBride, along with several callers to the show, talked about some of the things they were taught as youngsters, as young black men, to help them survive in a world that hated them for no reason beyond the color of their skin. The lessons in no way serve to condone the hatefulness. They aren’t meant to encroach upon anyone’s civil liberties and/or free will to wear whatever the hell they choose to wear in public. No, they are lessons in survival.

They were taught not to run away from a possibly harmful scene for fear of looking like a guilty black man (a lesson that Trayvon evidently had heard, for when his friend on the phone told him to run, he refused). They were taught not to cross the tracks. They were taught to be home when it got dark. Are these all lessons and/or rules that imply one is submissive to an established order? Maybe. But more, they are lessons that kept those young men alive.

My mother said, “Don’t come crying to me if someone mistakes you for a boy when you’re dressed like one.” It’s a point well taken. And I know it’s a L-O-N-G way from being mistaken for the wrong sex to being shot on someone’s front lawn, but it’s still the same world. It’s still a world where people make assumptions, where people hold wrong ideas, where people stereotype, and where people are inundated with messages to fear and hate anyone who looks or talks or acts differently from us. We hear it from political candidates, television and radio talk shows, movies, books, and even the “red alert” terrorist rating scale brought to us compliments of our Department of Homeland Security. We simply cannot escape the message, the idea, that we are threatened. Constantly. We live in a heightened state of fear because we’re taught that fear will keep us safe. The logic escapes me, but it is there all the same.

So I’d like to suggest that until we find ourselves in a better place, a better world, maybe we look again to some of the rules we let go a while ago. Maybe we can start to think again about how we look to others, how we present ourselves in the world. Maybe moms and dads, aunts and uncles, friends and family, athletes and superstars – adults – can model again for our young people some behaviors that will keep them safe; behaviors like, if you’re not a gangster, don’t dress like one. If you want people to take you seriously, stop looking like a clown. If you want the respect of your teachers or your boss, show up dressed in such a way that says, “I’m here to listen and pay attention.” Sweatshirts and baggy gym shorts and pants falling off your hips don’t say that. Neither do tight blouses that show off your lingerie or jeans low and tight with “princess” written across the ass. What else does this say besides, “Look at my ass!”? Look at my ass, my breasts, my crotch. That’s what it says.

We can argue until the cows come home that it shouldn’t be this way. We can say, with every bit of good, civil libertarian, high-browed and educated liberal-mindedness, that such thinking is just wrong. We SHOULD be seen for who we are, not what we wear. And to that I will answer, “Yes, we should.” But we are not. And Trayvon is dead, not entirely, but in part, because he looked like a young man who might cause trouble. He fit a profile. And if I were his mom, as much as it’s against my core beliefs to tell him to dress differently, I’d rather have done that than attend his funeral.

 


If a door closes in the forest, does anybody hear?

March 3, 2012

I joke a lot about my life in the cubicle. I joke so that I don’t cry. Cubicle life is harmful. Literally. If I were to spend more time on this post, I’d document that statement with numerous references. Lacking such, if you do a brief search of the Web on the topics of developing and nurturing creative environments, being more efficient at work, or the troubles with 24-hour connection to work, to computers, to people… well, you’ll quickly find credible sources to back up my statement that spending 8+ hours each day in a 3-walled space sucks away your life, your soul, your entire being. Bit by bit.

But this aside, in my joking about the cube earlier this week, I had an “Ah Hah!” moment (actually it was more like a “Holy smokes!” moment) when I realized that in my life I do not have a single door. Not one. I have no door on my office at work, and while there are doors on my bedroom, bathroom, and other rooms in my home, my home, like my office space, is a space that I share with others. I share my workplace with my colleagues and I share my home with my partner and our pets. Shared spaces.

I do have space in my home that’s deemed “mine” – a room with my books, drafting table, desk, and other assorted tools for creativity – and recently Lynn and I began rearranging the guest bedroom so that it can be a place for the sewing machine, computer, musical instruments, and such. We have a wonderful home, overflowing with good space, literally and metaphorically. Still, within all of that good space, there is no space that is mine in the sense that I can go there, close the door behind me, and be alone with myself.

The same holds true at work. If I really need to have focused thought – to read an article or write a report or even just think thoughtfully about a problem or project – I have to leave my cubicle and find a quiet place, perhaps one of the study carrels upstairs, to do so. And even then, there is no door. These are not my spaces, but simply quiet public spaces.

Is this such a bad thing, this not having a room of one’s own? Virginia Woolf made the phrase popular in A Room of One’s Own, the book-length essay she drew from her lectures:

“Women, then, have not had a dog’s chance of writing poetry. That is why I have laid so much stress on money and a room of one’s own.”
 
But as I had my realization this past week, I challenged Ms. Woolf by saying, “People need a room of one’s own… with a door.” After all, isn’t it the door that makes the room? The door is what gives the privacy. It’s the door that gives the permission for the space to really become one’s own. It closes out the rest of the world so that we’re able to make the space ours alone and thus be ourselves alone. And fully. In the sense of fullness in solitude.
 
I’m grateful for all of the spaces in my life that are safe and conducive to good thought and creativity. It is most true that I loathe my cubicle, but I am thankful for the bulletin board and the whiteboard and the desk, all things that allow me to make the open space something of my own. Still, it is not a good space for work. There’s no denying that.
 
And I’m grateful for my car and for occasional trips that last longer than 10 minutes, trips that allow me to close the door on the world and be alone. It’s not the same as a room, but it’s close. I told someone recently that I do not talk on my phone in my car. This is not because it’s unsafe, nor because I rarely talk to anyone on the phone anywhere. Rather, as I followed up by saying, “My car is my space. It is my time alone. If I find the chance to be alone in my car, I don’t want any intrusions.”
 
A room of one’s own with a door. This is what I look for. I think it’s what we could all use. And I don’t think we’ve a dog’s chance of writing any poetry – or being truly healthy – without one.