Tuesday, February 12, 2013

Sylvia Plath: The Stigma of Writing and Mental Illness

February 11, 2013
You’ll Love Her! She’s Crazy!
Posted by Sarah Manguso

I’m told that one of my grandmothers suffered from what must have been postpartum depression. She was prescribed Miltowns in the forties, and hid an opiate addiction for more than fifty years. On the same branch of my family tree is an aunt who ended her life. Everyone who would know the details of either story is dead.
Many somber words have been intoned about the taboo surrounding mental illness, recently and notably by former congressman Patrick J. Kennedy this past January, soon after the shootings in Newtown. “If we’re going to get rid of the stigma—one of the great civil-rights challenges of our time—we need more discussion in the real world, and less shame by those suffering with mental illness, or the loved ones around them,” he wrote, in an essay published by The Daily Beast.
Until recently in human history, mental illness was indeed a stigma, discussed in whispers with the vocabulary of shame. To varying degrees, however, these whispers have always been accompanied confidently by the vocabulary of pride.
In her 1978 essay “Illness As Metaphor,” Susan Sontag wrote about the received ideas that surrounded tuberculosis in the nineteenth century, and cancer in the twentieth. The tubercular character was vaunted as “sensitive, creative, a being apart.” She added, “In the twentieth century, the repellent, harrowing disease that is made the index of a superior sensitivity, the vehicle of ‘spiritual’ feelings and ‘critical’ discontent, is insanity.”
Indeed, wherever I go in the twenty-first century, people are proudly mentally ill, and conversations about mental illness invoke the idea of specialness and the stereotypical mad genius. Contemporary scripted TV advertises the benefits of disordered thought, perception, and behavior, from the associative manias of the bipolar C.I.A. officer Carrie Mathison on “Homeland” to the precise memory of the phobic, obsessive-compulsive private detective on the eponymous “Monk.” Unusual brains are shown to correlate with creative intelligence and exceptional cognitive sensitivity. Stereotypes of shameful weakness come far behind, if at all.
Most educated people can name half a dozen poets who are more famous for their messy lives and deaths than for their poems. The short lives of Shelley and Byron comprised several suicidal lovers and a half-dozen unfortunate children, all adopted or dead by age five. Deaf, miserable Beethoven; van Gogh and his severed ear; Hemingway and his shotgun; Poe in his gutter; Woolf in her heavy raincoat. The narratives endure because they align with the popular understanding of what it is to be an artist.
* * *
Sylvia Plath, who died fifty years ago today, attended my high school, Gamaliel Bradford Senior High, in Wellesley, Massachusetts. She graduated in 1950, and when I graduated in 1992 she was still the most famous person ever to have gone there. Her long shadow remained, decades after her death, and the writing prize was named for her.
She’d sat in the back right-hand corner of Room 200, the room where Wilbury Crockett had taught his English courses. We all knew it. I often ate lunch by myself, in Sylvia’s seat, when the room was empty—not because it was her seat but because it was the seat furthest from the door. I never read her poems. I didn’t like the idea of poetry. I liked the idea of long books that were impossible to understand, and I read Pynchon’s novels laboriously, consulting multiple reference books as I inched down the dense pages. Plath had been dead longer than I’d been alive, but we didn’t count the years. She was ageless and occupied all history.
Mr. Crockett, a legendary teacher whose written comments on Plath’s poems allegedly first encouraged her to become a poet, retired when I was in kindergarten, but when he was seventy-eight he visited my eleventh-grade English class. Our English teacher had prepared us to receive his great wisdom. Most important of all, she reminded us that he had been the teacher of Sylvia Plath.
What never seemed strange to me until much later is that Plath’s poems weren’t taught to us in high school; only her suicide was taught to us. A lady, who had lived on Elmwood Road, across the street from my elementary school, had become a poet and become inconsolable and stuck her head in an oven. The books we were assigned to read for our English classes were tedious novels about boarding school and dated plays about the American Dream. Our frowsy English teacher who had invited Crockett to speak assigned each of us to read a different Dylan Thomas poem, and we each presented our poem to the class, and that was it for our education in poetry.
A minute into Crockett’s presentation, a straight-A student made a sound. Did he mutter something? Whatever Crockett thought he’d heard, it lit a fuse. We sat silent while the great man raged. In our shame we knew Crockett had chosen the wrong boy to castigate—he was humorless and inoffensive. That the boy would have insulted an honored visitor is unimaginable. Crockett screamed that we had rejected a great gift, and that we were worthless. Worthless! He strode out of the room. Two years later, he died, and our sparse little school library was named in his honor.
* * *
Despite having begun college determined to become a physician, I failed Chem 10 and, after a cascade of results, went to writing school instead. My first poetry collection was published modestly by a small press when I was twenty-seven. A few poems found their way into anthologies. I worked part-time as a copy editor and ate a lot of oatmeal.
After my book came out, my former college boyfriend said, “At least you can go nuts, now that you’ve become a real writer.” Like every recent college graduate I knew, bringing up the rear of Generation X, he yearned to check out and waste some serious time. Despite his classics degree he’d become a management consultant, though, and, as such, he simply couldn’t find his way into the seemingly exclusive and glamorous milieu of mental illness. Was he depressed? Perhaps, but he couldn’t conceive of it as a possibility—not because of the taboo but because he didn’t believe he’d fulfilled the prerequisites. Management consultants drank. They didn’t take antidepressants. They weren’t interesting enough to go nuts. Going nuts was a point of pride. You had to train for it.
One of my graduate-school colleagues used to boast about his antidepressant prescription. “I’m crazy!” he’d squeak at parties. A little depression? It probably was the most interesting thing about him. Fifteen years later, he publishes workmanlike best-sellers. Several of the poets with whom I went to school, clinging to modest functional abilities, are too mentally ill even to know they could be boasting about being mentally ill. You will never hear of them.
Shortly after I earned my degree, caught in a constellation of simultaneous disappointments, I found myself in a locked psychiatric ward. One of the social workers spoke excitedly about the therapist he wanted me to see after I was released: “You’ll love her! She’s crazy, just absolutely crazy!”
I remember responding to the social worker as coolly as I could while pushing down hard on a weeping rage: “I’m not sure we share the same tastes.” “What do you mean?” he asked in his best therapist’s voice, his little eyes open wide to indicate he cared. I tried to explain why standing around in a circle holding hands and talking about my feelings made me want to hang myself. Squinting, as if calling out from a high pulpit, he said, “Standing around in a circle holding hands is my favorite thing to do.”
Treating mental illness is an economic, and therefore practical, problem. But more fundamentally it is a problem of rhetoric and therefore also an abstract one. Before we can address it, we must speak about it, and the vocabulary we use is highly polarized. On one hand, the sufferer is responsible for getting over the shameful condition; on the other, the sufferer is a mad genius whose quirks and foibles demand respect. Seldom is mental illness just illness.
In order to develop workable policy serving those functionally impaired by mental illness, we need to learn to talk about it without recourse to the broad brushes of its existing metaphors. What if we could imagine a mentally ill person as neither a potentially violent simpleton nor a mad genius but simply a person with an illness that might be diagnosed, treated, even cured?
I expect that history might solve the problem all by itself, now that the very condition of illness has moved from a strictly medical milieu to a capitalist one. As far as the drug companies care, mental illnesses provide just another opportunity to sell pills to impressionable consumers. When I visit my psychiatrist, more often than not there sits a smartly suited young person with a full briefcase. Sometimes it is a man, sometimes a woman, but the suit is always navy blue. The person does not look tempted to sit upon the lap of the enormous stuffed bear I call Flat-Bear, who sits in the corner, against the wall, his lap increasingly grubby and compressed. The person enters and leaves the doctor’s office briskly, in a few minutes. On my bad days, I am sure I would buy whatever he is selling, and that psychotropic medications will become the twenty-first century’s bottled shampoo.
That the medical establishment is in league with the pharmaceutical companies seems inevitable and in fact has been widely observed. It seems dubious that the language of commerce could be a positive influence, but brisk business feels like progress beyond the language of myth.
And, even without the help of commerce, time wears away at myth and everything else. Plath’s suicide at thirty, after publishing just one volume of poems, invited the stereotype of the mad poetess, the wife betrayed; it was impossible to read the posthumous publications without considering the biography. But in the fifty years since her death the myth has dimmed; the work endures.
The woman is perfected.
Her dead
Body wears the smile of accomplishment.
Though the facts of her life won’t soon fade from historical memory, Plath is now, at least, more poet than suicide.
Sarah Manguso is the author of five books, most recently “The Guardians: An Elegy for a Friend,” which will be published in paperback next month.

"The Paradox of Quantum SpaceTime" for Emergent Theology Today

I read the Scientific American article below a few days ago while reviewing the topic of spacetime (Where Does Space and Time Come From?) and thought a complementary piece on (quantum) theology might well be in order too. Obviously, the naturalist point of view of Scientific American will be one of mundane experience, void of any biblical commentary on the divine element of creation, life, or death. But still, our purpose here is simply to try to conceive of Time in its origin, conservation, causality, connection, necessity, and quantum nonexistence within multi-dimensional spacetime. And once it is conceived, perhaps we may one day comment more ably on the corollaries to be found between quantum science and its sister philosophy, quantum theology.

Stated plainly, God's universe is as often paradoxical as it is incomprehensible, and we would do well to bear this in mind when attempting to understand this same God theologically. Too often we have simplified God with our plethora of philosophical and biblical statements, when we might do better to sever all ties to the church's outdated (and outmoded) mundane expressions of God and attempt instead to re-imagine this same God through the refracting lenses of postmodern science and theology. Add to this the errant idea of measuring any postmodern discoveries against the past judgments of pre-modern, medieval, ancient, or pre-civilized cosmological sentiments of bygone eras (grammatical, contextual hermeneutics notwithstanding) and you will see the point I am trying to make: our past may inform us of previous enlightenments, but it cannot exempt us from making our own, more relevant insights and discoveries for postmodern society today.

By attempting to re-describe our theology of God we might, perhaps, be more attuned to developing a theology in line with contemporary thought and expression: a theology that connects yesteryear's orthodoxies to today's poststructural language and idioms, thoughts and ideas; that reaches across the humanistic divides of the scientific disciplines to find the biblical God of the Old and New Testaments re-expressed in a twenty-first-century contextualized understanding; a theology more robust in philosophical and scientific thought and more compellingly re-expressed through comparative biblical terminology using contemporary postmodern paradigms. A few examples that come to mind are Deconstructionism, Open Theism, Relational-Process Thought, Existential Behavioralism and Cultural Anthropologies, Evolutionary Creationism, and so forth: a labyrinth of open-sourced theological study forming the literary backbone of a postmodern, Emerging (or Emergent) Theology.


But do not imagine for one moment that the church's biblical theologians can look away from today's scientific discoveries and not be moved to think new thoughts of God, previously unthought, differing from those of ages past, ages that did not have the science and learning available to them as we do today. This kind of cultural event is nothing new: since its earliest days the church has been influenced first by Hellenism, then by later medieval and Renaissance thought, and then again by the Age of Enlightenment and Modernism. Even so, today's postmodernism will have a similar effect upon the church's many outdated doctrines, dogmas, and conceptual understandings of God, the Bible, and even of ourselves.

This is not heresy but the natural progression of human endeavor and enterprise. Rather, the heresy would be to retain our older concepts of God as unmoved and unmovable instead of submitting to the living God of humanity, found through all time and expression in the ages of man: a God who urges the church to master the relevancy of its mission and message by moving into every age actively, dynamically, humanely, and hopefully, not negatively, fearfully, acrimoniously, or uncertainly in despair and frustration.

Even so does God Himself move forward apocalyptically, in contemporary lockstep with man's overeager, sometimes Babel-like societies, unfailing in their aspiration to overreach their potential by sinful means, war, and domination, and requiring some divine caution, some brake, upon this world's aggressive humanism and godless re-conception of itself. For we are graciously invited into God's creational temple-space not to replace Him but to enjoy His creation with Him, as exampled by the fall of Adam and Eve, who rejected this hallowed temple-space by wishing to be like God by their own will and admission. This was quite out of plan and out of sorts with God's overall objectives of sharing and fellowship, rest and peace; and it was to God's great wisdom that He did not let it proceed any further toward its ultimate, eternal destruction, removing the Tree of Life from humanity's reach.

Let us likewise not be so hasty to do this same thing. For we are mortal, though joined immortally with one another through time eternal. Let us show wisdom, then, in seeking God's design and blessing without usurpation, greed, envy, or ill will, using the Spirit's wisdom to update our old-line church doctrines and dogmas in re-evaluation of God's overall plans and design, for the contemporary relevancy of our mission and message.

It is God who plants within the restless hearts of men His laudable insights and searching knowledge toward the discovery of newer medicines, ecological practices, technologies, and societal improvements; who opens man's eyes to the beauty, the majesty, the paradox, and the mystery of His divine Being, of creation, and even of ourselves, spun in the web of moiling existence, turmoil, and dark meaning, each temperament lifted like a silken filament leading toward unexplored shores and novel frontiers rich in unfathomed insight, progress, and societal development. For this is the hope of the Kingdom of God, inaugurated into the life of humanity and its civilizations through Jesus' death and resurrection: an earthy kingdom with a heavenly design; a kingdom filled with life and light, not death and destruction; a kingdom God ushers us all toward as honored guests and adoptive children enjoying all the rights of privilege and position, pedigree and purchase.

Even so, it would behoove us as Emerging Christians to develop an emerging theology of God in a postmodern age of endeavor and challenge, accomplishment and despair; an emergent theology granting relevancy to the mission and message of Jesus our Savior, Redeemer, Sovereign Creator, wise Ruler, and gracious Lord, who seeks with us a country not of our own making but one filled with the wisdom and glory of the God of the Ages; who clothes Himself with the garments of Time and Space and strides across the Heavens in holy adoration, breathless wonder, and inscrutable counsel. To this God we bow our hearts in contrition and whisper, in the waning silences of the deepest voids of our hearts and of the quantum universe, Amen and Amen.

R.E. Slater
February 12, 2013

The Paradox of Time:
Why It Can't Stop, But Must
http://www.scientificamerican.com/article.cfm?id=could-time-end
By this thinking, time’s demise is no more paradoxical than the disintegration of any other complex system. One by one, time loses its features and passes through the twilight from existence to nonexistence.

The first to go might be its unidirectionality—its “arrow” pointing from past to future. Physicists have recognized since the mid-19th century that the arrow is a property not of time per se but of matter. Time is inherently bidirectional; the arrow we perceive is simply the natural degeneration of matter from order to chaos, a syndrome that anyone who lives with pets or young children will recognize. (The original orderliness might owe itself to the geometric principles that McInnes conjectured.) If this trend keeps up, the universe will approach a state of equilibrium, or “heat death,” in which it cannot possibly get any messier. Individual particles will continue to reshuffle themselves, but the universe as a whole will cease to change; any surviving clocks will jiggle in both directions, and the future will become indistinguishable from the past [see “The Cosmic Origins of Time’s Arrow,” by Sean M. Carroll; Scientific American, June 2008]. A few physicists have speculated that the arrow might reverse, so that the universe sets about tidying itself up, but for mortal creatures whose very existence depends on a forward arrow of time, such a reversal would mark an end to time as surely as heat death would.
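
To make the statistical character of that degeneration concrete, here is a minimal toy simulation in Python (my own illustrative sketch, not anything from the article): a box of particles starts in a maximally ordered state, and a reshuffling rule with no built-in direction of time still drives it, overwhelmingly, toward 50/50 equilibrium, after which the counts merely fluctuate and “past” and “future” look alike.

    import random

    N = 1000        # particles in a two-chamber box (illustrative numbers)
    STEPS = 20000   # random reshuffling events
    left = N        # maximally ordered start: every particle in the left chamber

    for step in range(STEPS + 1):
        if step % 5000 == 0:
            print(f"step {step:6d}: {left:4d} left | {N - left:4d} right")
        # Pick one particle uniformly at random and send it to a random chamber.
        # The rule itself is time-symmetric: it has no preferred direction.
        on_left = random.randrange(N) < left
        goes_left = random.random() < 0.5
        if on_left and not goes_left:
            left -= 1
        elif not on_left and goes_left:
            left += 1

Run it and the left-hand count falls from 1,000 toward roughly 500 and then merely wobbles; the apparent arrow belongs to the improbable starting arrangement, not to the update rule.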

Losing Track of Time

More recent research suggests that the arrow is not the only feature that time might lose as it suffers death by attrition. Another could be the concept of duration. Time as we know it comes in amounts: seconds, days, years. If it didn’t, we could tell that events occurred in chronological order but couldn’t tell how long they lasted. That scenario is what University of Oxford physicist Roger Penrose presents in a new book, Cycles of Time: An Extraordinary New View of the Universe.

Throughout his career, Penrose really seems to have had it in for time. He and University of Cambridge physicist Stephen Hawking showed in the 1960s that singularities do not arise only in special settings but should be everywhere. He has also argued that matter falling into a black hole has no afterlife and that time has no place in a truly fundamental theory of physics.

In his latest assault, Penrose begins with a basic observation about the very early universe. It was like a box of Legos that had just been dumped out on the floor and not yet assembled—a mishmash of quarks, electrons and other elementary particles. From them, structures such as atoms, molecules, stars and galaxies had to piece themselves together step by step. The first step was the creation of protons and neutrons, which consist of three quarks apiece and are about a femtometer (10⁻¹⁵ meter) across. They came together about 10 microseconds after the big bang (or big bounce, or whatever it was).

Before then, there were no structures at all—nothing was made up of pieces that were bound together. So there was nothing that could act as a clock. The oscillations of a clock rely on a well-defined reference such as the length of a pendulum, the distance between two mirrors or the size of atomic orbitals. No such reference yet existed. Clumps of particles might have come together temporarily, but they could not tell time, because they had no fixed size. Individual quarks and electrons could not serve as a reference, because they have no size, either. No matter how closely particle physicists zoom in on one, all they see is a point. The only sizelike attribute these particles have is their so-called Compton wavelength, which sets the scale of quantum effects and is inversely proportional to mass. And they lacked even this rudimentary scale prior to a time of about 10 picoseconds after the big bang, when the process that endowed them with mass had not yet occurred. “There’s no sort of clock,” Penrose says. “Things don’t know how to keep track of time.” Without anything capable of marking out regular time intervals, either an attosecond or a femtosecond could pass, and it made no difference to particles in the primordial soup.
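
To put a number on that last point (my gloss; the figures are textbook values, not the article’s): the Compton wavelength of a particle of mass \(m\) is

    \[ \lambda_C = \frac{h}{mc}, \]

which for the electron gives

    \[ \lambda_C = \frac{6.626\times10^{-34}\ \mathrm{J\,s}}{(9.109\times10^{-31}\ \mathrm{kg})(2.998\times10^{8}\ \mathrm{m/s})} \approx 2.43\times10^{-12}\ \mathrm{m}. \]

And \(\lambda_C \to \infty\) as \(m \to 0\): a massless particle sets no scale at all, which is why the primordial soup, before the mass-giving process, lacked even this rudimentary yardstick.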

Penrose proposes that this situation describes not only the distant past but also the distant future. Long after all the stars wink out, the universe will be a grim stew of black holes and loose particles; then even the black holes will decay away and leave only the particles. Most of those particles will be massless ones such as photons, and again clocks will become impossible to build. In alternative futures where the universe gets snuffed out by, say, a big crunch, clocks don’t fare too well, either.

You might suppose that duration will continue to make sense in the abstract, even if nothing could measure it. But researchers question whether a quantity that cannot be measured even in principle really exists. To them, the inability to build a clock is a sign that time itself has been stripped of one of its defining features. “If time is what is measured on a clock and there are no clocks, then there is no time,” says philosopher of physics Henrik Zinkernagel of the University of Granada in Spain, who also has studied the disappearance of time in the early universe.

Despite its elegance, Penrose’s scenario does have its weak points. Not all the particles in the far future will be massless; at least some electrons will survive, and you should be able to build a clock out of them. Penrose speculates that the electrons will somehow go on a diet and shed their mass, but he admits he is on shaky ground. “That’s one of the more uncomfortable things about this theory,” he says. Also, if the early universe had no sense of scale, how was it able to expand, thin out and cool down?

If Penrose is on to something, however, it has a remarkable implication. Although the densely packed early universe and ever emptying far future seem like polar opposites, they are equally bereft of clocks and other measures of scale. “The big bang is very similar to the remote future,” Penrose says. He boldly surmises that they are actually the same stage of a grand cosmic cycle. When time ends, it will loop back around to a new big bang. Penrose, a man who has spent his career arguing that singularities mark the end of time, may have found a way to keep it going. The slayer of time has become its savior.

Time Stands Still

Even if duration becomes meaningless and the femtoseconds and attoseconds blur into one another, time isn’t dead quite yet. It still dictates that events unfold in a sequence of cause and effect. In this respect, time is different from space, which places few restrictions on how objects may be arranged within it. Two events that are adjacent within time—when I type on my keyboard, letters appear on my screen—are inextricably linked. But two objects that are adjacent within space—a keyboard and a Post-It note—might have nothing to do with each other. Spatial relations simply do not have the same inevitability that temporal ones do.

But under certain conditions, time could lose even this basic ordering function and become just another dimension of space. The idea goes back to the 1980s, when Hawking and James Hartle of the University of California, Santa Barbara, sought to explain the big bang as the moment when time and space became differentiated. Three years ago Marc Mars of the University of Salamanca in Spain and José M. M. Senovilla and Raül Vera of the University of the Basque Country applied a similar idea not to time’s beginning but to its end.

They were inspired by string theory and its conjecture that our four-dimensional universe—three dimensions of space, one of time—might be a membrane, or simply a “brane,” floating in a higher-dimensional space like a leaf in the wind. We are trapped on the brane like a caterpillar clinging to the leaf. Ordinarily, we are free to roam around our 4-D prison. But if the brane is blown around fiercely enough, all we can do is hold on for dear life; we can no longer move. Specifically, we would have to go faster than the speed of light to make any headway moving along the brane, and we cannot do that. All processes involve some type of movement, so they all grind to a halt.

Seen from the outside, the timelines formed by successive moments in our lives do not end but merely get bent so that they are lines through space instead. The brane would still be 4-D, but all four dimensions would be space. Mars says that objects “are forced by the brane to move at speeds closer and closer to the speed of light, until eventually the trajectories tilt so much that they are in fact superluminal and there is no time. The key point is that they may be perfectly unaware that this is happening to them.”
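
The tilt Mars describes can be read off a textbook special-relativity identity (my illustration, not the team’s actual brane calculation). With \(c = 1\), the interval along a trajectory of speed \(v\) is

    \[ ds^2 = -dt^2 + dx^2 = -(1 - v^2)\,dt^2 . \]

For \(v < 1\) the interval is timelike, and a clock on board accumulates proper time \(d\tau = \sqrt{1 - v^2}\,dt\); for \(v > 1\) the sign flips and the interval is spacelike, so no proper time elapses along the curve at all. The worldline has become, literally, a line through space.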

Because all our clocks would slow down and stop, too, we would have no way to tell that time was morphing into space. All we would see is that objects such as galaxies seemed to be speeding up. Eerily, that is exactly what astronomers really do see and usually attribute to some unknown kind of “dark energy.” Could the acceleration instead be the swan song of time?

Your Time Is Up

By this late stage, it might appear that time has faded to nothingness. But a shadow of time still lingers. Even if you cannot define duration or causal relations, you can still label events by the time they occurred and lay them out on a timeline. Several groups of string theorists have recently made progress on how time might be stripped of this last remaining feature. Emil J. Martinec and Savdeep S. Sethi of the University of Chicago and Daniel Robbins of Texas A&M University, as well as Gary Horowitz of the University of California, Santa Barbara, Eva Silverstein of Stanford University and Albion Lawrence of Brandeis University, among others, have studied what happens to time at black hole singularities using one of the most powerful ideas of string theory, known as the holographic principle.

A hologram is a special type of image that evokes a sense of depth. Though flat, the hologram is patterned to make it look as though a solid object is floating in front of you in 3-D space. The holographic principle holds that our entire universe is like a holographic projection. A complex system of interacting quantum particles can evoke a sense of depth—that is to say, a spatial dimension that does not exist in the original system.

But the converse is not true. Not every image is a hologram; it must be patterned in just the right way. If you scratch a hologram, you spoil the illusion. Likewise, not every particle system gives rise to a universe like ours; the system must be patterned just so. If the system initially lacks the necessary regularities and then develops them, the spatial dimension pops into existence. If the system reverts to disorder, the dimension disappears whence it came.

Imagine, then, the collapse of a star to a black hole. The star looks 3-D to us but corresponds to a pattern in some 2-D particle system. As its gravity intensifies, the corresponding planar system jiggles with increasing fervor. When a singularity forms, order breaks down completely. The process is analogous to the melting of an ice cube: the water molecules go from a regular crystalline arrangement to the disordered jumble of a liquid. So the third dimension literally melts away.

As it goes, so does time. If you fall into a black hole, the time on your watch depends on your distance from the center of the hole, which is defined within the melting spatial dimension. As that dimension disintegrates, your watch starts to spin uncontrollably, and it becomes impossible to say that events occur at specific times or objects reside in specific places. “The conventional geometric notion of spacetime has ended,” Martinec says.
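
The watch’s dependence on position is ordinary gravitational time dilation. In Schwarzschild coordinates (a standard formula, quoted here only to make that dependence explicit), a clock held at radius \(r\) outside a hole of mass \(M\) runs at

    \[ d\tau = \sqrt{1 - \frac{r_s}{r}}\;dt, \qquad r_s = \frac{2GM}{c^2}, \]

so its rate is a function of \(r\). When the spatial dimension that defines \(r\) disintegrates, the formula loses its input, and the watch loses any well-defined rate along with it.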

What that means in practice is that space and time no longer give structure to the world. If you try to measure objects’ positions, you find that they appear to reside in more than one place. Spatial separation means nothing to them; they jump from one place to another without crossing the intervening distance. In fact, that is how the imprint of a hapless astronaut who passes the black hole’s point of no return, its event horizon, can get back out. “If space and time do not exist near a singularity, the event horizon is no longer well defined,” Horowitz says.

In other words, string theory does not just smear out the putative singularity, replacing the errant point with something more palatable while leaving the rest of the universe much the same. Instead it reveals a broader breakdown of the concepts of space and time, the effects of which persist far from the singularity itself. To be sure, the theory still requires a primal notion of time in the particle system. Scientists are still trying to develop a notion of dynamics that does not presuppose time at all. Until then, time clings stubbornly to life. It is so deeply engrained in physics that scientists have yet to imagine its final and total disappearance.

Science comprehends the incomprehensible by breaking it down, by showing that a daunting journey is nothing more than a succession of small steps. So it is with the end of time. And in thinking about time, we come to a better appreciation of our own place in the universe as mortal creatures. The features that time will progressively lose are prerequisites of our existence. We need time to be unidirectional for us to develop and evolve; we need a notion of duration and scale to be able to form complex structures; we need causal ordering for processes to be able to unfold; we need spatial separation so that our bodies can create a little pocket of order in the world. As these qualities melt away, so does our ability to survive. The end of time may be something we can imagine, but no one will ever experience it directly, any more than we can be conscious at the moment of our own death.

As our distant descendants approach time’s end, they will need to struggle for survival in an increasingly hostile universe, and their exertions will only hasten the inevitable. After all, we are not passive victims of time’s demise; we are perpetrators. As we live, we convert energy to waste heat and contribute to the degeneration of the universe. Time must die that we may live.

ABOUT THE AUTHOR(S)
George Musser is a staff editor for Scientific American.