Quotes & Sayings


We, and creation itself, actualize the possibilities of the God who sustains the world, towards becoming in the world in a fuller, deeper way. - R.E. Slater

There is urgency in coming to see the world as a web of interrelated processes of which we are integral parts, so that all of our choices and actions have [consequential effects upon] the world around us. - Process Metaphysician Alfred North Whitehead

Kurt Gödel's Incompleteness Theorem says that (i) all closed systems are unprovable within themselves and (ii) all open systems are rightly understood as incomplete. - R.E. Slater

The most true thing about you is what God has said to you in Christ, "You are My Beloved." - Tripp Fuller

The God among us is the God who refuses to be God without us, so great is God's Love. - Tripp Fuller

According to some Christian outlooks we were made for another world. Perhaps, rather, we were made for this world to recreate, reclaim, redeem, and renew unto God's future aspiration by the power of His Spirit. - R.E. Slater

Our eschatological ethos is to love. To stand with those who are oppressed. To stand against those who are oppressing. It is that simple. Love is our only calling and Christian Hope. - R.E. Slater

Secularization theory has been massively falsified. We don't live in an age of secularity. We live in an age of explosive, pervasive religiosity... an age of religious pluralism. - Peter L. Berger

Exploring the edge of life and faith in a post-everything world. - Todd Littleton

I don't need another reason to believe, your love is all around for me to see. – Anon

Thou art our need; and in giving us more of thyself thou givest us all. - Khalil Gibran, Prayer XXIII

Be careful what you pretend to be. You become what you pretend to be. - Kurt Vonnegut

Religious beliefs, far from being primary, are often shaped and adjusted by our social goals. - Jim Forest

We become who we are by what we believe and can justify. - R.E. Slater

People, even more than things, need to be restored, renewed, revived, reclaimed, and redeemed; never throw out anyone. – Anon

Certainly, God's love has made fools of us all. - R.E. Slater

An apocalyptic Christian faith doesn't wait for Jesus to come, but for Jesus to become in our midst. - R.E. Slater

Christian belief in God begins with the cross and resurrection of Jesus, not with rational apologetics. - Eberhard Jüngel, Jürgen Moltmann

Our knowledge of God is through the 'I-Thou' encounter, not in finding God at the end of a syllogism or argument. There is a grave danger in any Christian treatment of God as an object. The God of Jesus Christ and Scripture is irreducibly subject and never made as an object, a force, a power, or a principle that can be manipulated. - Emil Brunner

“Ehyeh Asher Ehyeh” means "I will be who I have yet to become." - God (Ex 3.14) or, conversely, “I AM who I AM Becoming.”

Our job is to love others without stopping to inquire whether or not they are worthy. - Thomas Merton

The church is God's world-changing social experiment of bringing unlikes and differents to the Eucharist/Communion table to share life with one another as a new kind of family. When this happens, we show to the world what love, justice, peace, reconciliation, and life together are designed by God to be. The church is God's show-and-tell for the world to see how God wants us to live as a blended, global, polypluralistic family united with one will, by one Lord, and baptized by one Spirit. – Anon

The cross that is planted at the heart of the history of the world cannot be uprooted. - Jacques Ellul

The Unity in whose loving presence the universe unfolds is inside each person as a call to welcome the stranger, protect animals and the earth, respect the dignity of each person, think new thoughts, and help bring about ecological civilizations. - John Cobb & Farhan A. Shah

If you board the wrong train it is of no use running along the corridors of the train in the other direction. - Dietrich Bonhoeffer

God's justice is restorative rather than punitive; His discipline is merciful rather than punishing; His power is made perfect in weakness; and His grace is sufficient for all. – Anon

Our little [biblical] systems have their day; they have their day and cease to be. They are but broken lights of Thee, and Thou, O God, art more than they. - Alfred Lord Tennyson

We can’t control God; God is uncontrollable. God can’t control us; God’s love is uncontrolling! - Thomas Jay Oord

Life in perspective but always in process... as we are relational beings in process to one another, so life events are in process in relation to each event... as God is to Self, is to world, is to us... like Father, like sons and daughters, like events... life in process yet always in perspective. - R.E. Slater

To promote societal transition to sustainable ways of living and a global society founded on a shared ethical framework which includes respect and care for the community of life, ecological integrity, universal human rights, respect for diversity, economic justice, democracy, and a culture of peace. - The Earth Charter Mission Statement

Christian humanism is the belief that human freedom, individual conscience, and unencumbered rational inquiry are compatible with the practice of Christianity or even intrinsic in its doctrine. It represents a philosophical union of Christian faith and classical humanist principles. - Scott Postma

It is never wise to have a self-appointed religious institution determine a nation's moral code. The opportunities for moral compromise and failure are high; the moral codes and creeds assuredly racist, discriminatory, or subjectively and religiously defined; and the pronouncement of inhumanitarian political objectives quite predictable. - R.E. Slater

God's love must both center and define the Christian faith and all religious or human faiths seeking human and ecological balance in worlds of subtraction, harm, tragedy, and evil. - R.E. Slater

In Whitehead’s process ontology, we can think of the experiential ground of reality as an eternal pulse whereby what is objectively public in one moment becomes subjectively prehended in the next, and whereby the subject that emerges from its feelings then perishes into public expression as an object (or “superject”) aiming for novelty. There is a rhythm of Being between object and subject, not an ontological division. This rhythm powers the creative growth of the universe from one occasion of experience to the next. This is the Whiteheadian mantra: “The many become one and are increased by one.” - Matthew Segall

Without Love there is no Truth. And True Truth is always Loving. There is no dichotomy between these terms but only seamless integration. This is the premier centering focus of a Processual Theology of Love. - R.E. Slater

-----

Note: Generally I do not respond to commentary. I may read the comments but wish to reserve my time to write (or write from the comments I read). Instead, I'd like to see our community help one another and, in the helping, encourage and exhort each of us towards Christian love in Christ Jesus our Lord and Savior. - re slater


Thursday, November 11, 2021

The Birth, Death, and Rebirth of Postmodernism





by 10 Contributors
June 14, 2019

What was Postmodernism? In the 35 years since Fredric Jameson’s New Left Review essay “Postmodernism, or the Cultural Logic of Late Capitalism” — and the 40 years since the publication of Jean-François Lyotard’s The Postmodern Condition — it’s fair to say that no other idea from the academic humanities has had so vast, if murky, an influence on the broader culture. (Often assumed to be a proponent of postmodernism, Jameson is rather its diagnostician.) [The TV Series] Seinfeld was said to be “postmodern,” and so was the architecture of Frank Gehry. So, too, was the “fashionable nonsense” targeted by Alan Sokal in his infamous 1996 Social Text hoax. As the recent “Sokal Squared” hoaxes showed, the specter of postmodernism continues to be a useful cudgel wielded against the university.

Debates about postmodernism have returned, in both vulgar and sophisticated forms — from Jordan Peterson’s crusade against “postmodern neo-Marxism” to Bruno Latour’s defense of climate science to Rita Felski’s probing of “the limits of critique.” And the accelerating media bombardment enabled by our proliferating devices has only intensified what Jameson, all those years ago, called the “transformation of the ‘real’ into so many pseudo-events.”

In that spirit, we asked 10 contributors to reflect on the continuing relevance — or irrelevance — of postmodernism to the academy and the larger culture.


Jean-François Lyotard | BRACHA L. ETTINGER

You’re So Paranoid, You Probably Think This Conspiracy Is About You
by Moira Weigel

Postmodernism has long been an object of conspiracies. But I had forgotten that, in the middle of his canonical essay “Postmodernism,” Fredric Jameson himself turns to the subject of conspiracy theorists. What’s more, he acknowledges a kinship between their methods and his own.

Such an acknowledgment might sound like precisely the confession that the self-described opponents of postmodernism have been waiting for. Since the Culture Wars of the late 1980s, these opponents have come from many and politically varied quarters, from bow-tied defenders of Great Books courses to class-first Marxists who argue that emphasis on discourse and representation has been distracting and destructive for left politics.

In the era of Donald Trump — and YouTube — the most fevered version of the case against postmodernism has become increasingly visible. That is, the claim that a coalition of critical theorists, poststructuralists, multiculturalists, feminists, queer theorists, and African-American and other “studies” professors have successfully conspired to take over educational institutions, the media, and the U.S. government, and even to establish a new International World Order. (Why those same masterminds so often prove unable to secure the minimal funding needed to keep their departments open, or university presses running, does not come up. Nor do the many intellectual and political differences that we have among ourselves.)

As the “cultural Marxism” conspiracy has migrated from Holocaust-denier conferences and terrorist manifestos to White House memos and the opinion pages of The New York Times, its exponents have continued to deploy a rhetoric of reaction that at once abjures and identifies with the targets of its criticism. The “studies” department politicized culture, so “we” must start a culture war. The Social Justice Warriors have made the transgression of gender norms so normal that “we” must transgress them. Joe Biden doesn’t like you either. Look what you made him do.

Now: The writers who expound the Cultural Marxism story have not necessarily read Jameson. The most popular accounts — books like Pat Buchanan’s Death of the West (2001) or Andrew Breitbart’s Righteous Indignation (2011) — end in the 1960s, with Herbert Marcuse. In 2017, when Jordan Peterson told an audience at the Manning Centre that “you need to understand postmodernism, because that’s what you’re up against,” he recommended that people buy Explaining Postmodernism, by Stephen Hicks.


But Jameson is one of relatively few thinkers who has ever actually described himself as a “cultural marxist.” And both the style of thinking that he practices and the “cultural logic” that he describes — his insistence that everything is historical, and therefore political — is what stands accused.

And not just by self-described white nationalists. Since the election of Donald Trump, even reputable cultural critics who identify as liberals and leftists have begun to say that they, too, blame “postmodernism.” The MLA has sponsored panels of earnest self-criticism on the responsibility that literary scholars bear for creating the conditions of “post-truth.”

That is, there has been a growing consensus that postmodernism destroyed a necessary capacity to draw distinctions, created epistemic conditions in which left and right are all the same. Rereading Jameson, I found conceptual resources for countering such claims. The sense that there is no outside, that everything connects to everything, is one that Jameson really does share with Glenn Beck. But at the heart of his essay on postmodernism, anticipating the charge that he is similar to conspiracists, Jameson articulates the differences between them.

The passage I mean comes at the end of the section on the “technological sublime.” Jameson has made his argument that, whereas Edmund Burke and Immanuel Kant conceptualized the sublime as an encounter with divinity or nature, by the 1980s those forces had been replaced by technology. It’s a concept prescient enough to seem banal, now. If the sublime is that which overwhelms the subject with the finitude of his own reasoning capacities, anyone who has ever attended a Silicon Valley Demo Day or called their health-insurance company to dispute the denial of a claim by “the computer” knows that Big Data does exactly this.

According to Jameson, a new genre of writing and filmmaking he calls “high-tech paranoia” captured this situation best. The narratives he had in mind were stories of conspiracy: “Yet conspiracy theory (and its garish narrative manifestations) must be seen as a degraded attempt — through the figuration of advanced technology — to think the impossible totality of the contemporary world system.” Degraded why? Because while it went through the motions of revealing truth — the shadowy agency behind the global plot, the men in smoky rooms who had made everything happen — ultimately conspiracy theory preserved the invisibility that it thematizes.

At the end of “Postmodernism,” Jameson proposes that the task of art and of criticism in his time is to do something that he calls global cognitive mapping: to create “a pedagogical political culture which seeks to endow the individual subject with some new heightened sense of its place in the global system.” Superficially, the work of cognitive mapping resembles conspiracy theorizing. Both abandon “depth reading” in favor of tracing connections. Both aver that there is no “outside.”

The key difference is that the first activity aims to demystify. The frantic activity of Glenn Beck at his chalkboard or Jordan Peterson tweeting about “cultural Marxists” does not ultimately enable the reader or viewer to recognize the forces that keep her in her place. Rather it exaggerates their incomprehensible sublimity. The Big Boss’s Boss stays offscreen — for the sequel.

This refusal to really engage with power is what makes the mainstream critics of postmodernism, cultural Marxism, and allegedly related movements — intersectionality, “safetyism,” etc. — so often sound conspiratorial themselves. While they point to many interrelated cultural and political changes, they provide no concrete account of how these changes happen. They have no theory of power. They like it that way.

It is common, for instance, for contemporary anti-postmodernists to claim that students with “illiberal” ideas have gotten their professors fired, or that a Twitter mob has “censored” a major magazine or book. Sure, student unrest can lead to the dismissal of a teacher, and social-media outrage can be a reason that a publishing conglomerate decides to halt the presses. But to say that the overheated feelings of a few teenagers can make professors destitute — or that snark broadcast to a few dozen followers can oppress a presidential candidate — is to skip over a few key steps.

We cannot actually understand the relationship between students’ embrace of certain ideas and the precarity of their teachers — just for instance — without mapping the academic and corporate bureaucracies, the government-backed lending institutions, that shape their lives. The widespread accusation that “theory” has produced our situation may be so popular precisely because it allows existing power structures to remain as they are. Pay no attention to that debt bureaucracy behind the curtain!

Whether you think exposing power this way is the only legitimate task of art or literature or criticism is another question. But to account for culture at the present will require descending from the online marketplace of ideas into the workshop of their production.

Otherwise, at the end of this thriller, the anti-theorist valued for his “viewpoint diversity” will fly to the next campus, give his next talk on how he is being silenced by teenagers, and collect his $30,000 — and the next installment of the Culture War franchise will remain just as predictable.

Moira Weigel is a postdoctoral fellow at Harvard University.


Tracing Fossils of the Old Beast
by Justin E.H. Smith

All things come to an end, not least the coming-to-an-end of things. And so it had to be with the end of modernism, and the couple of decades of reflection and debate on what was to come next. For me, postmodernism is the copy of Jean-François Lyotard’s The Postmodern Condition, which I bought in English translation in 1993. It’s sitting in a cardboard box, its pages slowly yellowing and its cover design receding into something recognizably vintage, in my old mother’s suburban California garage. I stowed it there when I moved to Paris, in 2013. And in the past six years I have seen only fossil traces of the old beast said to have roamed here in earlier times, eating up grand narratives and truth claims like they were nests full of unprotected eggs.

A few living fossils, coelacanth-like, survived from French philosophy’s âge d’or and could still fill lecture halls. But the survivors were mostly known for their non-representativity, in part because they loudly proclaimed it. Alain Badiou, for example, talked about the transcendental forms of love and beauty. Bruno Latour, not long after 2001, began regretting what his own brand of truth-wariness had done to stoke the “truther” conspiracy theories that had quickly spread to the villagers who worked his family’s vineyards, in Bourgogne. And for the most part, as Perry Anderson observed, by the early 21st century French philosophy had gone the way of French cinema: the way of nostalgia, safe formulas, and whatever special effects could be pulled off in the absence of adequate funding.

So much for France, then. In the Anglosphere, new technology was meanwhile ensuring that whatever had been said a few decades prior about simulacra and spectacle may as well have been said not about television, or even the early internet, but about telegraphs and semaphore. I, a clueless normie, first noticed the full extent of the epochal shift in 2015. Old institutions were crumbling — media, entertainment, education, democracy. Along with them, old ideas about what constitutes authority came to seem laughable. An American president was propelled into office through trolling. Mobs emerged to shame and ostracize anyone, no matter how distinguished or eminent, who did not agree with them. The internet, which should long ago have been transformed into a public utility, had instead been transformed into a weapon of war: not just a metaphorical “war of words,” but actual war, an asymmetrical vigilante war (with covert support from a recrudescing superpower) bent upon bringing old institutions to the ground. We are in the midst of this war as I write.


By the time of this epochal shift, academic postmodernists had become fully identified with their institutions. They might have entered into them in a spirit of play and subversion, but try telling the precariously employed and prospectless grad students that their professors’ health insurance and retirement plans are just so much play.

This shift has also hailed a stark return to old ideas about truth and falsehood, ideas that the postmodernists had been dismissing for my whole conscious life as unsophisticated and passé. No moment for me is more emblematic of this than Avital Ronell’s implosion in the wake of her harassment scandal last year. She thought she could pass it all off as “camp,” that is, an expression of free play unbound by simple rules of right and wrong. Slavoj Žižek and a few other coevals offered up variations on the importance of not rushing to judgment. After all, the heart is a dark forest and those of us on the outside of an emotionally charged affair are in a poor position to judge of its true nature, so be quiet and let my friend Avital keep using NYU office space as her deconstructionist romper room. The young people were not having it.

One of the preferred phrases in internet-based debate is “full stop.” For instance, “Trans women are women, full stop.” Conservatives such as Jordan Peterson, who continue in the habit of believing that the problem with the left is its “postmodernism,” have been so floored by what they take to be the disregard for scientific or transcendental truth in the first part of this sentence that they have completely failed to notice the utterly un-postmodernist spirit of the second part — the “full stop.” But the kids mean what they say, and in this they are a world away from the postmodernists, to whose spirit nothing could have been more contrary than the suggestion that there is a final word, a full stop, on anything. The future, at least the near future, belongs to them. Playtime is over.

Justin E.H. Smith is a professor of history and philosophy of science at the University of Paris Diderot and the author of Irrationality: A History of the Dark Side of Reason (Princeton University Press, 2019).


Jacques Derrida | ULF ANDERSEN/GETTY IMAGES

The End of the World
by Mark Greif

“Postmodernism” referred to three different things.

Postmodernism1 asked a question about art. The question was: Had the story of modern art ended? The answer seemed to be yes. “Modern art,” in this sense, meant a sequence of stepwise advances in style, begun in the 1860s, each novelty making its predecessors obsolete. The story of modernism in art had been told such that it could have an ending, once the possible new steps were reduced to minimal, repetitive, or chance gestures (in the fine arts), and no improvements in function could be conceived (as in architecture). Now, in the 1970s and 1980s, art ended before critics’ eyes. Minimal music. Rephotographed Marlboro Men. Decorative party hats on skyscrapers. Fredric Jameson recorded some of the changes in his great essay “Postmodernism” (1984), as Arthur Danto celebrated others in “The End of Art” (1984) and elsewhere. Artists kept making works, but knew they all came “after.”

Postmodernism2 asked questions about society. Considering economics, politics, and technology, it asked: Had a “modern” order, begun in the 1500s or the 1700s, ended in the rich nations? Industrial capitalism underwent deindustrialization. The democratic utopias — of bourgeois revolutions for rights and freedoms, and socialist revolutions for equality and plenty — seemed to be running out of steam. Information and computation might liquidate and concentrate all spheres of knowledge and value — or just multiply and clutter them. In different ways, these were the questions asked by Jean-François Lyotard, David Harvey, and Francis Fukuyama in the 1980s and 1990s. But the answer here to “postmodernity” seemed to be: No, modernity hadn’t ended after all; it had accelerated, metastasized, reorganized. On this view, we still dwell in the familiar “modernity” that launched with the Renaissance, scientific revolution, or industrialization, just more pessimistically.

Postmodernism3 was less coherent. It helped to name a conflict in philosophy. A strange struggle had occurred in English-language university departments after 1945, when philosophy and some social sciences lost touch with Europe and refashioned themselves as progressive and anti-historical imitators of the “hard” sciences. Postwar European social science and philosophy found alternate homes in literature and anthropology departments, as “theory.” Sometimes it was called, by its opponents, “postmodernism.” (Perhaps because “modernism” in thought could mean rigorism, scientific unification and purification. But philosophical “modernism” had also meant relativism, pragmatism, and phenomenology, and the objected-to portions of Foucault, Derrida, Lacan, or Bourdieu preceded them in modernists like Dewey, Boas, Wittgenstein, and Heidegger, not to mention Weber, Freud, and Marx.) During the period from 1970 to 2000, the disciplinary musical chairs upset, or inspired, all sorts of people, quite needlessly. The continuing use of “postmodernism” as a pejorative, today, is a know-nothing usage, and should decline in importance.


Thirty-five years after its best formulations, “postmodernism” has become an essentially historical term. It names forms of debate from the last fin de siècle. All the phenomena underlying the debates still live, however. The major critical statements are each in their own way quite great, and worth reading. Jameson’s contributions, especially, have held up best because of his indestructible Marxist commitment to overlaying the artistic and cultural (his métier) upon the social and economic, even where they mismatch.

My own position is that the explanation for the “postmodern,” and its role as a mysterious vortex for intense energies of the late 20th century, can only be understood within the larger proliferation of “posts” in the period, including also “posthistory” and the “posthuman.” The energizing feature of all these debate-containers, which also named lively fantasies (of intellectual messianism, of world destruction), belongs to that gap you can see between Postmodernism1 and Postmodernism2, a magnetic dynamo (for those who flopped between culture and political economy) repeated in many other intellectual locations.

Some forms of structured progress really did end, like modern art. Other forms, equally intellectual yet real, really didn’t end — like capitalism and socialism, even though the U.S.S.R. went kaput. Still others more unpredictably didn’t end — like the world itself, though it sometimes seemed it should have. The expected U.S.-U.S.S.R. nuclear war never happened, despite ample opportunities. Yet the world itself secretly embarked on another possible ending in unforeseen guise, and nuclear terror has been adapted for a slow climate catastrophe already underway. This persistence fits within the modern habit of thought that overvalues the new, feels the accelerated tempo of eventfulness, and can’t quite decide whether it should prefer to be in the last generation, or to abide. Whether, in other words, the pleasure of being the first to name, describe, or even help cause the end of the world and time would be worth the price of undergoing it.

Mark Greif is an associate professor of English at Stanford University.


Postmodernists Didn’t Go Far Enough
by Ethan Kleinberg

The Bonaventure Hotel still stands in downtown Los Angeles, 35 years after Fredric Jameson presented it as an exemplar of postmodernism. But walking through the Bonaventure today, one has the inescapable feeling of inhabiting a past promise of what the future was to be. One has a similar feeling re-reading Jameson’s essay “Postmodernism, or the Cultural Logic of Late Capitalism.” The building and the text each gesture to the hopes or fears of an inevitable future that never came to pass — though it is principally fear that has driven attacks on postmodernism, from the time of Jameson’s essay to now.

Jameson’s fear was that postmodern theory disabled Marxist critique and political action. More recent attacks from public intellectuals on the Marxist left conserve Jameson’s claim that postmodernism abandons the concept of “truth,” paralyzing the critic in negation and revolt. These are claims that I reject.

It is wrongheaded to assert that postmodernism abandons “truth.” While the postmodern position does hold truth to be socially constructed, it does not follow that truth does not exist. Instead, postmodernism seeks to understand why and how some truths are accepted while others are not. This is a deeply historical project which requires scholars to engage with the ways that epistemological commitments change at different times and in different places. Far from inciting paralysis, an understanding of historical contingency is liberating because it denaturalizes suppositions previously taken to be foundational or immutable. It opens up the possibility of change.

Surprisingly, aspects of Jameson’s critique have provided a template for critics of postmodernism from the political center and right. Jordan Peterson’s cartoonish characterization of postmodernism as “an infinite number of ways to interpret a finite set of phenomena” and his conclusion that under such a world view “no interpretation can be privileged above another” merely serve as a means of misdirection allowing him to privilege his own odious interpretations. Steven Pinker’s work has more academic credibility, but the family resemblance is visible, as when he asks an imagined postmodernist this rhetorical question: “If truth is just socially constructed, would you say that climate change is a myth?” But to assert that the truth is socially constructed is not tantamount to saying it is impossible to privilege any one truth above others. We can and we do. The potency of the postmodern approach lies in its ability to question why and how such privileging occurs.

The issue has never been that postmodernism has gone too far but that academics have yet to go far enough. Given the precarity of our current political, ecological, and epistemological climate, the time has come for academics to embrace postmodernism — to activate not only its potential for critique but also its power to create space for change.

Ethan Kleinberg is a professor of history and letters at Wesleyan University and editor-in-chief of History and Theory.


Judith Butler | MIQUEL TAVERNA, CENTRE DE CULTURA CONTEMPORÀNIA DE BARCELONA

Still Modernist, After All These Years
by Marjorie Perloff

Contradiction, disruption, dislocation, decentering: In the 1970s and ’80s, I was a confirmed believer in those defining attributes of what was known as postmodernism. From David Antin’s groundbreaking essay “Modernism and Postmodernism: Approaching the Present in American Poetry,” published in the first issue (1972) of boundary 2, to Jean-François Lyotard’s The Postmodern Condition (1979), to Fredric Jameson’s definitive “Postmodernism, or the Cultural Logic of Late Capitalism” (1984), with its elaboration of “the death of the author,” “the waning of affect,” the “new depthlessness,” the blank parody we call “pastiche,” and “the new spatial logic of the simulacrum” (the term simulacrum made famous by Jean Baudrillard), postmodernism was the order of the day.

Ihab Hassan’s elaborate charts, which pitted modernism against postmodernism, using such categories as urbanism versus the global village, elitism versus community and anarchy, and irony versus camp and the absurd, were widely cited as if their neat bifurcation were a fact of life in the second half of the 20th century. Derridean anti-essentialism demanded that we all scoff at the very possibility of transcendental value or external “truth.” The only difference was between those who thought the coming of postmodernism was a good thing — a liberating notion ushering in a new avant-garde — and those who saw it as the darker and inevitable logic of a late and brutal capitalism.

Jameson was of the second camp, I myself of the first. The artists I loved and championed — John Cage, Jasper Johns, Robert Smithson, Laurie Anderson, the Language poets — were doing things that seemed new and exciting; theirs were “breakthrough” performances (and indeed performance art was deemed to be a leading new practice), challenging the “old” modernist ethos, with its “elitist” canon of predominantly white European males. But whether one approved or disapproved of postmodernism, it was, well into the 1990s, considered a valuable period concept.

Then suddenly — or at least it seemed sudden to many of us — postmodernism was finished. The turning point came, I believe, with 9/11, although no one realized it at the time. The attack on the World Trade Center at the threshold of the new century, with its terrible death count, followed by the coming of ISIS and other political upheavals and culminating in the election of Donald Trump, in 2016, made it increasingly difficult to talk of simulacra and multiple truths, of the absurdity of master narratives and the refusal of all categorical imperatives. Derridean différance increasingly gave way to the concept of diversity, which, in practice, means the necessity for previously underrepresented communities to gain recognition. The notion of decentering now came to refer not to ambiguity or undecidability, but to a statistically based inclusiveness.


Indeed, in 2019 the pendulum seems to have swung as far away from “postmodernism” as possible. The sentence “Trump is a racist” is now regularly pronounced on CNN as if it were a simple fact, equivalent to “Trump is 6'3".” The assumption is that racist means a specific thing and Trump is definitely that thing. Or again, when people today refer to “social justice,” a term postmodernism would have been reluctant to use, they see no need to define the term. No simulacrum here: We all know what social justice would and should look like.

Accordingly, if postmodernism can now refer to anything, it can only be a historical period, from around 1960 to 2000. What, then, about our own moment? Surely there is no use calling ours the post-postmodern, especially since — and this is the real complication — even as postmodernism has gone away, modernism is still very much with us.

Indeed, the art and literature once thought to be “postmodern” (say, the theater of Beckett) now seems more properly “late modern.” And the work of the great modernists — Joyce and Pound, Mahler and Malevich — continues to be hotly debated and discussed, even as their postmodern successors, like William Burroughs and Günter Grass, have largely been eclipsed. It seems we are somehow still in the orbit of that great revolution known as modernism. Indeed, one of the seminal books Fredric Jameson has published in this century is a fat volume called The Modernist Papers (Verso, 2007). In the strangest way, Proust and Kafka, Gertrude Stein and Marcel Duchamp, remain unsurpassable.

Given the digital revolution, given the displacement and migration of people around the globe, given the increasingly urgent call for political transformation and for the end of racial and gender inequality, why is it that, at least in the arts, modernist norms have remained so hard to shake off? I can’t answer that question. But time will surely provide an epithet or two for our own epoch. And it won’t be postmodernism.

Marjorie Perloff is an emerita professor of English at Stanford University.


It’s Back, Baby
by Jessica Burstein

Fashion explains everything. Hemlines, hummingbirds, architecture, the structure of scientific revolutions, universities. And I’ve got a party to get to, so I’m going to make this fast.

First, you have to understand that logic is a mug’s game. It works and all that, and I like it that bridges don’t fall down, but the Enlightenment was one among a number of trends. Also, by the way, you are an irrational being — Michael Lewis tells us that you just need a narrative to swallow and you’ll think anything is reasonable. Freud said that too, but that’s another story.

Second point: Academia is not only a very good idea; it is populated by human beings. Most of us were not invited to the prom, so we’re especially pleased to be included in “critical waves.” A critical wave is something that two people said and then there was a special session at the MLA (“Whither Postmodernism?”) where a professor started crying and The Chronicle did a story on it.

Third: Academic trends are just that.

Fourth: The only reason you’re wearing denim this year (you idiot) is because you weren’t last year. Ditto hemlines: Up then means down now. Get it?

Fifth — and here I’m getting indigenous, but my Uber is still three minutes away — the 1990s were investment bankers telling you at cocktail parties it would be great to be a professor since they have summers off, and I was concentrating on not throwing up at my qualifying exams. Modernism was fascism, and then there was the time I cajoled Derrida into telling me what he thought of the new book on Paul de Man. I can’t write what he said, and not just because my French lexicon of obscenities is underpopulated. You (don’t) remember de Man — deconstruction, Yale, and the grad student who found de Man’s newspaper columns in the 1940s collaborationist paper Le Soir? The handsomest man in academia had co-edited a book called Responses — go look it/him up. Belgium was actually in the news. There was, if not a feeding frenzy, at least an animated chow-down in which lots of folks aligned deconstruction with the idea that words meant nothing, that these guys (not sic) said there was no meaning, and so that led to the breakdown of a moral center, and next stop gas chambers.


Sound familiar? It’s back, baby. And you look fab-u-lous dragged out in that vegan kidskin moral recrimination Mary Quant miniskirt, throwback Armani #MelanieGriffithheaddesk power-suit jacket, and neoliberal kidskin gloves. (By the way, are there any ladies aboard this little bus?) Anyhoo, the hair shirt comes in Gray Beard and Ivy-Hall, but either way it ain’t optional.

Jessica Burstein is an associate professor of English and gender, women, and sexuality at the University of Washington.


Jordan Peterson | CHRIS WILLIAMSON/GETTY IMAGES

Are Postmodernism and #MeToo Incompatible?
by Seo-Young Chu

There’s a so-called love scene in the 1982 film Blade Runner — usually considered a classic of postmodern art and a staple of the postmodern curriculum — through which I have often fast-forwarded or dissociated.

In this “love scene,” Rick Deckard (the film’s protagonist, a bounty hunter played by Harrison Ford) attempts to kiss Rachael (a “replicant,” or humanoid artifact played by Sean Young). At first Rachael rejects Deckard’s advances. She struggles against him physically. She tries frantically to leave his apartment. But when he commands, “Say, ‘Kiss me,’” she begins to relent: “Kiss me,” she echoes with reluctance. When Deckard goes on to instruct her, repeatedly, to say, “I want you,” she complies in a barely audible voice. Throughout this exchange, Rachael’s face is alive with fear while Deckard’s face is menacing. Meanwhile, the music — a slow yet vigorous and grotesquely tender diatonic saxophone melody that stands out in what is otherwise an explicitly eerie cyberpunk soundscape — announces that what we are witnessing is a triumphant romantic conquest.

I first saw Blade Runner as a naïve teenager in the 1990s. Even back then, I found the “love scene” disturbing, though I could not articulate exactly why. After my own encounter with sexual violence (I was raped and sexually harassed in 2000 by a man in a position of power), I continued to have trouble articulating why I would fast-forward or dissociate during the “love scene” while rewatching the movie.

The film’s spectacular cityscape still entranced me — despite or perhaps because of its Techno-Oriental aura. Thus I was able to teach and to some extent enjoy Blade Runner, telling students that it was one of my favorite films. It was, I thought, a postmodern cinematic masterpiece in which robot rights become trenchantly available for representation when the replicant Roy (played by Rutger Hauer) utters his dying soliloquy.

Recently, after #MeToo, I taught Blade Runner again. This time one of my students emailed me about the “love scene.” She was upset by it. Upon reading her email, I forced myself to watch the scene in its entirety while struggling not to dissociate. A word like “postmodern” would be obscenely irrelevant to a discussion of my reaction, which was visceral, raw, too real, too authentic, too present, the opposite of absent, the opposite of depthless. It was horrific. The coercion, the obvious unwillingness, the trauma — I don’t think I can teach Blade Runner ever again.

If postmodernism renders the replicant Rachael legible as a glossy simulacrum, then #MeToo renders her brutally legible as a victim of sexual violence.

I will conclude this piece the way I sometimes conclude my classroom lectures: with questions for discussion. If you disagree with my reaction to Blade Runner, how would you support your argument? In what ways, if any, is #MeToo postmodern?

Seo-Young Chu is an associate professor of English at the City University of New York’s Queens College.


The Best Lack All Conviction
by David Bromwich

Was there ever a more cheerless name for an art-historical movement? Yet the name fit the mood: Postmodernism wanted to be history before it wagered anything as art. “Neoclassicism” and “the Gothic Revival” went halfway to self-description by declaring that they aimed at a return. Whereas the very word postmodern spoke of exhaustion. It came after a something whose own name meant “of the mode; of the moment.” What could it mean to come after the mode, after the moment?

The idea that the postmodern bears a special relationship to high capitalism owed much to Robert Venturi, Denise Scott Brown, and Steven Izenour’s architectural primer Learning From Las Vegas (MIT Press, 1972). Their celebratory approach nonetheless signaled a radical intention, if you knew how to read the surfaces: “Learning from the existing landscape is a way of being revolutionary for an architect.” Of course it is also a way of being conventional. This bland evasiveness would become a characteristic manner for Pomo theorists.


The postmodern meant coming after and pointing to your location. Its key term of art was the word “reference,” transformed from a noun to a verb in sentences that would once have used “mention” or “allude to.” See how this design by Venturi references Gothic ornamentality. Or how Scorsese references the iconic taxi scene in On the Waterfront. For that matter, the Pomo word “iconic,” meaning generally known and known to be known, began its climb to pop currency as a synonym for “famous.” It imparted to fame itself a sacral overtone.

In universities, the new language carried its widest appeal in the visual arts, with some spillover in literary theory and a small allowed ration in political theory and philosophy. Another sample from Venturi, Scott Brown, and Izenour:

A choice between Perpendicular and Decorated for [19th-century] English churches reflected theological differences between the Oxford and Cambridge Movements. The hamburger-shaped hamburger stand is a current, more literal, attempt to express function via association but for commercial persuasion rather than theological refinement.

This blurring of lines between theology and commerce, art and commerce, high art and the existing landscape of motels and shopping malls, was a Pomo gesture; the authors preferred “roadside eclecticism” and “representational architecture along highways” to the hostile environment built up by modernists like Le Corbusier and Gropius. What was true in architecture could be extended as a democratizing move to literature, painting, music, film.

Ideas of individual talent or personal style were accordingly called into question or “decentered.” But the retrospective judgment of Jean-François Lyotard on his influential book The Postmodern Condition might be taken to qualify the importance of the result: “I made up stories, I referred to a quantity of books I’d never read, apparently it impressed people.”

Postmodernism is sometimes linked with relativism, a speculative theory no one has ever lived. You can’t believe you are right about the worth of an activity or the truth of a proposition and simultaneously believe a person who thinks the opposite is also right. Still, there may be an affinity here: The genuine disagrees with the fake, but Pomo theory called that binary (among others) into question. Postmodernism was a sophisticated stand-in for a lack of conviction, and it prospered most of all on the nonpolitical left of the 1980s and 1990s.

Fredric Jameson’s New Left Review essay “Postmodernism, or the Cultural Logic of Late Capitalism” was published five years before Francis Fukuyama’s National Interest essay “The End of History?” The politics of the two authors differed markedly, but the spectacle of hypnotic distraction, which Jameson described in ambiguous tones, was the cultural facade of the same weak-spirited luxury whose triumph Fukuyama celebrated after the fall of communism. In the academic mood of the arts today, it would be hard to find anyone with a good word for postmodernism. The pendulum has swung the other way. We are on the brink of a return of the social-realist doctrine of the 1930s. The arts are said to matter chiefly for their service to culture, and culture is understood to be a province of politics.

David Bromwich is a professor of English at Yale University.


Fredric Jameson | FRONTEIRAS DO PENSAMENTO

The End of History — For Real
by Anna Kornbluh

In the mid-1980s, when eminent theorists proclaimed “the golden age of criticism” in the university without recognizing the imminent economic restructuring of higher education, Fredric Jameson articulated anew the challenge of recognizing the present: Where are we in time? It was this question of historical periodicity — and not really aesthetic style — which he figured as “Postmodernism.” Although obviously a temporal question, Jameson also framed it as a spatial one: the problem of registering a “mutation” in global space — the globalization of capitalism — and subsequently registering how this mutation provokes new representational acts, both artistic and critical.

In the original essay, the paradigmatic medium of these new spatial relations is architecture. Tracking theoretically driven practitioners like Peter Eisenman, Frank Gehry, or Robert Venturi, Jameson reads their disjoining of style and context, building and surrounding, subject and object. Structures like the Bonaventure Hotel in Los Angeles evoke disorientation in the global environment: What is the space of the totalized economy, and what is the place from which it is mappable? Postmodernism may be an aesthetic of confusion and contradiction, but this is only because it often strives to become a synthetic projection of geopolitical coordinates. The task of the critic is to take up this striving.

Revisiting this periodization exercise with due respect upon its 35th anniversary requires asking how 2019 differs from 1984. The “mutation” in our burning present effects a different closure of the world than the one Jameson described then: the already arrived, unevenly distributed, ecocide. Which critics will interpret this world closure? A pitiful parallel to environmental wreckage is the wholesale destruction of the university, that precious environment where students work to exercise their critical faculty, and where faculty work to model it. Adjunctification of academic labor, withdrawal of state funding, booming administrative overhead, busting personal debt — and the pernicious apportioning of all these factors upon racialized and gendered constituents — all commenced in hideous earnest in the selfsame Bonaventure 1980s. Thirty-five years out from the golden age, there are, as this very platform repeatedly chronicles, only last critics standing.

If the critical task that Jameson so marvelously exemplified in the golden age was to eschew “moral judgements” in favor of “a genuinely dialectical attempt to think our present of time in history,” and ultimately to “think the cultural evolution of late capitalism dialectically, as catastrophe and progress all together,” then we few who remain in these worst of times in the ruined university must say that now this task has changed. What is the dialectic of too late capitalism, when it is too late to have altered the course of catastrophe, when no progress glimmers? Embrace the moral judgement: Too late is precisely when radical transformations are our duty. Remake the university, remake the infrastructures, produce other modes of production.

As for architecture, “of all the arts the closest constitutively to the economic,” in this epoch of double destruction it again furnishes some inducements for constructive criticism. Structures such as Sara Tamez’s Huiini House or LOT-EK’s Johannesburg apartment tower make inexpensive, sustainable, sometimes portable shelter from the byproducts of unsustainable commodity society, harkening resource de-intensifications and alternative scales of social space while also alluding to the imminent threats of civil violence and deluging storms. Today, rather than sublime systems subsuming everything through exploitative extraction, the monstrous ecosystem is purging the human from its swells. Among them, perhaps, some will have wished that they had dedicated their critical labors to building up the university, imagining projects for thriving, even after it was too late.

Anna Kornbluh is associate professor of English at the University of Illinois at Chicago and the author of the forthcoming The Order of Forms: Realism, Formalism, and Social Space (University of Chicago Press, 2019).


Beating a Dead Donkey
by Steven Weinberg

In 2001 I expressed the view that the arguments over science as a social construction seemed to be settling down. As one who enjoys arguing, I added that “perhaps it is still not too late to draw back from the brink of peace.”

Alas, it was too late. I may be just out of the loop, but it seems to me now that for scientists to argue against constructivism is beating a dead donkey. There is widespread skepticism about the judgments of science, on topics like climate change, but it has other sources — as far as I know, there are no social constructivists in the Trump administration.

Steven Weinberg is a professor of physics at the University of Texas at Austin. He shared the Nobel Prize in Physics in 1979.


* * * * * * * * *


A version of this article appeared in the June 21, 2019, issue.




Monday, March 30, 2020

The Postmodern John Cobb - What's to Deconstruct about Modernity?




The Postmodern John Cobb

deconstructive and constructive


What's to deconstruct about Modernity? 

Eight things, says John Cobb, in the essay below:
  • Christianism
  • Modern Metaphysics
  • Modern Science
  • Modern Nationalism
  • Economism
  • Modern Defense
  • American Exceptionalism
  • The University

Does anything have "value"?

The world of people, animals, and the earth is filled with value. Its value does not lie simply in the usefulness of things but in their interiority, their subjectivity, their reality in and for themselves as they interact with one another. Human beings, too, have this value, and it is supremely important. The value of the world is appreciated and enfolded within the life of Abba, and it has powerful implications for how we treat one another and the more-than-human world. See Whitehead's Theory of Value and Economic Justice and Process Philosophy.

Is there a social and philosophical alternative to Modernity?

The alternative is the emergence in our world of "communities of communities of communities," each of which embodies the spirit and practice of Ecological Civilization. These communities are creative, compassionate, participatory, inclusive, ecologically wise, and spiritually satisfying, with no one left behind. Their focus is not on money or on imperial aspirations, or even on individual happiness at the expense of community health, but on the well-being and flourishing of people, animals, and the earth. See Five Foundations for an Ecological Civilization and Ten Ideas for Saving the Planet.

Can a postmodern world include belief in God?

Yes. But it is important to keep in mind that the word "God" has different meanings, and not all people in such a world will find it a helpful word. A constructively postmodern world includes people who believe in God, people who do not, and the many who are somewhere in between. For those who believe, John Cobb recommends that God be conceived as Abba, not as an all-powerful ruler or cosmic moralist. God is more concerned with the well-being and flourishing of life than with being worshipped or flattered. See God as Abba: John Cobb's Proposal.


* * * * * * * * * * *




DECONSTRUCTING MODERNITY

by John B. Cobb, Jr.

Whitehead’s followers have long called themselves “postmodern.” When the French postmodernists defined postmodernism as “deconstruction” of the modern, Whiteheadians distinguished themselves as “constructive postmodernists.” We prefer to emphasize the positive. When we learned from China to call the world we work for “ecological civilization,” this further accented the positive. 

This has led us to mute our criticism of the modern. Our call for an organic worldview obviously implies criticism of the mechanistic one. Our call for multinational cooperation obviously implies criticism of nationalism and imperialism. Our call for orienting education toward ecological civilization obviously implies criticism of “value-free” universities. But the needed deconstruction has been muted.

This emphasis on the positive has made it easier for people to join us. But many who now talk about moving toward an ecological civilization retain features of modernity that in fact prevent them from moving very far. Too often, affirming an ecological civilization means little more than being ecologically sensitive. In fact, ecological civilization calls for profound changes and significant sacrifices.

This article joins the deconstructive postmodernists. It focuses on what must be changed and overcome. It opposes any effort to reassure and bring comfort. One may legitimately object to the harshness of its negations. They are not the whole story, but they are the part of the story that we “constructive postmodernists” have not done enough to highlight. So here goes with eight obstacles to an ecological civilization.


Christianism 

Not all Christianity is “Christianism.” There are people who seek to serve God and humanity by following Jesus and understand that affirming Christianity as the only way, or lashing out at its opponents, is damaging. But Christians have too often absolutized the church or the Bible or what they have understood by “God.” Christianism is a form of idolatry no better and no worse than the many other forms of idolatry that have informed so much of history. Charlemagne taught his soldiers that they would be rewarded for killing opponents of what he viewed as orthodox Christianity. This spirit fueled Crusades against Muslims and “heretics.”

Modernity has largely liberated society from Christianism. But not entirely. It is still an obstacle to be recognized and opposed. We can move toward ecological civilization only if the great moral and spiritual traditions of the world work together. The last century has witnessed great progress, but Christianism remains an obstacle. It is an obstacle to authentic response to Jesus, and just for that reason, to openness to humble sharing with others in working for the “divine commonwealth” about which Jesus testified.

The other great traditions have similar dangers and limitations. In this essay, I focus on the obstacles most important for Americans. Thus far, Christianism has been No. 1.


Modern Metaphysics 

By modern metaphysics I mean the metaphysical tradition that was initiated by Rene Descartes. It has taken many forms. Most philosophers today repudiate Descartes, yet his influence remains. Indeed, the first, and perhaps the greatest, obstacle to deconstructing the metaphysics that still rules the world is that most people, and especially most philosophers and scientists, deny that they have a metaphysics. When one’s metaphysics thoroughly shapes the way one perceives the world, it becomes nonconscious.

Most modern people suppose that our best source of knowledge of the external world is through sight. “Seeing is believing.” Of course, they know that that world is not simply the patches of colors that are attained by sight. Those colors are the colors of something. The “something” is a substance; that is, it is what the color inheres in. This substance is material in nature. It is material entities that science studies. Matter cannot be identified visually or by the data of any of the senses. It has no subjective reality. That is, it does not feel or have purposes. It is not an agent. When it moves, it is because something moves it. Thus nature, the world studied by science, is constituted by matter in motion.

Descartes taught that the only exception to this is the human soul. We know ourselves as feeling, purposing, acting beings. He thought the soul is not material at all. Charles Darwin confused moderns profoundly by showing that human beings are part of the nature that Descartes taught them to understand as matter in motion. Many moderns give lip service to this view, but in fact they do not think of themselves as simply machine-like. Today, many are recognizing that in addition to machine-like nature, including human bodies, there is also what Descartes called the human soul. Today we call it “consciousness.” 

A few philosophers have simply denied the existence of matter and held that everything is soul-like, psychic. There are others who say that we should stick to the appearances and not take any position on what is appearing. There are still others who don’t want to get into these issues and talk only about language. I cannot survey the history of modern metaphysical thought in a page. I am trying to identify what became in modern times a kind of common sense. I believe that has been the dualism of matter and mind introduced by Descartes. 

This dualism leaves open the question of where the line should be drawn. Even those who may verbally limit mind to the human soul are likely to actually feel that their pet dog is not adequately understood as matter in motion. It seems to have feelings and purposes. Descartes’ serious insistence that it has no feelings has never become part of modern common sense, but his teaching has nevertheless given license to vivisection of animals and to the industrial production of meat. Modern common sense has always been seriously confused.

The problem is not only confusion. It is also that this modern common sense justifies much that is unjustifiable and gives poor guidance in relation to much else. Instead of illustrating this now, I will take up the problems in subsequent sections.

Constructive postmodernism calls for the rejection of both pure matter and pure mind. It rejects the mechanistic model and calls for an organic one. Whitehead described his thought as a philosophy of organism. We constructive postmodernists prefer to attract people to thinking in these new ways, but it turns out that unless the hold of materialism and dualism is broken, vague feelings about organisms do not reshape thought.


Modern Science 

Modern science is the science that followed the guidance of Descartes. In the science of the high and late Middle Ages, the most influential philosopher had been Aristotle. He taught that to understand something we should pursue four questions: first, what is it made of; second, what is its form; third, what made it come into being; and fourth, to what end did it come into being? These are called the four explanations or causes: material, formal, efficient, and final. Scientists have always been interested in the first three causes, but under the influence of Aristotle, in the late Middle Ages they tended to focus on the final cause. For example, in their study of the human body they wanted to understand the role or function of the liver, the kidneys, the heart, and so forth.

Descartes was convinced that this function was only superficially explanatory. The scientific question was not whether the heart pumped blood but how it did so. That is the question about the efficient cause. What made the heart pump? Descartes insisted that nothing is explained by the final cause. Purposes play no role in nature. Descriptively, we may of course note that the circulatory system could not function without the pumping of the blood, but the task of science is not this description but an explanation of how and why the heart pumps.

To ensure that we do not attribute purposes to the heart, Descartes insisted that it functions like a machine. The most impressive Medieval machines were clocks; so scientists sought the “clockwork” that explained the behavior of things. We do not suppose that clocks have any experience or subjectivity; certainly, they have no feelings or purposes. They are matter in motion, and the task of science is to explain what makes the motions occur as they do.

Modern science was and is brilliantly successful. Again and again, it predicted what had not been thought to be explicable apart from the introduction of natural or divine purposes. Its success was so amazing that moderns put science on a pedestal. It was recognized that scientific knowledge had a definitiveness that had been totally lacking before. Of course, there was always more to study, but the assumption took hold that in time science could explain everything. When Darwin showed that we are part of nature, it seemed that the human soul or consciousness could also be explained.

This meant that in fact there is nothing but matter in motion. Ethics, values, morality, purposes, feelings, etc., etc. are fluff. They can be explained along with everything else as science advances. And science does advance. With its advance comes a vast improvement in technology and thus in the control of nature. Modernity is thus a vast improvement over what came before. That there has been progress can no longer be questioned.

This modern understanding of modern science has now become a major obstacle to progress, when we understand progress as improving the lot and security of the human species. The occasional recognition of this fact leads to asking whether the Cartesian view of nature as purely material is in fact needed by science, or even compatible with scientific findings. It seems not to work well, not only in explaining conscious experience but also in explaining the nature and behavior of the quanta of which supposed matter is composed.

Interestingly, it turns out that “matter” does not appear in the actual writings of science. The closest equivalent is “mass.” But not all the entities studied by science have mass. Few deny the existence of light, but light has no mass. Apparently, if mass is what we mean by matter, then matter is only one part of the natural world studied by physics.

Physics offers us a better candidate for universality. It is “energy”. Now for the most part “mass” and “energy” are convertible into each other. But we noted that light has no mass. Yet it has energy. Clearly the physical world consists of units of energy. One might think that this makes little metaphysical difference, but in fact the concept of “energy” is very different from the concept of “matter.” Energy cannot be pushed and pulled in the way we think of matter as being moved. Energy suggests agency, whereas matter requires some external act in order to change location or speed. 

Furthermore, it is not so difficult to think of human conscious experience also embodying energy. Just as evolution should lead us to suppose, the line between human experience and other parts of nature is no longer so sharp. We noted that materialist views of nature lead to either a dualism of matter and consciousness or a monism of matter in which no one can actually believe. Science supports the abandonment of the Cartesian view of nature.

Today’s science is showing us more and more about nature that does not fit with materialism. Information has become a central concept. Animals and even plants seem to behave intelligently and purposefully. Unicellular organisms respond to human emotions. Rejecting materialism and adopting organic models opens the door to including much in science that had been rejected on a priori grounds rather than because of evidence.

Indeed, our more open-minded study of what we used to call primitive people now reveals that on many counts they were wiser than we. In the West we slaughtered many women who practiced ancient medicine that involved psychological as well as physical elements. In fact, they were better healers than the modern doctors who were more “scientific.” We now routinely use placebos to give some recognition to the role of subjective feelings that modernity still excludes from having efficient causality. We find that “primitive” people can gain knowledge of the location of animals, for example, that we regard as impossible.

I am saying little that most readers will find improbable. But the dominance of modern thought in our culture keeps all of this at the fringes. The truth is that indigenous communities have beliefs and practices that are far superior to ours in terms of developing a sustainable society. At the fringes a few people are telling us this. But the dominance of modern thought blocks any significant cultural assimilation. 

We are taught that our knowledge and understanding are far superior to those of indigenous people. The truth is that we do know a great many facts about the universe that they did not know. We can develop many machines they did not have. We can reshape nature in ways they could not. But an equal truth is that they understood how to live in a sustainable relation to nature. They understood that human beings are part of a community of subjects rather than simply a collection of objects, so that our relations to others, both human and nonhuman, are subject to subject.

Have we progressed? Yes, in some respects. Have we regressed? Yes, in some respects. But to accept the latter as having any truth at all is to reject modernity. Such rejection is urgent.

Perhaps even more urgent is the rejection of the late modern belittling of all questions about better and worse. This is the natural result of materialism, but only when human life is included in the world of matter in motion. That inclusion arose only after Darwin, and even then it was strongly resisted. Immanuel Kant offered a new way of understanding dualism: theoretical and practical reason. But by the middle of the twentieth century modernists judged that facts alone are important, that the facts gained by theoretical reason could explain the judgments belonging to practical reason and show that they had no importance. Science is the arbiter of facts; so science alone is truly worthy of respect. 

Hence, the modern world in which we live teaches that it does not matter what people believe about the world and their role in it, that human commitment and dedication do not matter. And the same world has no hope of survival unless people are willing to sacrifice some of their priorities for the sake of more important ones. The unwillingness of modern people even to discuss such questions, and their continuing to “solve” problems by the activities that cause them, suggests to me that the modernity dominated by modern science may be the most stupid culture that has ever existed. It is a major obstacle to building an ecological civilization.




Modern Nationalism 

We all take a special interest in those others who accept us as part of their group. For hundreds of thousands of years our ancestors lived in tribes, and the members of those tribes identified themselves primarily in that way. Some other tribes were viewed as friendly, but others were threats. These relations could change over time, but one’s fundamental identity and loyalty did not.

With the rise of civilization, citizenship in cities took over as the primary identity and loyalty of many. There was often a close connection between ancestral tribal identity and citizenship in a city. Within a city there might be many people who were not citizens. Slavery was common and usually slaves were from other cities or from tribes that had not settled in cities. 

Some cities conquered others and established empires. However, most of the citizens in the conquered cities still identified themselves primarily by their cities, not in terms of the empire. On the other hand, Roman emperors worked hard to evoke loyalty to the empire and to themselves as representing the empire. When confronting threats from outside the empire, many citizens of cities other than Rome did identify themselves with the empire and its culture against barbarians. Most free people in the empire gave allegiance to Rome and its emperor without abandoning identification with the local city.

The people least willing to give full obedience to Caesar were the Jews. They thought that full loyalty could be given only to God. Although on the whole they were willing to recognize that Rome ruled politically, they distinguished religious devotion from political loyalty, and Rome did not want to allow such a distinction. Despite their weakness, they rebelled several times. To pacify them Rome gave them special privileges and exemptions but eventually drove many of them out of their homeland. 

The problem was aggravated when some Jews recognized Jesus as the Messiah, Christ, or liberator, because this Jewish sect spread rapidly in the Gentile population. It did not seek to overthrow the rulers of the empire, but like the Jews, it denied Caesar ultimate loyalty or “worship”. From that time on the relation of “religion” and politics, or church and state, has been a major issue in Western society. The Roman emperors from time to time tried to stamp out this new threat to their ultimate authority, but Christianity continued to grow. As the empire began to crumble, people in many regions began to look to the church for help in meeting practical needs. The church survived the collapse of the empire and for the first time what had been a voluntary organization became also the basis of self-identification of masses of people and the most powerful institution. For a thousand years most Europeans identified themselves primarily as Christians and secondarily in terms of ethnicity and location. It was generally thought that the political rulers derived their legitimacy from the church.

Of course, power attracts many people regardless of whether it is political or religious. Although the rhetoric of the church never claimed supreme loyalty for its ruler, practically speaking the church could be as intolerant of dissent and disloyalty as the preceding empire. Its wealth also attracted many. So, viewed from the perspective of original Christian teaching, many leaders of the church were corrupted by their enjoyment of wealth and earthly power. There were many protests, and eventually the protest of Luther gained powerful support from political leaders. The church split.

The habit of understanding oneself religiously remained prominent; so now many understood themselves to be Catholic or Protestant. But the political rulers had played a large role in determining whether the church in a given area would be Catholic or Protestant. After decades of fighting between Catholic and Protestant princes, they made peace with the decision that political rulers would decide the form of Christianity that would be practiced in their domains. Secular government became dominant.

This move toward regional self-determination was supported by Protestantism in another way. A strong Protestant principle was that all Christians should be able to read the Bible for themselves. However, this could not happen as long as it depended on learning a foreign language, namely, Latin. Few outside the priesthood could read it. Luther undertook to translate the Bible into German. Since the spoken language differed widely according to locale, he had to decide which form of German he would use. Because reading the Bible in German became extremely important, it created a homogeneous language that could unite people who had before spoken many diverse dialects. It also excluded others whose Germanic languages were too different for Luther’s translation to be acceptable, such as the Dutch, the Danes, and other Scandinavians. They needed their own translations.

In this way national feeling was greatly strengthened. If one read Luther’s Bible, one was a German. More and more European writings were in the vernacular, so that boundaries were established among readers. Linguistic boundaries tended to become national boundaries. National feeling became much stronger than when all European literature was in Latin. Modern nationalism was born. 

This modern nationalism meant not only separating Germans from people who spoke other languages but also a drive toward uniting the German people in one country. By the eighteenth century, nationalism had fully triumphed. One was no longer primarily a Christian or a Catholic or Protestant; one was German, or French, or Spanish, or Italian. Wars were fought between nations over issues of perceived national well-being. Control of distant regions in Africa and Asia was clearly for building national empires, and rhetoric about religion played a minor role at best.

Nationalism tended to strengthen German concern for other Germans. Of course, there was hierarchy and exploitation within the Germanic world. But there was little thought of some Germans enslaving others. There was also some respect for other Europeans. So, when Europeans set out to exploit the resources of the planet and colonize much of it, they did not bring with them from Europe the slave labor that would make that exploitation possible. In order to justify annihilating or enslaving the people inhabiting other parts of the world, nationalism had to be accompanied by racism. The inhabitants of Africa, South America, and North America did not belong to the same race as Europeans. Indeed, they could be regarded as not fully human, as being human was understood in Europe. Accordingly, the rights pertaining to European individuals did not protect them. Modern civilization has been the most racist the world has seen. Nowhere has racism played a larger role than in the United States.

European nationalism led to two world wars in the twentieth century. So, in its European homeland, the dominance of modern nationalism was brought to an end by the formation of the European Economic Community. It is hard to imagine conflicts between France and Germany plunging the world into war a third time. Sadly, this has not ended the danger of international war. On the one hand, national feeling continues to be a danger to peace and an obstacle to moving toward ecological civilization. On the other hand, it helps us to work against mutual antagonism among ethnic and religious groups and even races.


Economism 

What caused the European nations to seek a closer unity was certainly the desire to avoid further wars among themselves. But it is interesting that they formulated their unity in economic terms – the European Economic Community. The clearest symbol of their unity was a common currency. Giving up control over its own money was the greatest sacrifice of sovereignty on the part of the participating nations. We saw the consequences recently when the political party committed to the will of the Greek people was forced to yield to the European banks. 

Of course, the pursuit of individual wealth has played a large role in all societies, at least since the invention of money. When Jesus said one could not serve both God and wealth, this was not spoken against “economism” as a comprehensive system. That did not yet exist. During the period of Christianism there was much criticism of the clerical leadership, supposedly committed to poverty, for its luxurious lifestyle. During the modern period not only individuals but also nations have been devoted to the pursuit of wealth. This was a step toward “economism,” especially because it increased the power of banks in relation to nations. But nationalism gave way to economism only when national sovereignty was subordinated to the interests of the economic system. I have noted that this occurred, and was publicly announced, in the formation of the European Union.

Of course, this shift in power had been going on for some time. National rulers often needed to borrow money from bankers and this gave bankers considerable influence on policy. Few histories of Europe give adequate attention to the role of banks. 

The Cold War that took over soon after World War II was partly another war between nations. However, it was in fact, and was often recognized to be, a war between two economic systems: Capitalism and Communism. Nothing of that sort had ever occurred before.

The United States did not surrender sovereignty to any community of nations. Nevertheless, it may be a clearer instance of the shift from nationalism to economism than the European nations. It accepted leadership of the “Free World,” by which was meant the world that is free from Communism. To be in the service of Capitalism meant to subordinate national interests, as measured in nonmonetary ways or even in GNP per capita, to effectiveness in pursuing the goals of capitalists. As long as the capitalists in question lived and worked within the nation, one could find some congruence between their interests and the pursuit of national wealth characteristic of nationalism. But the interests of capitalism were to minimize national boundaries, and the great corporations became global in scope. This was especially true of financial institutions. For the United States to serve global corporations and the global banking system, often at the expense of the American people, has been a dramatic expression of the shift from nationalism to economism.

Americans who are unhappy with this development are told that this is a democracy so that if they want to return to nationalism they have only to elect representatives who will do so. But this is misleading because of another dimension of economism. In our society one must have a lot of money to mount a serious campaign for Congress or the presidency. A few people have been able to get elected without funding from the major corporations or banks or billionaires. But thus far, they have always been a small minority. Most elected officials are indebted to people of wealth and wealthy institutions. These also control the media and the educational system. Whereas democracy works well where people know those for whom they vote or at least other people who know them, representative democracy offers little resistance to the controllers of wealth. When we “democratize” other countries, they are likely to serve global capitalism rather than their own people. 

For a short time after World War II, American capitalism seemed to be dominated by industrial corporations. But fairly rapidly, real power shifted from the industrial world to the financial institutions. Of course, much of the time, the interests of industry and finance largely coincide. Both aim to reduce the power of national governments to restrict business and the flow of capital. But in the “free world,” private finance controls the money supply and has a more direct power over politics than does industry. 

In the long run financiers would benefit from intact ecosystems and a world with plenty of topsoil and oceans that support lots of fish. But they have been schooled by economists who focus only on increasing market activity. Even continuing the present level of economic activity is extremely likely to make the planet uninhabitable. Its general increase speeds the coming of utter catastrophe. Nothing is more important than to end the actual reign of economism, and that will not happen unless its domination of popular thinking, as well as of scholarly theory, is ended.


Modern Defense 

One adage that is used to cover a great many absurdities is that the best defense is a skillful offense. All individuals and all nations are justified in defending themselves against the attacks of others. What is allowable or desirable is a matter of dispute. There are those who call for only nonviolent defense. This may mean that an individual accepts serious injury or death rather than injure or kill another person. But there have been instances when skillful use of nonviolent defense has accomplished a great deal more than the use of violence against a far more powerful adversary.

We Americans, however, can safely assume that our Department of Defense is not discussing such matters. Indeed, judging by the official story of what happened on 9/11, it seems not to devote much time and effort to protecting the buildings and lives of Americans, violently or otherwise. All evidence points to a primary interest in extending military and political control over the entire planet, what is called “full spectrum dominance.” In short, we act as if the only, or at least the best, way to “defend” ourselves is to attack and control everyone who does not serve us, which means, as explained above, everyone who does not serve the Western financial system.

In order for Americans to be willing to be heavily taxed to support this imperialist enterprise, they must be persuaded that it is indeed financed for the sake of the security of American lives and property. That is what the word “defense” is ordinarily supposed to mean. Accordingly, almost any sum of money can be demanded for “defense” with little or no opposition in Congress. To be accused of being soft on defense is assumed to be the kiss of death, politically speaking. Few things are more important for reducing our destruction of the life system on the planet than redirecting government expenditures from global imperialism to the well-being of people, especially, of course, the American people, and of the natural world. Understanding that our expenditures for “defense” actually make our lives and well-being more precarious would be a first step toward the exercise of common sense.

The deceptive use of “defense” goes farther. Thus far it has prevented auditing the Department of Defense. It is highly probable that large sums are siphoned off to enrich the rich, and that defense contracts are not entirely directed to serving military needs efficiently. In short, powerful people have much to conceal.

Suspicion is heightened by a remarkable feature of the 9/11 attack on the Pentagon. The part of the Pentagon at which it was directed was the section where defense records were kept. Congress had finally decided on an audit, which then became impossible.

I have focused on the Department of Defense. The situation is similar, perhaps worse, with the FBI and the CIA. These two are supposed to be defending us. They, together with the Congress that funds them so generously, think that we are enemies of our own defense and so must be carefully reined in. The freedoms of which we brag are being taken from us in the name of security.

If half of the resources now provided to the “security establishment” were spent on working for peace, justice, and prosperity along with improving global ecosystems, there might be a chance for the healthy survival of civilization. But currently there is no discussion of any reduction in what we spend on “defense” and “security” or any assessment of its actual achievements. “Defense” and “security” sound great. So, we throw more money at them, guaranteeing widespread loss and suffering.


American Exceptionalism 

Earlier I discussed nationalism, and what I said there applies to the American case. However, because of the extreme importance to the planet as a whole of how Americans think of themselves, I am returning to this topic. Much of what I say about American exceptionalism can be paralleled in other countries. It is natural to evaluate everything by the standards of your own culture, and when you do that you are likely to find that your culture excels, that in ways important to you it is unique, that is, an exception.

I grew up in Japan, and there is no question that there are unique features, very attractive features, in Japanese culture. Further, the Japanese view of themselves tended to divinize their emperor, thus reinforcing their sense of uniqueness. Over the millennia no foreign army had invaded Japan. Japanese exceptionalism led to the belief that the lot of anyone ruled by the emperor was superior, so that Japan’s conquest of Korea and Manchuria, and the great expansion of its empire in East Asia, could be seen as benefiting the conquered. It also fed the belief that the Japanese military was invincible. The willingness of the Japanese to die for the emperor would enable them to defeat any enemy. In the end, of course, they were defeated despite their remarkable attitude and commitment.

Spiritually and culturally, the adjustment to the reality of being conquered was painful and difficult. I am sure that the sense of their own uniqueness has not disappeared, but whatever is left of it takes less dangerous forms. Indeed, the Japanese have much to be proud of, and I think pride in positive accomplishments is healthy. Part of Japan’s current uniqueness is the intensity with which it supports peace.

Unfortunately, American exceptionalism is more like Japanese exceptionalism before World War II than like current Japanese exceptionalism. Americans tend to think that our goals are beneficial to those we control. They think of the United States as committed to democracy and human rights and as promoting these all around the world. 

As with the self-image of many countries, there is some truth in this. The American Declaration of Independence and the American Constitution, completed by the first ten amendments, inspire people all over the world to aim at democracy and human rights. Our success in liberating ourselves from British rule with these goals led to emulation elsewhere. We were not alone in believing ourselves to be in the lead in these matters.

Our view that when we made the decisions for other people they benefited also had some historical basis. The American occupation of Japan under MacArthur’s control was excellent for the Japanese people in many respects. He broke up the conglomerates that had excessive political and economic power. He broke up large farms and gave ownership to those who had worked them. He got the Japanese to adopt a pacifist constitution. He helped to humanize the emperor without destroying the imperial system. Human rights were emphasized along with democratic governance.

The United States has not always treated immigrants well. Nevertheless, it has done a remarkable job of taking people of many nationalities and creating a unified nation. Religious freedom and cultural diversity are allowed without fragmenting the body politic. Despite the recent losses in the name of security, I write critically about my country without fear of punishment. There is much about the United States of which we can be proud, some of it truly exceptional. 

Our problem is much like that of Japan before World War II. We are so sure of our virtue and of the benefits we bestow on others that we are blind to much that is happening. We spend more on our military than the rest of the nations combined, and so consider ourselves invincible, and do not notice that we are already losing ground. Even though we know that we have troops all over the world, we do not consider ourselves an imperial power.

The history of our country that we learn in school is essentially celebratory, so we simply do not notice the dark side of our history and of our current policies. If any other country acted as we act, we would consider it inexcusable and see to it that it was severely sanctioned. Consider, for example, how we would react if Russia engaged in drone warfare anywhere. But because it is we who are doing it, and we know that our motives are pure and our actions for the sake of people everywhere, we support our own practice. We are not told how the countries where we kill people in this way feel about it and, derivatively, about us.

Nations that do not recognize the evil they do in the eyes of the world are not making themselves more secure. We need to free ourselves from erroneous or one-sided understandings of our history and current actions. Only a people who know who they really have been and are can be trusted to lead the world wisely.

So, who are we? What happened in our history that we ignore? This is a large topic. I will mention only a few points. First, from the arrival of the first settlers until today we have been in constant imperial expansion. One of the crimes of which we accused the British, justifying our revolt, was the effort to protect the indigenous people from our genocidal advance. Once we rid ourselves of British rule that advance continued and has not ceased even today. 

Second, our country’s economy was built to a large extent on slave labor. Of course, everyone knows that slavery is a blot on our record, but our emphasis in recalling our history is that we freed the slaves. The freed slaves and their continued exploitation tend to drop out of history until Martin Luther King Jr. again forced our attention.

Third, a major part of our foreign policy dealt with Latin America. While we celebrate the extension of our great nation from sea to shining sea, we pay little attention not only to the genocide of indigenous people but also to the theft of land from Mexico. Meanwhile our “Monroe Doctrine” may have helped some Latin American countries gain independence from Europe, but it sucked them into our orbit of economic exploitation. General Smedley Darlington Butler, the most decorated Marine of his day, after frequent battles in Latin America finally understood that all his killing and leading U.S. soldiers to death was for the sake of U.S. corporate profits. He wrote War Is a Racket and devoted the rest of his life to sharing this understanding with the American people, but the truth he exposed has not found its way into our collective consciousness.

In Latin America, we frequently destroyed democratic governments in favor of military regimes that took orders from us. This continues to happen today. In my youth I rejoiced that I was not a citizen of one of those bad European colonial powers. The truth, of course, is that our overall record is one of the worst. It provides no basis at all for others to trust our intentions. When we realize that our military operations today are also in the service of corporations, now especially international financial corporations, we have every reason to withdraw support from established U.S. policy. It is time to recognize that we are one nation among others, with our strengths and weaknesses, but with no mandate to rule the world.


The University 

The discussion of American exceptionalism suggests that a standard American education gives a dangerously one-sided picture of American history. We understand that this is almost inevitable. A major function of schooling has been to turn immigrants from many nations and cultures into American citizens. All history writing is selective. To write a textbook on American history for such a purpose will always lead to selections favorable to American self-appreciation. But we suppose that the leaders of the future will go on to college and there acquire more accurate knowledge.

As with so many widely held suppositions, there is some truth to this. In a college or university history department there will be a number of courses going into great detail on some segment of American history. If one majors in American history and takes many of these courses, one will certainly understand that the popular view is seriously mistaken. But it is unlikely that the department will help much in developing an alternative overview. In contemporary universities, the goal is not to develop comprehensive views of what has occurred and its meaning for orienting us today. It is to acquire accurate factual information about particular events. This tends to leave the historical basis for American exceptionalism little changed, even for many specialists. And, of course, the number of students who specialize in American history is very small.

I begin with this as an example of how American universities do not fulfill the expectations that many outsiders hold for them. Since these expectations concern important human functions, until we recognize that universities are about something else, we will leave these functions unfulfilled. There are some things that modern universities do well. They should be congratulated and appreciated. But meeting the world’s greatest needs is not among them. Either universities must change, or we must find other ways of educating, or we have little chance of “saving the world.” This is recognized in the title of a book addressed to university professors by a famous educator, Stanley Fish: Save the World on Your Own Time.

He makes it clear that universities hire professors to do research on limited topics and pass the information gained and the research methods on to their students.

Our greatest universities call themselves “value-free research universities.” Being free of values means in part being free of prejudices, open to the evidence. But it also means that values are considered unimportant, and this judgment of unimportance is transmitted to students. This makes the “modern university,” one that became normative only after World War II, unique in history. “Saving the world” is a value. I think it is a very important value, one that should inform our whole educational system. But discussion of this possibility is excluded from the modern university. Given the many possible topics to be researched, the decision among them is not made on any judgment of importance for the human species. Often, in fact, it is made because funding is available for a particular topic. When one brings no other values to the table, in a time of economism, money is likely to determine what is done.

The incentive for attending a university today is usually the expectation of improving job prospects, judged by salaries, that is, money. Although the intention of the university is to serve the rapid increase of known facts, which facts are made known, and even how they are formulated, tends to be determined by money. Unfortunately, the research for which money is available is more likely to serve the profits of corporations than the sustainability of human life.

In the second and third sections of this paper I talked about the rigidities of the academy in relation to metaphysics or worldview. I claimed that the evidence gained by university research calls for revision of the metaphysics it assumes. I repeat that charge. Universities were once places where questions about assumptions could be asked. They were, in other words, places supportive of intellectual activity. Today they are not. If intellectual reflection were encouraged, I feel quite sure that the commitment to a seventeenth-century philosophy would give way. That could open the door to an interest in wisdom, that is, helpful guidance with regard to the pressing issues of personal, social, and national life. I am confident that the current threats to human life would be recognized and that our educational system would be reconceived to help us rather than to block interest in the most important questions.