
Around 1980, those of us coming up in literary studies learned that we could no longer refer to a work of art. The term had become obsolete. If you uttered it even in passing, you appeared behind the times, not up-to-date. You had to use another word: text.

Roland Barthes announced the change in a 1971 essay, “From Work to Text.” The work of literature is a distinct thing, he said, an object between the covers, stable and singular, determinate in its significance. The text, on the other hand, is fluid and unstable, the “movement of a discourse,” never settling into a firm statement, always inviting reinterpretation. The work is closed; the text is open. More precisely, the “work” results from the constriction of the text, the spurious closure of it in an act of authoritative interpretation. The “work of art” comes to be only when we delimit the ambiguities, ironies, metaphors, implications, connotations . . . in a word, the textuality of language.

In that era of High Theory, you had to make the textual turn if you wanted to be an academician. The authorities said so. Jacques Derrida spoke of “the domain and the interplay of signification ad infinitum.” Michel Foucault addressed “verbal clusters as discursive layers which fall outside the familiar categories of a book, a work, or an author.” To assume that you could fully “get” the meaning of “Tintern Abbey” or conclusively establish the truth of Gibbon’s history was naive or, worse, an act of suppression. The academic powers insisted that you underscore uncertainty and indeterminacy. If you didn’t, you were stuck in old-fashioned philology and hermeneutics, or realist epistemology and positivist inquiry.

Text was a handy way to show your entry into advanced thought—too handy. Anyone could stop referring to the meaning or truth of “Ode on a Grecian Urn” and instead discuss its “play of binary oppositions” and sprinkle some theory-truisms into the presentation (“there are no facts, only interpretations,” “there is no natural language”). Pretty quickly it sounded like jargon, not insight.

I was an enthusiast of deconstruction in those days, grinding my way through Nietzsche and Heidegger. The “problematization” of language was shattering, I thought, and text-talk sounded too casual and certain. In my dramatic grad student head, it was as if an epochal crisis in human understanding—that nothing in language anchors us—had shriveled into a verbal mannerism. My peers weren’t pioneers of critical thought. They were copycats.

Work-to-text was just the beginning. More substitutions followed as the ’80s and ’90s passed. Chairman had to go, and freshman, too. B.C. and A.D. gave way to B.C.E. and C.E. Civilization became culture (because culture is egalitarian and civilization has colonialist overtones). Hyphens in African-American and Irish-American were out. Literature of the Americas was favored over American Literature in some quarters, while all the journals required gender-neutral language in submissions. Speech codes were devised. Sometimes, things got painfully specific. I remember one fellow grad student, fresh from a feminist lecture, coming into the office where TAs were packed six to a room and explaining where the term rule of thumb came from: In the old days, a man was allowed to beat his wife, but with a stick no wider than the width of his thumb. So don’t use it anymore!

Educators took action, too, checking textbooks for offensive words. Diane Ravitch wrote a book in 2003 whose title said it all: The Language Police: How Pressure Groups Restrict What Students Learn. Government bureaucrats followed suit, crafting rules for what room-for-rent ads could say and job interviewers could ask. Libertarian law professor David Bernstein summed up this extension of politically correct diction in another well-titled book: You Can’t Say That!: The Growing Threat to Civil Liberties from Anti-Discrimination Laws. And what Silicon Valley has done with verbal etiquette is too dispiriting to mention. Sometimes it seems we’re all in graduate seminars now, forced to signal loyalty to the latest fads by the words we use.

Many politically inclined professors accused text-theorists of obscuring the actual politics of culture. They called the textualists “mandarin.” They preferred the later Foucault of sexuality and surveillance to the early Foucault of reason and knowledge. The French Freudian Jacques Lacan’s theories of the linguistic origins of the self didn’t interest them unless they could be enlisted in feminist critiques of masculine ego. To go from work to text didn’t alter the power structure and ideological climate at all, they complained, not if we stopped there. Derrida, for all his radical designs, spent his energies on Plato, Descartes, Kant, Hegel, Nietzsche, Husserl, and Heidegger—the Western canon, all white men.

But today’s monitors owe the old textualists more than they admit. People who demanded we drop Indians and go with Native Americans needed exactly what the textualists gave them: a staunch conviction about the decisiveness of words. Foucault’s most popular book was entitled Les mots et les choses (“Words and Things”), while Derrida’s was De la grammatologie (“On the Study of Writing Systems”). Barthes stated back in 1953, “Language is never innocent.” Heidegger had announced in 1950, “Language speaks. Man speaks in that he responds to language,” while Nietzsche wrote in 1873, “What, then, is truth? A mobile army of metaphors, metonyms, anthropomorphisms.” Two generations of students learned from them to pay scrupulous attention to the smallest verbal mannerisms, persuaded that different words create new realities.

That meant you had to take notice of tiny and trivial expressions. They have force and impact, which makes them dangerous. This is how “textuality” in literary studies in the ’70s fueled today’s oversensitivity and censorship. What text-thinkers did with their word-fixations may have impressed social justice professors as mere language games, but it helped them discredit “Sticks and stones.” Words can hurt, because they are real. As Toni Morrison asserted in her 1993 Nobel Prize speech, “Oppressive language does more than represent violence; it is violence.” Grievance over microaggressions may seem to abandon common sense, but not in an academic system that insists small words do big things.

Last spring I participated in a daylong conference at Harvard on civics and education. While most presenters understood civics as the teaching of citizen-like behaviors—voting, volunteering, activism—I stressed civic knowledge and the American past. I referred in my presentation to the “American patrimony,” the ideas, books, artworks, speeches, and events that make up the national heritage. When I finished, another speaker began and pointedly cited the “American matrimony.” It was just a passing remark, not an argument, really, only a bluntly upgraded label for the American past.

When I heard her say it, I blinked and thought, “You’ve got to be kidding.” Matrimony, not patrimony, as our cultural legacy: This correction I had never heard before. Afterward I checked dictionaries for any definition of matrimony that might fit the occasion, but Webster’s Third has but one alternative to the customary “union of man and woman . . . married state: married life,” which is “a card game played with a layout in which . . .” Webster’s Third defines patrimony as family inheritance, of course, but also national or cultural heritage.

Her coinage had no meaning. It didn’t denote an object; it erased a bad word. The referent was the same, the cultural past, but the proper name for it had to change. In the textualist universe, you see, where objects are veiled by nomenclature and we’re stuck in a “prison-house of language” (that’s the title of another High Theory book from 1972), when we switch our words we alter reality—to a better one, in the eyes of the reformers. Ordinary usage and dictionary definitions aren’t binding, not when we have a moral mandate to progress, and the countless thinkers and scholars who’ve spoken of cultural patrimony previously can be bowdlerized.

We bristle at this schoolmarmish revisionism because we sense a deeper goal than verbal etiquette. They’re out to undercut the reality and history we take as firm and true. It’s nominalism as political correctness. We trust in male-female difference; they obliterate it with pronouns. They pretend that saying “Happy Holidays!” instead of “Merry Christmas!” emends the historic existence of December 25. They don’t care that much about words. They want the world in their own image.

We can’t let the word monitors throw us off-balance. The society they envision is a treacherous one. At the Harvard panel, I made sure to say “patrimony” once more in the Q&A. It was my return to reality. And if the other respondent had mentioned “matrimony” again, I would have tossed “patrimony” right back.

From now on, I stick to him. I’ll never treat them as singular, either. Western civilization is fine with me, and I’ll write pluralism, but not diversity, not ever. And I have dropped text and returned to work, too, especially in my classes, because I have found that the very thing the textualists found suspect in the term, the faith in unique greatness produced by a genius, is precisely what many nineteen-year-olds, who have grown up with textual relativism pounded into their heads, want to hear from their mentors.

Mark Bauerlein is senior editor of First Things.

