
Western civilization exerts unprecedented influence. Science commands the intellectual loyalty of elites around the world. Western strands of Christianity have enjoyed extraordinary missionary success in Africa and Asia. Communism—a Western ideology—migrated to China, destroyed its Confucian culture, and over the past generation has evolved into a materialist and technocratic mentality far more indebted to Bentham than to Mencius. Globalization is an American-led project, and it has gone from strength to strength. Yet, at the apogee of the West’s cultural power, we are riddled with guilt and doubts.

This paradox of triumph and doom has preoccupied me for more than a decade. After the attacks of September 11, 2001, many friends insisted that Islamic terrorism posed an existential threat to the West. The term “Islamo-fascism,” popularized by Norman Podhoretz, conjured an image of Osama bin Laden invested with the power of 1930s Germany, the most dynamic European nation of that period. And it suggested that radical Islam possessed the potency of modern nationalism, which from Napoleon to Hitler was the world’s most transformative political ideology. I remember challenging Midge Decter soon after 9/11. I insisted that the greatest threat to the West was . . . the West. After a few seconds of silence, she agreed.

Now, twenty years on, the Islamic world is more Westernized, not less so. Western-inspired feminism slowly grinds down resistance, even in Saudi Arabia. The diffusion of modern science allows Iran to develop nuclear weapons. The U.S. Army has spun its wheels in Iraq and Afghanistan at great cost in blood and treasure, but the inflow of Western pornography meets no resistance. In the Middle East, as elsewhere, the West has shown itself supremely effective in disrupting inherited ways of life and dissolving traditional loyalties.

China poses a threat, but as was the case with early-twentieth-century Japan, which brought war to Asia in the 1930s and to the United States in 1941, Chinese power today arises from the great extent of its Westernization. The clearest sign of cultural triumph is when one’s adversaries adopt one’s weapons. My fear of Chinese dominance is not a fear of being under the thumb of an alien Confucian civilization. Rather, I see in twenty-first-century China a perverse version of the West: the promise of wealth and security in exchange for submission to technocratic control.

What, then, accounts for our gloom? To some degree, it arises from our investments in failed utopias. As the Christian vision of the final triumph of God’s justice and peace was secularized in the modern era, the West gave birth to many vain dreams of reason. Descartes epitomized the “begin afresh” mentality, one that imagined that with sufficient intelligence and goodwill we could demolish the flawed city we had inherited and rebuild it as an impeccable empire of reason. In this spirit, we proclaim that democracies don’t go to war with one another. Or that global trade brings peace. Or that scientific management can eliminate inequality, poverty, and every other social evil.

For a recent football game, the University of Minnesota replaced the players’ names on their jerseys with the words “End Racism.” It was but an instance of today’s desperate ambition. But the modern Western spirit of “eliminationism” is countermanded by reality. Man combines the noble with the base, the grandeur of transcendence with grave defects that can never be eliminated—short of eliminating mankind, which is what environmentalist radicals and population-control fanatics propose. Nobody was more pessimistic about postwar prosperity in the West than the ardent Marxist. He saw the spread of prosperity as a curse, not a modest blessing. Its meager satisfactions induce false consciousness and delay the revolution that will eliminate all injustices—the only aim worth attaining.

The West is not Marxist, but all of us are affected by the secular utopianism to which Marx gave powerful expression. Why hasn’t the “spirit of progress” triumphed? Even if things get better, utopianism exercises such a powerful influence over our public imagination that we can see ourselves only in the dark light of present (and past) failures. That a single black man should be wrongfully killed by police is intolerable, and society explodes with outrage. That one woman in college should feel pressured to have sex is unacceptable, and we set aside due process. Anything short of perfection is cause for upheaval. We always have still farther to go.

Ironically, modern utopianisms, for all their idealism, are built on drab utilitarian foundations. This, too, contributes to our malaise. Noble acts inspire men. Warm rhetoric stirs hearts. Yet such appeals have no place in the empire of reason. Technocracy operates in accord with the predictable. As a result, the modern West tends to reduce human beings to congeries of interests. We tell ourselves that we are utility-maximizing machines—or, if we are sufficiently postmodern, power-hungry organisms bent on domination.

We often hear about the great sins of the West: colonialism, racism, genocide. This litany demoralizes us. But our sense that we are unfit to govern the world we dominate (as we nevertheless do, often without hesitation) stems above all from the low view of the human person encouraged by technocratic reason. We lack a vocabulary with which to articulate freedom’s heroic possibilities. Thus, the singular triumphs of the West seem unmerited, even perverse. We turn to the myth of the noble savage. Multiculturalism denigrates the West while championing anything it imagines unsullied by Western influences. The “non-Western” offers us our best hope for moral and spiritual survival.

John Paul II had a name for the pessimism of the West: the culture of death. It is manifest in the widespread acceptance of abortion, the death-dealing tool by which we prevent new life from interfering with what we take to be our “essential interests.” The same holds for euthanasia, which is often freely chosen to relieve us of life’s final burdens.

These are efforts to wring from life what we can get, rather than accepting life as a gift. The increasing childlessness of adult life in the West is less manifestly death-dealing, but it has a pervasive influence. In Norway, the share of women who had no children by age forty-five rose from 9 percent in 1985 to 15 percent in 2017; for men, it rose from 14 to 25 percent. To one degree or another, most countries in the West are trending toward childlessness, including the United States.

Children are the future. They are lines we cast forward in time. In children, we are drawn toward emotional investments in affairs beyond our limited years. The children in whom we place our hopes need not come from our loins. Parents have fundamental rights that must be given priority. That said, the public is rightly concerned with the welfare of children, which is why we quarrel over how to educate them. In a society charged with the current of new life, we are drawn beyond ourselves, giving birth to ideas, inventions, and endeavors that outstrip our immediate interests as we seek to provide the next generation with things we will not survive to enjoy.

Look around and count the absent children. In 2017, demographers reported the lowest birthrates on record for the U.S. The birth-dearth is unprecedented in human history; it is chosen, rather than imposed by disease and famine. In 2020, the diminished scope of our concerns led us to borrow money from the future to sustain our lockdowns, during which we have suspended the education of children, curtailed their play, and retarded their development so that eighty-year-olds will enjoy a marginally lower risk of dying. Long gone is the Christian horizon of life, which refuses death’s final and supreme claim. We imagine ourselves committed to ideals. Yet the culture of death empties ideals of their power. Frustrated by our inability to confect redemption on earth, humiliated by the disenchanting effects of technocratic reason, and increasingly childless, we’re reduced to the animal imperative of survival.

Is it any wonder that the West is haunted by our perceived inability to make the world more just, more sustainable, more at peace? Perhaps we are unworthy because we are impotent. We have inherited a vital civilization that has captured a great deal of the world’s imagination. But we are becoming, at best, its sterile custodians.

McCarrick Report

I count myself among those who wish to hear as little as possible about Theodore McCarrick, just as I prefer not to be informed about the latest activities of Lorena Bobbitt. Nevertheless, in November I was glad to see the publication of the Vatican’s official report on the disgraced ecclesiastical eminence. A two-year process examined the archives of relevant Congregations in the Roman Curia and documents from U.S. sources, and compiled interviews with more than ninety witnesses. The McCarrick Report does not claim to interpret the significance of McCarrick’s misdeeds or diagnose the environment in which he was able to sustain his deceptions. Its purpose is to provide an accurate account of the Church’s “institutional knowledge and decision-making.”

Three clusters of events documented in the Report caught my attention. The first concerns McCarrick’s early years as a clerical golden boy. He was ordained in the 1950s in the Archdiocese of New York, during the postwar boom in church attendance and Catholic confidence. His talent, charm, and ambition brought him a quick series of administrative appointments. He benefited from well-placed ecclesiastical patrons, as well as from intimate relations with wealthy Catholic families with whom he vacationed. When he was vetted for appointment to vacant bishop positions (he was eventually appointed auxiliary bishop of New York in 1977), he received unqualified recommendations.

I was not working in the bureaucracy of the Archdiocese of New York in the 1970s, nor was I circulating among Fifth Avenue Catholics who took their holidays angling for bonefish in the Bahamas. But I would be surprised if some of the Catholic insiders in that decade were not aware of McCarrick’s indiscretions with young men. (The Report gives testimony from McCarrick’s years in New Jersey that suggests as much.) I came of age in that decade and can report that the 1960s knocked the stuffing out of the old consensus about sexual morality. Institutions were disoriented, and few people in positions of authority had the confidence to enforce rules. Many turned a blind eye, not just in the Church, but also in universities, high schools, and other places where adults in close contact with young people exploited the new permissiveness. No doubt there was also a “lavender mafia” in the New York chancery that protected McCarrick. A network of gay priests who covered for each other likely dates back to Cardinal Spellman’s era, and in the 1970s, empowered by the sexual revolution, it received theological cover from the seemingly invincible future promised by the “spirit of Vatican II.”

The Report is largely silent about such matters, and rightly so. My speculations point to a fitting topic for an enterprising historian: a social history of homosexuality in New York and its relation to the Catholic Church. It is not a proper subject matter for Vatican officials compiling a report on Church documents and decisions.

McCarrick’s appointment as archbishop of Washington, D.C., is the second interesting cluster of events. There can be no denying that Theodore McCarrick was an energetic administrator and zealous fundraiser, as well as a first-class networker. I once saw him in action at a large public event. This was after his retirement, when he was supposed to be lying low. I can testify to his charm—and to his evident vanity, which in postwar American Catholicism seems to have been a quality lay Catholics admired in their bishops, since it vindicated their own self-satisfaction. By the time John Paul II was faced with the decision of whether to move McCarrick from his position in Newark to Washington (and thus, it was assumed, into the College of Cardinals), reports of suspected sexual misdeeds were on record. McCarrick countered with ardent professions of innocence. In the event, John Paul II chose to believe McCarrick. In 2000, the pontiff appointed him to the post in D.C.

As George Weigel and others have noted, John Paul II was generally skeptical of accusations of sexual misconduct. These had often been used by secret police in communist Poland to discredit clergy who showed signs of independence and a willingness to challenge the regime. I also suspect that John Paul II, a man of extraordinary intellectual ability and curiosity, was a sucker for intelligence, especially when it was joined to vitality and élan such as McCarrick possessed. Whatever the reasons, John Paul made a grave mistake. (A few years ago, a young priest friend in Poland told me that McCarrick was not John Paul II’s only misjudgment when it comes to the moral character of men he raised to high office.)

Wisdom in matters of appointment is not distributed in accord with personal sanctity. Nor are those gifted with insight into the character of others immune from error. And always cleaving to the judgments expressed in official documents is no guarantee of good outcomes, especially when those documents are produced and circulated in a bureaucracy embroiled in palace politics, as Rome’s is. But there can be no doubt that John Paul II misjudged. And his mistake was the single most important decision in McCarrick’s career. It sent a clear message to the American Church that the highest authority would not countenance accusations against the new archbishop of Washington, D.C. This allowed McCarrick to sustain his denials and rise to still greater heights of power and influence.

The third cluster of significant events took place during the pontificate of Benedict XVI. Unlike John Paul II, Benedict became convinced of McCarrick’s guilt. Yet Benedict temporized. He was inclined to take informal measures rather than open a canonical process, as some in the Curia recommended. His inner circle ended the extension that had allowed McCarrick to remain in office beyond the canonical age of retirement, and it extracted a pledge that he would lower his profile and minimize his travel. But Benedict did not make his support for such strictures entirely explicit. McCarrick exploited the ambiguity to continue as a public representative of the Church.

I take this episode as characteristic of the pontificate of Benedict XVI. His judgments were consistently sound, but he ran the Church too much like a university seminar. Rather than impose his decisions when a clear exercise of authority was required, he tried to operate collegially. This is not a new criticism; Pope Benedict adduced his failure to master the curial bureaucracy as a reason for his shocking resignation in 2013. But it’s useful to see, in detail, the failures to act—or, more precisely, the ambiguous actions—in the case of McCarrick.

As one friend described it, the Report is “lawyerly,” which is to say, a dry account of what can be documented. It is not an exercise in ecclesiastical soul-searching. Yet the Report marks an important milestone. Not only does it provide us with firm factual bases for our own thoughts about the Church’s failures in the case of McCarrick. (I’ve recounted some of mine above.) It also marks the first time the Church has been willing to make so much of her internal affairs available for public inspection. As John Allen has observed, we may look back and say that this very real effort of transparency was among the more important steps toward reform taken by Pope Francis.

Governance by Grandees

In A Promised Land, former president Barack Obama, a perfect product of the institutions that define elite status in the United States, reflects on the years spent writing his presidential memoirs. He allows that he has entertained doubts about whether his time in the White House was marked by the very purest political courage. Given that (as he has it) “this nation’s ideals have always been secondary to conquest and subjugation, a racial caste system and rapacious capitalism,” shouldn’t he have spoken out more forcefully? He wonders whether he wasn’t “too tempered in speaking the truth” and “too cautious in either word or deed.” Was he remiss? Ought he to have scolded the American people more often for their sins?

“I don’t know,” Obama concedes. It’s a statement that borders on honesty. After all, he lied shamelessly when political calculation dictated that he give the impression that he opposed gay marriage. But the forty-fourth president of the United States retains confidence, at least in himself: “What I can say for certain is that I’m not yet ready to abandon the possibility of America.”

This pretense of introspection suggests a towering vanity, one of the qualities that has always made our first black president seem (to me at least) like a continuation of the old elite and not at all the “transformational” figure he imagines himself to be. His hauteur and self-regard, the certainty that his views are self-evident, the serenity masquerading as self-criticism, the paternalistic solicitude for the insufficiently enlightened, the conceit of being chosen by “history” to rule and govern—all of this and more has always reminded me of my youthful encounters with WASP grandees.

I had Obama in mind when I turned to Geoffrey Kabaservice’s 2004 study of the postwar ruling class, The Guardians: Kingman Brewster, His Circle, and the Rise of the Liberal Establishment. Like his close friend McGeorge Bundy, Kingman Brewster Jr. had a reversible-raincoat name, characteristic of Northeast elites. Their first names echoed their family lineages, a common practice among the Mayflower set. Along with perennial political appointees Cyrus Vance and Elliot Richardson, New York mayor John Lindsay, and Episcopalian bishop Paul Moore, these men became pillars of the close-knit and Northeast-based liberal establishment that dominated American public life in the decades after World War II.

Kabaservice places Kingman Brewster at the center of his story. Brewster’s ambitions were cultural and institutional, not political or financial. He rose to become president of Yale University (1963–77) and played a central role in defining the “responsible center” during the tumultuous 1960s: procedural, pragmatic, and—he hoped—principled.

A descendant of William Brewster, the Mayflower colonist and Pilgrim leader, Brewster was educated in Boston and attended Yale (1937–41). He rose to prominence as an undergraduate, becoming a Big Man on Campus. But there was in Brewster a strong strain of New England egalitarianism, and he famously said “no” when offered membership in Yale’s most prestigious senior society, Skull and Bones.

Brewster was a student activist, not just on campus but nationally. These experiences were invaluable when activism threatened to overwhelm education in the late 1960s. In 1940, he invited Charles Lindbergh to speak at Yale. He went on to be one of the founders of Lindbergh’s America First Committee, which opposed U.S. involvement in the war in Europe. Brewster played a prominent role in the Committee’s nationwide efforts. Contrary to today’s caricatures, the Committee was not a movement of “right-wing extremists”; its membership included future establishment figures such as Potter Stewart (appointed to the Supreme Court by Eisenhower) and Sargent Shriver (Kennedy intimate and first head of the Peace Corps).

After Pearl Harbor, Brewster volunteered for service. Paul Moore, a fellow Yale graduate and scion of a fabulously rich family, joined the Marines in 1941 and was wounded at Guadalcanal. Some friends in this elite circle were killed. Virtually all enlisted. There was never a question in their minds: Those whom divine providence had put at the top of American society benefited greatly, but they carried a special burden of leadership that must not be evaded.

The Great Depression and World War II defined the American experience in the twentieth century. Military service shaped the WASP elites of Brewster’s generation. They worked shoulder-to-shoulder with less privileged men, some of whom showed greater courage, ingenuity, and stamina. The war solidified their egalitarian spirit and tempered their arrogance. Moreover, the scale and scope of destruction made them sensible of the tragic dimension of life—the hard realities of loss and the fragility of civilization.

Yet the war also encouraged a technocratic confidence that could overwhelm their tragic sense. The conflict required mobilizing entire industries and, in the case of the atomic bomb, inventing new ones. Enlisted men numbered in the millions. Power was projected across the globe. And the mission was accomplished. Nazism was crushed and Japan defeated. This success inspired Brewster and his generation to believe that when resources are mobilized and smart men apply their intelligence, problems will be solved. This mentality led to two failed wars, one in Vietnam and another against American poverty.

Throughout The Guardians, one feels the tension between humility and hubris in Brewster and his circle. Kabaservice notes that this generation of elite men recognized that old hierarchies were unsustainable in the postwar era. Few were leaders in civil rights. Fewer were early proponents of feminism. But they listened to these challenges to the status quo rather than peremptorily dismissing them.

Kabaservice details changes in Yale’s methods for undergraduate admissions under Brewster: more public school kids, more minorities, and fewer legacies. The new approach was not just a response to broad changes in American society. Brewster and his social set affirmed and contributed to a redefinition of the elite. On the one hand, they emphasized merit, especially academic aptitude, which stood in contrast to older notions of family status and WASP credentials. On the other, they promoted the ideal of diversity. The elite must “represent” the country, which meant ensuring that places like Yale (as well as foundation boards and other positions of cultural power) had a more proportionate racial, gender, and ethnic composition.

Ending the WASP monopoly on elite status was inevitable. That monopoly could not survive the collective experience of the war. Brewster’s realism on this point was commendable. But he and his circle were naive in their technocratic hubris. They tried to confect a new elite, and like a planned economy, the planned ruling class was a rickety affair. The uproar among black students at Yale (and most other universities) after Brewster’s admissions policies changed made that evident.

As racial anger exploded into the streets in the late 1960s, the WASP grandees scrambled to accommodate new demands. Brewster’s close friend and fellow Yale graduate McGeorge Bundy, the head of the Ford Foundation, laid the groundwork for today’s identity politics. In Bundy we see the beginnings of elite hostility toward white working-class Americans. In the face of urban unrest, Bundy blamed white racists. He set the pattern that continues to this day: White elites do not stand down in the face of challenges. Instead, they select new, more racially diverse partners with whom to share power, which means withdrawing power from whites who are below them on the social ladder. White elites do this in order to tamp down dangerous radicalisms (the mission of the “responsible center”), which conveniently secures their legitimacy in times of social change.

We live with their successes, but also with their failures. One of the most significant has been the ongoing inability of our ruling class to join the ideal of diversity (now decayed into identity politics) to the other pillar of the new elite, merit, in convincing ways. It is for this reason that litigation over elite admissions continues, even though our ruling class insists that there are no hidden injustices. It is perhaps also why Obama’s election as our first black president seems to have heightened racial tensions rather than reduced them. It seems the non-elite are not confident that Brewster’s methods will produce results that serve their interests.

Kabaservice admires the WASP elite of the Greatest Generation. I am less enthusiastic. Brewster was undoubtedly a man of remarkable capacities and generosity of spirit. In 1970, in the face of demands from black students, Brewster refused to shut down classes at Yale. But his solution was to accommodate demands for separate housing for black students and a separate curriculum (black studies). This strategy of appeasement contradicted his aim of creating a new, diverse, but integrated student body at Yale—the Old Blue spirit shared with young people from many backgrounds. The compromise was also inherently unstable. As years passed, student radicals cleaved to the pattern Brewster had established, issuing threats and making demands, confident that university administrators would express their utmost solicitude for the “spirit of protest” and use their managerial skills to figure out how to make as many concessions as possible while “keeping Yale Yale.”

The Ivy League strategy of appeasement pioneered by Brewster now serves as the pattern for civil authorities, as last summer’s protests demonstrated. Brewster drew a line: no cancellation of classes. Our politicians do so as well: Nobody can be killed. But short of that, protesters must be “respected.” Vandalism and looting, like campus unrest, are unfortunate but forgiven. Better to be flexible than rigid. Our goal should be to promote “understanding” rather than confrontation.

Yet I must hesitate in my criticisms. I, too, am sympathetic, even if more jaundiced than Kabaservice. Consider William Sloane Coffin, who figured prominently in Brewster’s time as Yale’s president. As university chaplain, Coffin used his pulpit to urge the causes of social justice. The same holds for Paul Moore, who eventually became the Episcopal bishop of New York and used his authority to push for radical change. These men imagined themselves prophets. But in truth they satisfied their moral vanity and contributed to the destruction of the institutions entrusted to their care. Brewster, by contrast, preserved and in many ways strengthened Yale.

The New Left derided the “managerial liberalism” of elites such as Brewster and Bundy. In their eyes, it was just pragmatic temporizing to keep the system going—not an inaccurate assessment. But in fairness, what were the options? How to prosecute a war the nation cannot win but cannot abandon (Bundy’s problem)? How to run an elite university in a decade that purports to reject elitism (­Brewster’s problem)? Sometimes kicking the can down the road is the best option, especially if the can is a grenade.

Which brings me back to Barack Obama and the neo-WASP elite in which he plays a leading role. (It’s fitting that the Obamas bought an estate on Martha’s Vineyard, the summer refuge of Brewster and his friends.) I’m torn in my judgments. Obama was a temporizer, a “managerial liberal,” and perhaps necessarily so, given our fractious nation. But in my estimation, he made things worse rather than better. There is too much William Sloane Coffin in his self-image. Perhaps this, too, was inevitable, given the lure of the prophetic role since the 1960s. Brewster tried to remake the elite for an anti-elite age, and in so doing created the illusion among those who succeeded him that people at the top are “transformative” and “change agents,” rather than custodians or (to use Kabaservice’s term from Plato) “guardians.” In my bright college years, I learned an important truth: Beware elites who claim to be opposed to elitism. Be skeptical of grandees showered with establishment accolades and elevated to high positions who announce that they are prophets.

Bill Eerdmans, RIP

William B. Eerdmans, a Dutch immigrant to the United States, founded a theological publishing house in Grand Rapids, Michigan, in 1911. His son, William B. Eerdmans Jr., took over in 1963 and ran the operation for more than fifty years. Born in 1923, he passed away on November 13.

When I entered graduate school in the 1980s, Eerdmans Publishing was in the midst of a rapid evolution—from niche publisher for conservative Reformed Christians to theological powerhouse. In those years, the differences between Protestant and Catholic had become less significant than the differences between liberal or “revisionist” theology and theology that cleaves to the apostolic tradition. This scrambled the old system of theological publishing, which aside from a few academic presses was organized along denominational lines. Eerdmans Publishing was the first “conservative” Protestant house to expand beyond its denominational base and develop an ecumenical list of serious theological books. Doing so took vision and an appetite for risk-taking, qualities Bill Eerdmans had in full.

As a young Episcopalian theologian, I offered in my first book a Barth-inspired interpretation of Karl Rahner, one of the major Catholic theologians of the twentieth century. It was not clear who would want to read such a quirky book. It was my good fortune that Bill was a quirky man. I pitched my book to him at an annual meeting of the American Academy of Religion in the early 1990s. He made one of the oblique, jesting comments for which he was notorious. Then, with a Cheshire Cat smile, he said, “Why don’t you mail me the manuscript?” I did, and his return correspondence included a book contract. I’ve lost the letter, but as I recall Bill wrote something like this: “It’s not Shakespeare, but it’s pretty good. Why not publish it?”

In those years, I ran with a theological Rat Pack: David Yeago, Ephraim Radner, George Sumner, and other Yale PhDs who propounded what George Lindbeck called “postliberal theology.” Bill was a patron. He paid for drinks and supplied encouragement. At one point in the late 1990s, I began outlining an idea for a book. He held up his hand to stop me, saying, “I’ll publish anything you write.” To which I responded, “On what topic?” He replied, “I don’t care as long as it’s less than sixty thousand words.”

At that time I had traveled to Lithuania on a few occasions to offer short courses in theology. The academic center that hosted me was impoverished. During forty years of communist rule, the university library had acquired no books in theology. I asked Bill if Eerdmans would be willing to donate some books. He handed me a book catalogue, telling me to mark it up—“and don’t be bashful.” Box after box of books arrived in Vilnius two months later.

Bill had a warm friendship with our founder, Richard John Neuhaus, another first-rate schemer who, like Bill, enjoyed talking over drinks. RJN’s most famous book, The Naked Public Square, was published by Eerdmans, as were a number of volumes of essays produced by conferences and consultations that RJN organized. Richard was a fine judge of talent, and I’m sure Bill, who was always on the lookout, valued his assessments of writers and theologians. It was through their relationship that Wolfhart Pannenberg came into the Eerdmans fold.

I wish I could have listened in on some of those conversations. Bill spoke around what he wanted to say, at once insisting and retracting, saying and not saying. He loved wordplay, even when it took him in strange directions. Richard, by contrast, was preternaturally lucid, confident that he was saying what needed to be said in just the right way. They nevertheless had an excellent rapport, perhaps because when Bill said “yes,” he meant it and kept his word.

When I took over as editor of First Things, Bill came to visit. We had a take-out dinner at my apartment. He was by that point in his late eighties and hard of hearing. But his eyes twinkled, and his speech still cast pixie dust over the evening. We spoke of many things. He had recollections of RJN and First Things that he wanted to transmit. But the leitmotif to which he often returned was not so much advice as a command. It’s hard for me to epitomize—Bill’s distinctive style often resisted summary—but it was something like: Don’t leave the wounded behind.

Was he trying to warn me? We live in a progressive age, and conservatives must often steel themselves in order to resist being swept along. It’s tempting to pound the pulpit and insist upon the gospel. The “no” can overwhelm the “yes.” Truth-telling in this time of lies can seem so important that we leave little time for the love-giving which, St. Paul tells us, is our foretaste of eternity.

Bill’s longtime colleague Jon Pott observed that he could be disorienting. His gnomic statements often left you not knowing quite what to think. But he connected with people. He unnerved and yet reassured. Perhaps it was the odd truth of his strange locutions that reassured. After all, life here below is always tattered around the edges. Even if we are firmly anchored in our faith, until the Lord returns in glory lots of things don’t add up.

Bill Eerdmans was a singular man. I’m grateful to have been able to call him my publisher. May he rest in peace.

WHILE WE'RE AT IT

♦ “Britain’s Choice: Common Ground and Division in 2020s Britain” presents results of research about political opinion and engagement in the United Kingdom. The nearly three-hundred-page report argues that there is more common ground than contemporary politics would suggest. It’s not a judgment I doubt, but I’ll leave that claim aside. What caught my attention were results for one of the groups the study identifies: progressive activists. The wokest of the woke, they make up 13 percent of British society. This cohort is ardent, and the study shows that its members are six times more likely than others to broadcast their political opinions on social media. In accord with their interest in hectoring their fellow citizens, progressive activists are also three to four times less likely than others to think political correctness is a problem.

I speculate that the percentage of the American population that fits the progressive activist profile is comparable: more than 10 percent, but not much more. And in the United States as well, their voices dominate social media, as anyone who has spent time on Twitter knows. This fact is producing a democracy deficit. Progressives drown out alternative voices and stomp on dissent. This deficit is now being compounded by censorship imposed by social media giants and search algorithms designed to steer people away from “dangerous” opinions.


♦ In 1949 the Federal Communications Commission formulated a “fairness doctrine” that required holders of broadcast licenses to present public issues in a balanced way. Contrasting viewpoints could not be excluded. The FCC rescinded the doctrine in 1987, on the grounds that the proliferation of media meant citizens had access to a wide variety of views, so the government need not ensure the airing of contrasting viewpoints.

Perhaps the repeal made sense in the 1980s. In any event, the old fairness doctrine is likely unworkable today, when anyone can set himself up as an internet pundit. But we need to give serious thought to crafting regulations that prevent progressive activists, a small portion of the American population, from dominating social media and driving out alternative points of view. Sen. Josh Hawley’s office has made proposals that merit discussion and refinement.


♦ Carl Trueman on freedom of speech and religion:

The rise of modern expressive individualism is intimately connected to concepts such as freedom of speech and religion, but these things are far less plausible as social virtues now. The intuitions of the rising generation do not default to the standard liberal orthodoxies in these matters. Cancel culture is unlikely to be automatically anathema to generations raised to think of identity as psychological and therefore of freedom of speech and religion as enabling acts of linguistic violence. As fewer people consider religion important in their own lives, fewer people will care about religious freedom. And to those who are dependent on the rising generation to vote them into office—which includes, of course, candidates of all political affiliations—the robust defense of traditional freedoms is therefore likely to be less and less of a political priority.

♦ Carl’s new book is out: The Rise and Triumph of the Modern Self: Cultural Amnesia, Expressive Individualism, and the Road to Sexual Revolution. In it he details the transformation of our conception of what it means to live a good life. We are “expressive individuals” these days, not children of God. This change goes a long way toward explaining the sexual revolution, which gives pride of place to modern preoccupations with identity. Highly recommended.


♦ Amidst the media furor over the Trump campaign’s claims of electoral malfeasance, it’s useful to recall the 2018 Pulitzer Prize for National Reporting. The prize was awarded to the staffs of the New York Times and Washington Post. The prize committee commended them for “relentlessly reported coverage in the public interest that dramatically furthered the nation’s understanding of Russian interference in the 2016 presidential election and its connections to the Trump campaign, the President-elect’s transition team and his eventual administration.”


♦ In view of the fact that two prestigious publications flogged Clinton-campaign propaganda, excused the Obama administration’s lies to the Foreign Intelligence Surveillance Court and its unprecedented domestic spying on the Trump campaign and transition team, encouraged the baseless prosecution of General Michael Flynn, and to this day contribute to the cover-up of these misdeeds, one can be excused for harboring suspicions about the veracity of media reports in 2020. And one can take a jaundiced view of high-minded posturing about the importance of defending “democratic norms.”


♦ It is two weeks after the election as I write. Litigation and press conferences suggest that the Trump team does not have the dramatic evidence of electoral fraud necessary to convince courts and electoral officials to change the outcome. At this juncture, conservative media and bloggers are offering a variety of sober assessments—rather than repeating conspiracy theories or making panicked statements about how, if things don’t go their way, our political system will be discredited. Trump’s challenges of the vote counts were always well within his rights. The extraordinary circumstances of the voting and the outsized role of mail-in votes were well-known threats to election security, as many pointed out in advance. Instead of “threatening our institutions,” as those who peddled the Russian collusion hoax for years are now saying, the Trump campaign’s challenges may do some good. They may spotlight problems and weaknesses in the way some states conduct elections, allowing officials to remedy them.


♦ Trump’s legal challenges should also sober up those who clamor for the elimination of the Electoral College. In our fractious times, it’s easy to imagine a still more contentious post-election struggle over vote counts than what we experienced in 2020. One virtue of the Electoral College is that, unlike a potentially endless counting and court-ordered recounting of the popular vote in a nation of more than 330 million people, it provides the constitutionally decisive and final word on who will be the president of the United States.


♦ Josef Pieper: “If the sons truly no longer knew the significance of the great holidays celebrated by their fathers, then the most immediate tie between the generations would be cut and tradition would, strictly speaking, no longer exist.” A patrimony gives us a place in the world, a role in the great chain of history. It links the past to the present and allows us to see the future as part of a self-same enterprise. The toppling of statues and the pedagogies of contempt for the loves and loyalties that animated our fathers and grandfathers leave the rising generation without a legacy. This is a grave injustice, for it renders the young homeless and fatherless. On the implications, see Mary Eberstadt, “The Fury of the Fatherless” (December 2020).


♦ The school board at St. Boniface Catholic Church recently chastised a parishioner for reading from the Catechism during a board meeting. The leaders of this school, located in a suburb of Toronto, deemed the verbatim account of Catholic teaching dangerous and harmful, “putting down a marginalized and vulnerable community.” The parishioner made his intervention because the meeting had been called to discuss whether to discipline school board member Michael Del Grande, who last year had objected to the board’s decision to add “gender identity, gender expression, family status, and marital status” to the categories protected in the parish school’s code of conduct, a move that in today’s environment makes any public disapproval of homosexuality and transgenderism an offense. After silencing the parishioner, the board voted to censure Del Grande and assigned him the penance of completing an equity training program.

The subordination of the teachings of the Catholic Church to the magisterium of the Human Rights Campaign provoked Cardinal Thomas Collins to write a strongly worded letter to Joseph Martino, the St. Boniface school board chairman: “That a Catholic should be criticized, and effectively be prevented by Catholic Trustees from reading from the Catholic Catechism at a meeting of a Catholic School Board, is simply reprehensive.” Cardinal Collins went on to say,

If those engaged in Catholic Education do not appreciate the rich and life-giving faith of the Church, which is the defining characteristic of Catholic Education, and instead have bought into the fundamentally anti-Catholic narrative that misrepresents Catholic faith as lacking in compassion, then I question how they can fulfill their sacred mission, and truly serve those who are entrusted to their care.

♦ “The Last Children of Down Syndrome” by Sarah Zhang (The Atlantic, December 2020) discusses the “velvet eugenics” operating throughout the West. It is especially evident in the widespread use of abortion to destroy children in the womb after prenatal testing indicates a likelihood of Down syndrome. In his review of What It Means to Be Human (“Bring Back the Body”) in this issue, Sohrab Ahmari points out that in France 96 percent of children suspected to have Down syndrome are now aborted. Researchers estimate that nearly 70 percent of women in the United States who are presented with positive test results for Down syndrome choose to abort.

Bioethicist David Wasserman tells Zhang that most parents do not aim at “perfection.” Rather, they are deeply anxious: “There’s profound risk aversion.” This is an instance of the paradox I discuss above. We live in a time when medical technology has given us great resources with which to combat diseases. As Zhang notes, the wealth of Western societies provides extraordinary services to handicapped children and adults. There’s never been a better time to live with a disability. And yet power and wealth seem to stoke fears of vulnerability rather than empowering us to face adversity. Zhang documents that the fear of giving birth to a disabled child increases with the educational levels and income of women. It seems the blessings of progress have turned life into a dangerous minefield to be carefully navigated, rather than a blessing to be celebrated and enjoyed.


♦ John William O’Sullivan observes that Ireland has not emerged from the darkness of its priest-ridden past to the bright uplands of Enlightenment happiness. Yes, the country has become richer. But that’s come at the cost of dire increases in social dysfunction:

Since 1960 the murder rate has increased fivefold; suicides have increased sevenfold; divorces have increased forty-sevenfold; drug-related deaths have risen twelvefold; and alcohol consumption has risen threefold. This is not progress. These are the symptoms of long-term social and cultural decline. The fact that people have more money in the bank means little in the face of these statistics, which show a dramatic decline in the quality of life.

Sadly, Ireland is not unique in these doleful statistics. Charles Murray tells a nuanced tale of social decline for Americans lower on the social ladder. Anne Case and Angus Deaton have likewise charted shocking increases in deaths of despair.


♦ Rabbi Jonathan Sacks passed away on November 7. He was a man of remarkable intelligence and devotion who gave himself to the service not just of the Jewish community (which he served for more than twenty years as Chief Rabbi of Great Britain) but of all religious believers and those who wish to buttress the authority of moral truth in our confused age. I’m grateful to Mark Gottlieb for his reflection on Rabbi Sacks’s legacy (“Remembering Rabbi Lord Sacks [1948–2020]”). I did not know Rabbi Sacks well. I can say with confidence, however, that he was truly a great orator. His 2013 Erasmus Lecture was delivered extemporaneously and went on for nearly an hour. The faces of the audience were rapt, and the applause at the end was thunderous. Before that evening, I had been mystified by the old Protestant tradition of two-hour sermons. Listening to Rabbi Sacks, I understood. May his memory be a blessing.


♦ You can view Rabbi Sacks’s Erasmus Lecture, “On Creative Minorities,” online at firstthings.com/events/2013-erasmus-lecture. You can also hear him in conversation with contributing editor Mark Bauerlein on the topic of his 2015 book Not in God’s Name: Confronting Religious Violence at firstthings.com/media/religious-violence-and-biblical-answers.


♦ In the Letters section, Dale Coulter takes Michael Allen to task for not giving attention to John Webster’s theology of the church. Their exchange reminds me of a conversation I had with John, during which he agreed to write a commentary on Ephesians for the Brazos Theological Commentary on the Bible series that I was launching. John was keen to warn this recent Catholic convert that he intended to write a commentary that avoided “an exaggerated ecclesiology.”


♦ A few months ago, we conducted an online survey of readers. More than two thousand filled out the lengthy questionnaire. Many left extensive comments. We’ve analyzed the data and had fruitful internal discussions. Here are some highlights.

A substantial plurality indicated that reading First Things is like joining a group of friends for a discussion. Others described regular engagement as akin to participating in a seminar. The First Things readership is well educated and heavily Catholic (70 percent). Readers tend to be politically conservative, but they look to us for insights into theology and culture that transcend the news cycle. More than 80 percent reported that First Things has enhanced their religious faith, helped them speak out on public issues, and deepened their appreciation of the Western tradition.


♦ One reader wrote:

Growing up as an evangelical in an increasingly secular, liberal United Kingdom, discovering First Things was a revelation. Finally, I found a community—though disparate and distant—of those who thought like me. I would urge you not to give in to temptation to be less radical. As inevitable troubles deepen, you must speak more clearly than ever.

Rest assured, dear reader, we will.


♦ Thomas Gaier wants to know of readers in the Cincinnati area who have an interest in forming a ROFTERS group. You may contact him at minister87@yahoo.com. 

R. R. Reno is editor of First Things.