
Campaigning for the French presidency last year, Nicolas Sarkozy ran hard against what Europeans still refer to as 1968, describing the post-1968 New Left as “immoral” and “cynical” and defining the choice before the French electorate in stark terms: “In this election, the question is whether the heritage of May ’68 should be perpetuated, or if it should be liquidated.” Evidently, French politics hasn’t yet discovered the warm fuzzies of the focus group.

Throughout the Western world, 1968 was a bad year, a moment in which history seemed to careen out of control. It was worse in Europe, and the impact of 1968 there was more profound. In Western Europe, the agitations of 1968 aimed to effect a deep rupture with the past, and if those who took to the Paris barricades failed politically, they succeeded culturally; the dispirited Western Europe that languishes in a crisis of civilizational morale today is a reflection of the exhausted politics of 1968, as Nicolas Sarkozy, Marcello Pera, Giuliano Ferrara, Joseph Ratzinger, and others have recognized.

Still, the year is remembered differently in the United States. It was, to be sure, a terrible year, replete with political violence; but it is the 1960s as a whole, the entire Sixties, that has had an enduring impact on our culture and our politics.

I don’t propose to revisit the question of whether what we call the Sixties was in fact born in the Fifties, or whether it unfolded its full plumage in that low decade, the Seventies. Rather, I want to examine six crucial moments in the Sixties with an eye to how they reshaped American political culture, with effects still being felt today. What a large segment of American political culture learned from those moments constitutes the issues-beneath-the-issues in 2008, and in that important sense, America is still fighting battles begun in the Sixties, like it or not.

The First Moment: The Assassination of John F. Kennedy in 1963

John Fitzgerald Kennedy would be ninety years old today, a circumstance nearly impossible to imagine. When Lee Harvey Oswald’s bullets struck on November 22, 1963, the national memory of the thirty-fifth president was frozen in a kind of memorial amber. It’s hard enough to picture a sixty-year-old Kennedy as the proprietor of a great newspaper (a post-presidential career he was considering); it is simply impossible to conjure up images of him at seventy-five or ninety. He remains forever young in the national consciousness.

Do we understand why he died, though? In Camelot and the Cultural Revolution, James Piereson argues that the answer is, in the main, no. According to the Authorized Version of the Kennedy story advanced by biographers (and former Kennedy aides) Arthur M. Schlesinger Jr. and Theodore Sorensen, Kennedy’s assassination was the by-product of a culture of violence that had infected the extreme American right wing. Right-wing paranoia about communism and civil-rights activism was abroad in the land, and this paranoia had turned the city of Dallas into a seething political madhouse. Something awful was likely to happen there, and it happened to John F. Kennedy, who had gone to Dallas to defend the politics of reason against the politics of irrational fear. Kennedy was martyred by unreason.

Thus the account from the court historians, which, interestingly enough, is also the story told to visitors of the museum on the sixth floor of the Texas School Book Depository in Dallas, from which Oswald fired the lethal shots. Schlesinger and Sorensen were not operating in a vacuum, of course. As Piereson usefully reminds us, they followed the lead of a mainstream media that bathed Kennedy’s assassination and Oswald’s subsequent murder in a torrent of introspection about an America fearful of the world, terrified of social change, and addicted to violence.

The Schlesinger-Sorensen interpretation was congenial to Jacqueline Kennedy, and it may well have owed something to her understanding of what had happened and why. After Lee Harvey Oswald was arrested and identified, Mrs. Kennedy lamented that her husband hadn’t even had the satisfaction of being killed for civil rights; his murderer had been a “silly little communist,” a fact Mrs. Kennedy thought had robbed Kennedy’s death of “any meaning.” So meaning would be created—and thus was born (with the help of popular historian Theodore White and Life magazine) the familiar imagery of the Kennedy White House as an Arthurian Camelot, a “brief shining moment” that must “never be forgot” (as Alan Jay Lerner’s lyrics, from the contemporary Broadway musical, put it).

Yet certain stubborn facts remain, as Piereson points out: Lee Harvey Oswald was a convinced communist, a former defector to the Soviet Union, and a passionate supporter of Fidel Castro; the Kennedy administration was a sworn foe of Castro’s communist regime, had authorized the Bay of Pigs operation, and had negotiated the removal of Soviet IRBMs from Cuba, much to Castro’s chagrin. Hatred of Kennedy’s Cold War policies was Oswald’s motivation. John F. Kennedy was not a victim of the irrational American right wing; he was a casualty of the Cold War—a Cold War, Piereson reminds us, that he prosecuted vigorously, if not always wisely or successfully.

The failure to acknowledge this in a country still jittery over the 1962 Cuban Missile Crisis is perhaps understandable. But too long an indulgence of the Camelot myth has had serious effects on our political culture. By turning John F. Kennedy—the embodiment of pragmatic, rationalist, results-oriented anticommunist liberalism—into a mythical figure whose idealism could never be recaptured, the hagiographers helped undermine the confidence in progress that had once characterized the liberalism of Franklin Roosevelt, Harry Truman, and Kennedy himself. When that confidence dissolved, conspiracy-theorizing migrated from the fever swamps of the extreme right and began to infect American liberalism. And since the glorious Camelot past could never be recaptured, American liberalism became less a matter of substantive change than of style—and eventually of lifestyle. The result is the liberalism we know today: a liberalism for which the legal recognition (indeed, promotion) of lifestyle libertinism remains the paramount concern.

The Kennedy assassination was the event that ignited the firestorm in American political culture that we call the Sixties. Of course, some of the tinder was already there, waiting to be lit. A year before the president’s death, Students for a Democratic Society issued what would become a key text for the New Left, the Port Huron Statement. The Kennedy assassination seemed to confirm Port Huron’s lament for a generation’s lost political innocence: As Tom Hayden and his SDS colleagues put it, “What we had originally seen as the American Golden Age was actually the decline of an era.” With Kennedy dead, there were no answers left in the old pragmatic liberalism—hence the New Left’s loathing of two of the last standing pragmatic liberals, Hubert Humphrey and Henry M. Jackson (not to mention the New Left’s rabid hatred of the most legislatively successful liberal president in history, Lyndon Johnson).

Take two measures of lost innocence, one large jigger of demonology, infuse that mix with the half-baked Marxist political theory of Herbert Marcuse, and what do you get? Within a few years after Port Huron and the Kennedy administration, what you got was the lethal political cocktail that took to spelling America with a k: Amerika, a Nazi-like authoritarian polity built on injustice at home and posing a grave danger to the world.

This rapid decline of the political imagination and discourse of the American left in the wake of Kennedy’s assassination led, in time, to another surprise: a reversal in the gravitational field of American political ideas. In 1949, Lionel Trilling, the literary embodiment of the old liberalism, deplored those American conservatives who “express themselves” only in “irritable gestures which seek to resemble ideas.” Less than twenty years later, it was the New Left that embodied Trilling’s grim description, while a revitalized conservative movement was taking its first steps in developing the economic, cultural, social-welfare, and foreign-policy ideas that would dominate American public life from 1980 through the attacks of September 11, 2001. During that period and down to today, conservatives and neoconservatives challenged Americans to bear great burdens to “assure the survival and the success of liberty,” as Kennedy had put it in his inaugural address, while many liberals who claimed the Kennedy mantle promoted various forms of neo-isolationism.

All of which, one suspects, was not quite what Jacqueline Kennedy, Arthur Schlesinger Jr., Ted Sorensen, and Teddy White had in mind when they created the myth of Camelot. Irrespective of what the courtiers thought, we can be quite sure that John F. Kennedy did not think of his administration the way SDS did: as the American analogue to Weimar Germany.

The Second Moment: Griswold v. Connecticut in 1965

In 1961, the executive director of the Planned Parenthood League of Connecticut, Estelle Griswold, opened a birth-control clinic in New Haven in collaboration with Dr. C. Lee Buxton, a professor at the Yale School of Medicine. Their purpose was to test the constitutionality of Connecticut’s 1879 law banning the use of contraceptives, a law that had never been enforced and which the U.S. Supreme Court had recently declined to review.

What appears to have been a carefully crafted strategy then unfolded: The state authorities acted; Griswold and Buxton were charged, tried, convicted, and fined $100 each; and the lower court decision was upheld by the relevant Connecticut appellate courts (including the splendidly named “Connecticut Supreme Court of Errors”). Griswold and Buxton then appealed to the U.S. Supreme Court, which accepted the case and, in the 1965 decision, Griswold v. Connecticut, struck down both the convictions and the Connecticut statute on the ground that the law violated what Justice William O. Douglas’ majority opinion called “the right to marital privacy.” Justice Douglas conceded that the Constitution did not mention a “right to privacy,” marital or otherwise, but famously opined that such a right was to be discerned in “penumbras formed by emanations” from the Constitution’s enumerated rights.

In dissent, Justice Potter Stewart described the Connecticut law he believed constitutional as “uncommonly silly,” which, in retrospect, was a phrase he could have used to describe Griswold v. Connecticut, adding “pernicious” to “silly.” For in terms of our legal culture, Griswold was the Pearl Harbor of the American culture war, the fierce debate over the moral and cultural foundations of our democracy that has shaped our politics for two generations.

As Edward Whelan has neatly put it, who knew that contraception could have such generative power? Griswold begat Eisenstadt v. Baird, the 1972 decision in which the court extended the protections of the “right of privacy” to nonmarried couples. Then Eisenstadt begat Roe v. Wade, in which the “right to privacy” was cited to strike down state abortion laws from sea to shining sea, in what Justice Byron White described as an exercise in “raw judicial power.” Roe, in turn, begat Planned Parenthood v. Casey, which positioned the “right to abortion” as a Fourteenth Amendment liberty right. Roe and Casey then begat the 2003 Supreme Court decision in Lawrence v. Texas, which struck down a state antisodomy statute, with Justice Anthony Kennedy making an explicit reference to Griswold’s “right to privacy” as “the most pertinent beginning point” for the line of reasoning that led the Court to Lawrence. And if Eisenstadt, Roe, Casey, and Lawrence were the direct descendants of Griswold, it is not difficult to see how Goodridge v. Department of Public Health, the 2003 Massachusetts Supreme Judicial Court decision mandating so-called “gay marriage,” was a collateral descendant of Justice Douglas’ discovery of a constitutional “right to privacy.”

This judicial artifact of the Sixties has had tremendous impact on our political culture. Just as the oral contraceptive pill facilitated the sexual revolution technologically, Griswold facilitated it constitutionally. Governmental indifference to contraception was soon construed to imply governmental indifference to abortion, via the misconstrual of abortion as a matter of sexual privacy rather than as a matter of public justice; and the “right to abortion” soon became a defining issue in our politics. “The ‘right to abortion,’ with its theme of sexual liberation,” as Hadley Arkes puts it, “has become the central peg on which the interests of the Democratic party have been arranged,” just as, “since the days of Ronald Reagan, the Republican party has become . . . the pro-life party in our politics.”

Careful observers will note here a profound inversion. If abortion and related life issues are in fact the great civil-rights issues of our time—in that they test whether the state may arbitrarily deny the protection of the law to certain members of the human community—then Griswold eventually led to a situation in which the Democratic and Republican positions on civil rights flipped, with members of today’s Democratic party playing the role that its Southern intransigents played during the glory days of the American civil-rights movement.

The Supreme Court was not the only actor in these momentous changes, of course. The invention of the oral contraceptive pill must rank with the splitting of the atom and the unraveling of the DNA double helix as one of the three scientific achievements of the twentieth century with world-historical impact. The sexual revolution was also influenced by trends in philosophy, particularly existentialism’s emphasis on authenticity. The inability of many modern moral philosophers to get beyond Hume’s fact-value distinction in order to think their way through to a contemporary form of natural-law moral reasoning (which would in turn have helped discipline the public debate on abortion) also played its role. The supine surrender of most religious authorities to the sexual revolution removed one cultural obstacle to the sexual revolution’s triumphant progress, which was in turn supported by developments (or, perhaps better, deteriorations) in popular culture.

Still, in measuring the impact of the Sixties on the politics of 2008, the legal consequences of Griswold must be underscored. Here the Supreme Court began to set in legal concrete the notion that sexual morals and patterns of family life are matters of private choice or taste, not matters of public concern in which the state has a legitimate interest. That this trend should have eventually led to claims that marriage is whatever any configuration of adults-sharing-body-parts declares it to be ought not have been a surprise.

Nor should it have been a surprise that the Court, having successfully claimed for itself the authority to write a “living Constitution” based on penumbras and emanations, should assume the roles of National Metaphysician and National Nanny (as it did in Casey, with its famous “mystery of life” passage and its hectoring injunction to a fractious populace to fall into line behind the Court’s abortion jurisprudence). The royal road to the imperial judiciary may not have begun with Griswold, but Griswold certainly accelerated the pace of the coronation procession.

The Third Moment: The Tet Offensive in 1968

The American war in Vietnam spanned the entire decade of the Sixties. The American war over Vietnam continues to this day, as President Bush found out last fall when he used the Vietnam analogy to warn against the likely consequences of a precipitous U.S. withdrawal from Iraq. “A risky gambit,” Time called it. “Nonsense,” said historian Robert Dallek. “Irresponsible and ignorant,” harrumphed Senator John Kerry. “Ludicrous,” sneered the Nation’s Robert Scheer. “Surreal,” opined the editors of the New York Times. Why, asked Senate Majority Leader Harry Reid, was Bush tying his “flawed strategies” to “one of the worst foreign-policy blunders in our nation’s history”?

To judge by the bludgeoning the president took, you’d have thought he had committed blasphemy. Which, in a sense, he had—for if there is any conviction on the left today that resembles the most stringent interpretation of biblical inerrancy among certain Protestant fundamentalists, it’s the left’s commitment to its narrative about Vietnam. That narrative, according to historian Arthur Herman, rests on four theses.

The first holds that an America obsessed with communism blundered mindlessly into an internal Vietnamese struggle in which no vital American interest was at stake. That obsession led the United States to fight a war against an indigenously supported native guerrilla movement for which the U.S. military was unprepared (the second thesis); so American forces resorted to barbaric tactics and then lied to the American people about them.

According to the third thesis in the canonical narrative, a losing struggle in the fetid jungles of Vietnam destroyed American troop morale and discipline; this disintegration led to rampant drug abuse, the murder of unpopular officers, atrocities like the My Lai murders, and a generation of physically and psychologically scarred veterans. Finally, in the fourth thesis of the left, America’s failure led to desirable effects, such as the reunification of Vietnam; and if atrocities happened after the communist takeover of South Vietnam and Cambodia, well, those atrocities were triggered by our meddling in affairs that were none of our business.

The Vietnam fundamentalists of the left have a serious problem, however, for the last decade’s worth of work by historians using primary-source materials from former Vietcong and North Vietnamese figures suggests that each of these four theses is wrong. The details of these historians’ work are interesting, for they point toward the sad conclusion that America seized defeat from the jaws of victory in Southeast Asia; but however one assesses that judgment, the ongoing effect of the canonical account of Vietnam on contemporary politics is unmistakable.

The antiwar domestic American politics of the Sixties were a volatile expression of what might be called, following the existentialist fashions of the time, the “politics of authenticity.” In the politics of authenticity, what counts is the nobility of my feelings; what does not count is evidence, and what is not required is an examination of conscience in light of the evidence. The politics of authenticity lead us by a short route to a public morality of feelings, impervious to data and dismissive of a moral calculus of possible consequences. Or to put it in Max Weber’s terms, the morality of intentions trumped the morality of responsibility in the American debate over Vietnam. The irresponsibility that characterized the Carter and Clinton administrations’ responses to the threat of global Jihadism—a fecklessness deeply influenced by the canonical Vietnam narrative—is one obvious result.

This irresponsibility in the name of putatively superior moral intentions and sensibilities has gotten worse in recent years, having been goaded to hitherto unimaginable extremes by the distorting psychological impact of what the American left has considered an illegitimate presidency since December 12, 2000. Harry Reid’s premature proclamations of defeat in Iraq—a defeat the impact of which the majority leader seemed to relish—are difficult to explain unless you understand that. For Reid and those of his persuasion, George Bush’s suggestion that a precipitous American withdrawal from Iraq would lead to bloodbaths similar to those in postwar Vietnam and Cambodia is blasphemy by a political heretic and usurper against the canonical account of America’s Vietnam and its revelation of the perils of American hubris.

The Tet Offensive of January 1968 was the point at which the liberal canonical account of America’s Vietnam, which was already shaping American journalism, began to have a marked impact on policy. Lyndon Johnson, taking Walter Cronkite’s misreporting of Tet seriously, lost heart; the Democratic party largely abandoned the war that John F. Kennedy had begun; public opinion, shaped by what now appears to have been some of the worst reporting of the television age, turned decisively against the war. Today, no responsible historian considers Tet anything other than a colossal military defeat for North Vietnam and the end of the Vietcong as a major force in the struggle for Vietnam’s future. But when David Halberstam (who, with Neil Sheehan, did more than anyone else to create the canonical narrative of Vietnam) died tragically this past year in an auto accident, not a single obituary notice I read suggested he had been terribly wrong about Tet or that his wrongheadedness had helped create a political situation that had had lethal consequences for millions.

The point here is not media-bashing. The point is that the canonical narrative continues to distort the worldview of many of those charged with responsibility for our national security. Senator Barack Obama may, for perfectly understandable political reasons, wish to “get beyond” the Sixties. But here is a question for Senator Obama, or anyone seeking the awesome burden and responsibility of the American presidency at this moment in history: With whom do you stand on the question of Vietnam and its relationship to our global responsibilities today? Do you stand with the fundamentalists, impervious to evidence? Or would the new, evidence-based historical-critical approach to understanding America’s war in Vietnam shape your thinking about American responsibilities in the twenty-first-century world?

The Fourth Moment: The Kerner Commission in 1968

The Sixties began with the American civil-rights movement at the height of its classic phase; the Sixties ended with the leaders of classic civil-rights activism dead or marginalized. A movement for national reconciliation in a color-blind society had been seized by race-baiters who preached a gospel of victimization and identified the struggles of black America with the revolutionary theories of such Third World ideologues as Frantz Fanon.

This happened in an astonishingly short period of time. When Lyndon Johnson signed the Civil Rights Act of 1964, he shared presidential pens with Martin Luther King Jr. and Roy Wilkins; within half a decade, King was dead, men like Wilkins were charged with being “Oreos” by the new black militants, and a culture of victimization had settled like a thick fog over America’s inner urban areas. Dr. King’s dream of a nation come to the mountaintop of justice had been displaced by chants of “Burn, baby, burn!” Equality of opportunity was passé; racial quotas masqueraded under the euphemism of “affirmative action”; King’s righteous demand that his children be judged by the content of their character rather than the color of their skin was inverted by race-hustlers and shakedown artists—an inversion subsequently validated by activist judges. The result was the alienation of the majority population and the descent of American inner cities into a miasma of broken families, illegitimacy, crime, substance abuse, and poverty.

Why and how one part of the American drama of race played out this way can be debated. But the fact that it happened continues to shape the American politics of the early twenty-first century. Perhaps the pivotal moment was the Kerner Commission Report of 1968, more formally known as the Report of the National Advisory Commission on Civil Disorders, created by President Johnson to determine the cause of the racial riots that had burned across America in the summer of 1967.

By 1967, the United States had faced the original sin of its founding and was making immense strides in building what is today the most racially egalitarian major nation on the planet. Segregation of public institutions had been declared unconstitutional and segregation of public facilities outlawed. The poll tax in federal elections had been banned by the Twenty-Fourth Amendment, Americans of African descent had been rapidly enfranchised, and, as the 1964 Democratic National Convention demonstrated, black America had begun to play a significant role in national politics. That all this had been accomplished by a religiously grounded movement of national moral and legal reform, in which blacks and whites worked, marched, and bled together, held out the prospect of further progress in sustaining racial equality before the law, creating equality of economic opportunity, and strengthening the culture of responsibility throughout American society.

The Kerner Commission, however, seemed blind to many of these positive dynamics, proposing an analysis in which black “frustration” and white “racism” were the two forces shaping American urban life. Black America was a victim, and a victim could not be held morally responsible for lashing out against his victimization. According to the Kerner Commission’s analysis, racist white America was similarly bereft of moral resources, such that government, rather than the institutions of civil society that had been so central to the classic civil-rights movement, had to become the principal agent of enforced social change in order to deal with the crisis of an America “moving toward two societies . . . separate and unequal.”

While the Kerner Commission was rewriting the national narrative on civil rights in favor of a storyline of racial victimization and irresistible irresponsibility—precisely what King, Wilkins, and others of their generation had fought against—what all this meant was being played out in the furious 1968 controversy over local control of public-school faculty appointments in the Ocean Hill-Brownsville neighborhood of Brooklyn. By the time things simmered down, Brooklyn’s inner-city schools were in considerably worse shape, white liberals had become accustomed to making excuses for black violence, and the old alliances between the civil-rights movement, on the one hand, and the American labor movement and organized American Jewry, on the other, had been put under severe strain. Albert Shanker and the American Federation of Teachers may have won some of the battles in Brooklyn, but they lost the larger war, as American liberalism, forced to choose between maintaining its classic emphasis on a race-blind society and keeping pace with the new black militancy, eventually chose the latter. As for the black-Jewish alliance, that, too, shattered over time, to the point where Jesse Jackson could refer to New York as “Hymietown” and still remain both a player in Democratic politics and a feared figure in corporate board rooms intimidated by racial blackmail.

The emergence of what presidential historian Steven Hayward has called a “therapeutic victim culture,” which would have a profound impact on American politics, began with the collapse of the classic civil-rights movement in the mid-Sixties, which is to say, at its greatest moment of triumph. The classic civil-rights movement was determined to reshape America through moral reason; distorted by the victim culture into a twisted parody of itself, it gave way to a moralism self-consciously detached from reason, a moralism that would prove incapable of calling anyone, black or white, to the great cause of equal justice for all.

As usual, those who paid the heaviest price were those with the least resources to withstand the breakdown of moral reason and the culture of responsibility in entire neighborhoods: the underclass. But among those with the resources to indulge irresponsibility, the new, late-Sixties’ culture of victimization would eventually set in motion two trends in American public life that are with us in 2008: the gay movement (which successfully, if quite implausibly, identified itself with pre-civil-rights-era black America) and leftist celebrity activism (which would end up providing political cover for the likes of Saddam Hussein and Hugo Chavez).

The Fifth Moment: The Publication of The Secular City in 1965

At the beginning of the Sixties, the National Council of Churches, ecumenical embodiment of mainline Protestantism, was as secure in the pantheon of influential American institutions as the American Medical Association and the American Bar Association. Thirty years later, to cite Richard John Neuhaus’ familiar formula, the mainline had become the oldline and was on its way to being the sideline.

That unenviable position now achieved, the National Council of Churches has been reduced to renting a few offices at 475 Riverside Drive, the famous “God Box” it had once occupied to capacity. As mainline Protestantism ceased to be a culture-forming force in American public life, the void was filled by a new Catholic presence in the public square and, perhaps most influentially in electoral terms, by the emergent activism of evangelical, fundamentalist, and Pentecostal Protestantism in what would become known as the Religious Right, a movement that has formed a crucial part of the Republican governing coalition for more than a quarter-century. The pivotal moment in this tectonic shift in American religion’s interface with American public life came in the Sixties, when the mainline imploded, theologically and politically.

The political side of the tale is a familiar one. What had begun as mainline Protestant support for the classic civil-rights movement quickly morphed into liberal Protestant support for black militancy, the most strident forms of anti-Vietnam protest, the most extreme elements of the women’s movement and the environmental movement, the nuclear-freeze and similar agitations, and, latterly, the gay-liberation movement. All of which must be considered a sadness, for it was the mainline that provided moral-cultural ballast to the American democratic experiment from the colonial period through World War II.

The theological side of the ledger was embodied by Harvey Cox’s 1965 bestseller, The Secular City, with its argument for a radically secularized Christianity in which the world sets the agenda for the Church. Cox’s book has not worn well over time; hardly anyone reads it today, save as a period piece. In its time, however, The Secular City put into play nearly all the major themes that, while they led mainline Protestantism into religious marginality, nonetheless had a decided influence on the politics of the Sixties, and of today.

The cult of the new; the fondness for revolutionary rhetoric; evil understood in therapeutic categories; worship conceived as self-realization; the celebration of action detached from either contemplation or serious intellectual reflection; insouciance toward tradition; moralism in place of moral reasoning; the identification of human striving with the in-breaking of the Kingdom of God—whatever Harvey Cox’s intentions, these are the things that people learned from The Secular City and its sundry offspring in the world of liberal American religious thought. By the time the Hartford Appeal for Theological Affirmation tried to redress the balance in 1975, the damage had been done. The Secular City helped accelerate the secularization of American elite culture, which created not only new openings in the public square for more-traditional religious bodies but also new fault lines in our politics—fault lines that are as visible as this morning’s headlines and op-ed pages.

The Sixth Moment: The Rise of Environmentalism in 1969

That human beings cannot live without transcendent points of spiritual and moral reference is nicely illustrated by the fact that, as liberal mainline Protestantism was collapsing, those who previously might have been expected to have been among its staunch adherents found a new god: the earth. The transformation of the quite sensible and admirable American conservation movement into an “ism”—environmentalism—is best understood, I suggest, as a matter of displaced religious yearning. Having found the God of liberal Protestantism implausible or boring, American liberal elites discovered a new deity, the worship of which involved a drastic transformation of nearly every sector of American life by the new liberalism’s favorite instrument of salvation, the state.

Inspired in part by bestsellers such as Rachel Carson’s 1962 Silent Spring, conservation-become-environmentalism evolved in the Sixties into a movement highly critical of technology and its impact on global ecology and deeply skeptical about markets; it also cross-pollinated politically with the antiwar movement. Which is at least a bit strange, in that it was an artifact produced by the much-deplored military-industrial complex—the photographs of Planet Earth taken by the astronauts of Apollo VIII on their Christmas 1968 circumnavigation of the moon—that gave the new environmentalism its icon and something of its emotional power.

That irony notwithstanding, antiwar activist John McConnell, having seen the pictures taken by Frank Borman, Jim Lovell, and Bill Anders from the command module windows of Apollo VIII, created the “Earth Day flag” from one of those photos and in 1969 proposed a global holiday in celebration of the earth to a UNESCO conference being held in San Francisco. The Earth Day Proclamation signed by U Thant, Margaret Mead, and others followed, as did the now annual celebration of Earth Day.

The new environmentalism was peopled by many of the same activists who had been instrumental in the anti-America’s-war-in-Vietnam movement, and, in subsequent decades, the new environmentalism has displayed characteristics similar to the fundamentalism or fideism of those who cling to the wreckage of the conventional narrative of America-in-Vietnam. Among those characteristics (in addition to a certain apocalypticism) is an imperviousness to contrary data and scientific evidence. As the Danish statistician Bjørn Lomborg has shown in study after study, life expectancy is increasing on a global basis, including in the Third World; water and air in the developed world are cleaner than five hundred years ago; fears of chemicals poisoning the earth are wildly exaggerated; both energy and food are cheaper and more plentiful throughout the world than ever before; “overpopulation” is a myth; and the global picture is, in truth, one of unprecedented human prosperity.

Acknowledging this, however, would call into question the revelation vouchsafed to another of the new environmentalism’s ideological allies, the population-control movement: namely, that people are a pollutant. That pernicious idea, born of the earlier progressivist eugenics movement and brought to a popular boil in the Sixties by evidence-light propagandists like Paul Ehrlich, continues to affect U.S. foreign-aid policy to this day. Now, as always, the worship of false gods tends toward bad politics.

A Decade Still Much with Us

Taken together, these six moments suggest that something of enduring consequence happened to liberal politics, and thus to American political culture, during the Sixties. A politics of reason gave way to a politics of emotion and flirted with the politics of irrationality; the claims of moral reason were displaced by moralism; the notion that all men and women were called to live lives of responsibility was displaced by the notion that some people were, by reason of birth, victims; patriotism became suspect, to be replaced by a vague internationalism; democratic persuasion was displaced by judicial activism. Each of these consequences is much with us today. What one thinks about them defines the substratum of the politics of 2008, the issues-beneath-the-issues.

That this trajectory was unaffected by the victory of democracy and the free economy in the Revolution of 1989 and the collapse of the Soviet Empire tells us something important about the post-Sixties phase of the story. Beginning in the late Sixties, American liberalism followed the path of the global Left, substituting social issues and lifestyle libertinism for its previous concerns with economics and participatory politics. American liberalism, like its European counterpart, adopted the strategy of the Italian Marxist theorist Antonio Gramsci and began a long march through the institutions—first the universities, the media, the philanthropies, the religious communities; today the institutions of marriage and the family. That this has had the most profound impact on our politics is obvious: The American culture war, which is one of the preeminent issues-beneath-the-issues, shapes the public discourse on both domestic and foreign-policy questions every day.

The transformation of the pragmatic, results-oriented, rationalist liberalism of John F. Kennedy, first into the New Left and subsequently into postmodern American liberalism, put the imperial autonomous Self at the center of one pole of American public life, where it displaced the notion of the free and virtuous society as the goal of American democracy. This raises the most profound and urgent questions about the future. Can a common culture capable of sustaining institutions of self-governance be built out of a congeries of autonomous selves? Can a politics detached from moral reason give reasons why toleration, civility, and persuasion are superior to coercion in doing the public business? What can the politics of autonomy—which is the distillation of the politics of the Sixties—say in the face of the existential threats that will confront the next president of the United States and the next Congress: the threat of Jihadism (which has a very clear, and very different, idea of the good society), and the threat of a slow descent, via biotechnology, into the stunted humanity of the brave new world? Is freedom understood as autonomy and willfulness a freedom worth sacrificing for? Or will only a renewal of the idea of freedom for excellence—freedom tethered to moral truth and ordered to goodness—see us through the political and cultural whitewater of the early twenty-first century?

The Sixties are indeed much with us, both for good and for ill. We should not forget that part of the Sixties that called the American people to live nobly in the defense and promotion of liberty, rightly understood. But the large question facing us today—the issue-beneath-the-issues in 2008—is whether the admirable legacy of the Sixties will win out over the less happy residues of that turbulent decade.

George Weigel is Distinguished Senior Fellow of Washington’s Ethics and Public Policy Center and the author, most recently, of Faith, Reason, and the War Against Jihadism. This essay is adapted from the William E. Simon Lecture of 2008.