
From all appearances, it is now back in style to be critical of American individualism. Indeed, that critique has never gone entirely out of style, and for very good reasons. But views on these matters also seem to follow cycles which, if not of Schlesingerian predictability, are nevertheless fairly regular; and for the past three decades the language of individual rights, entitlements, and self-realization has nearly always prevailed in the field of public discourse. There are signals emanating from various quarters, including even feeble and intermittent signals from the “communitarian” elements in the Clinton Administration, that this may have begun to change, at least on a rhetorical level. But if we are to have a national discussion of individualism, and if such a discussion is to move beyond simplistic platitudes, which often serve only to veil the instrumental partisanship that really motivates them—the ritual invocation of the 1980s as a “decade of greed,” for example—we will have to think more deeply and carefully about what American individualism really is, and about the proper social preconditions for combating it.

This is not always easy to do. In his classic study Childhood and Society, the psychologist Erik Erikson observed that those who tried (as he did) to generalize about the character of Americans came up against a peculiar obstacle. “Whatever one may come to consider a truly American trait,” Erikson mused, “can be shown to have its equally characteristic opposite.” An intriguing thought: that the genius of American culture resides not in anything monolithic, but rather in its capacity for generating duality, even polarity. This thought seems to be supported by many of the most distinguished observers of American social character, from Alexis de Tocqueville on, who often take the country to task (or praise it) for diametrically opposite traits—sometimes even in the same work.

One can readily apply Erikson’s observation to a variety of issues—for example, the tension between secularism and religious faith in American culture. But nowhere is it more relevant than in the perpetual opposition between individualism and social conformity, that great chestnut of American cultural analysis. The same fundamental questions arise again and again: are Americans excessively prone to willful individualism, or, conversely, are Americans excessively prone to mindless conformity? The best answer would seem to be, Yes, take your pick, for you can find ample support for either assertion. And if you like, feel free to take one of each. Americans are seen to be predisposed to a radical individualism that holds all social obligations and traditional forms of authority to be of no account, yet they also are seen to be imprisoned within an anxious and timid social conformism that smothers autonomy, expressiveness, and creativity in the name of sociability and order. Can it be that both are somehow enduringly true?

Tocqueville seemed to think so. One of the many virtues of his Democracy in America was its ability to formulate with precision both halves of the dualism, and the sound reasons for fearing each. Tocqueville distrusted individualism because he saw in it the seeds of a completely atomistic society in which every individual would withdraw into the sanctity of his own private world, with the consequence that all high-mindedness, public-spiritedness, and even public life itself would shrivel and die. But he also feared that America’s egalitarian dogma would condemn it to a culture of pervasive mediocrity and envious sidelong glances, and would eventually lead it to embrace the secure and paternalistic collectivism of a centralized government—and thereby extinguish the liberty that he prized above all else. Although Tocqueville did not explore the connection between them systematically, he left the clear impression that these two pathologies of American democratic culture were not merely alternatives, but were somehow linked to one another, complementary expressions of a single modern condition.

Other writers, of course, took less subtle positions; and over the course of the nineteenth century, the prevailing views of different periods varied dramatically. Ralph Waldo Emerson, for example, whose intellectual influence in antebellum America was unparalleled, played the role of high priest for a religion of radical individualism and self-reliance, and argued that our souls faced no greater peril than the “vulgar” demands placed upon them by our social existence. Society itself, he proclaimed, was a “conspiracy against the manhood of every one of its members,” which served to “untune and dissipate the brave aspirant” and shunt him off into the shame of abject conformity. But in the years after the Civil War, social thinkers sounded a markedly different note in response to the increasingly organized conditions of their rapidly industrializing society. Edward Bellamy, author of the best-selling utopian novel Looking Backward (1888), reversed Emerson’s view completely, disparaging the purely individual life as a “grotto” and exhorting his readers instead to sacrifice themselves for the sake of the Nation, the highest expression of human solidarity. Similarly, pioneering sociologist Lester Frank Ward argued in 1893 that the reign of individualism had passed forever, and that the time had arrived for society to “imagine itself an individual” and “take its affairs into its own hands and shape its own destiny.”

The more communitarian outlook suggested by Bellamy and Ward reached a culmination of sorts in the first two decades of the twentieth century, especially in the social vision of Progressive writers like Herbert Croly, Jane Addams, and John Dewey. But the ascendancy of this outlook proved short-lived. The rise of the Hitler and Stalin regimes, and of the anti-individualist ideologies advanced to legitimate them, put any such “socializing” tendencies on the defensive, and gave renewed vigor and credibility to the earlier Emersonian exaltation of the autonomous individual—which seemed only to intensify in the years after the Second World War. Indeed, the concept of “totalitarianism,” so vital an element in the political discourse of the past half century, not only profoundly affected interpretations of foreign governments, it decisively affected the interpretation of social structure and social psychology in the United States itself. In the postwar years, it became the first order of business for sociologists, historians, theologians, and other social critics to alert Americans to the self’s imminent peril, and provide resources for its defense.

Such was the gravamen of influential postwar works like David Riesman’s The Lonely Crowd or William Whyte’s The Organization Man, which sought to guard the endangered self against what were seen as the totalistic demands of an organized and conforming world. The latter of these works also affixed a memorable label to the human results of those demands. The Organization Men who toiled in the modern white-collar paperwork factory did not merely work for the organization, Whyte asserted; they “belong to it as well.” Their fate exemplified the process of “collectivization” that had affected almost every field of work in modern America. Abandoning the Protestant Ethic of their elders, which had prescribed hard work and full-throttle competition as the route to individual salvation, they preferred the cozy security of a “Social Ethic” that “makes morally legitimate the pressures of society against the individual.” Man, in this view, is essentially a social creature who only becomes worthwhile “by sublimating himself in the group”; and the insistence upon individual freedom is a conspiracy against the solidarity of society.

Such works as The Lonely Crowd and The Organization Man contributed immeasurably to the tenacious image we have of the fifties as an era of torpid complacency and other-directed conformity. But that image is, of course, far from being the whole truth, especially about the intellectual ferment of those years. At the opposite end of the spectrum from the Organization Man’s Social Ethic, yet in their own way equally representative of the time, were the frenzied neo-Whitmanesque effusions of Beat culture, and the hipster-existentialism of a work like Norman Mailer’s essay “The White Negro” (1957), which would formulate the threat of “social totalitarianism” even more emphatically. One could barely “maintain the courage to be individual,” Mailer lamented, in these “years of conformity and depression,” when “a stench of fear has come out of every pore of American life,” and the only example of courage “has been the isolated courage of isolated people.” Onto this bleak scene, he proclaimed, had come the Hipster, who realizes that, given the poor choice we are offered between “quick death by the State” or “slow death by conformity,” there is only one authentic way left to live: “to divorce oneself from society, to exist without roots, to set out on that uncharted journey into the rebellious imperatives of the self.” The social alternatives Mailer saw could not have been more stark and simple:

One is Hip or one is Square (the alternative which each new generation coming into American life is beginning to feel), one is a rebel or one conforms, one is a frontiersman in the Wild West of American night life, or else a Square cell, trapped in the totalitarian tissues of American society, doomed willy-nilly to conform if one is to succeed.

At stake was nothing less than “the liberation of the self from the Super-Ego of society”; and indeed the “nihilism of Hip proposes as its final tendency that every social restraint and category be removed.” Old Victorian notions of individual agency and responsibility are also removed, for “the results of our actions are unforeseeable, and so we cannot know if we do good or bad.” The Hipster sees “the context as generally dominating the man . . . because his character is less significant than the context in which he must function.” Character, that great Victorian pillar, is at best a “perpetually ambivalent and dynamic” concept, existing in “an absolute relativity where there are no truths other than the isolated truths of what each observer feels at each instant of his existence.” So the completely unencumbered, isolated, solitary self, obedient above all else to its own inner promptings, is also in thrall to a kind of fatalism. The Hipster surrenders control over his own actions and “lives with death as immediate danger,” the complete existential hero.

If nothing else, this formulation shows how the fantasy of devouring social totalism and the fantasy of an unencumbered self arise together and stand in symbiosis. But the rebellious thrill that accompanied Mailer’s words when they were written seems distant today. What were once vices are now habits, and in a society increasingly permeated by anomie and senseless mayhem, and by a popular culture that unceasingly glorifies them, Mailer’s words seem almost tamely descriptive. One has seen or heard a thousand variations on them by now, in record liner notes, TV commercials, and celebrity interviews.

But precisely their tameness is an indication of the profound changes that have occurred in the intervening years. Not only Mailer’s Hipster, but most men and women in modern Western societies are increasingly guided in their personal moral judgments by what Alasdair MacIntyre calls “emotivism,” the doctrine that moral evaluations ultimately represent nothing more than expressions of personal preference, attitude, or feeling. This would seem to promise a chronically anarchic and ungovernable world. Yet, as MacIntyre shrewdly points out, the modern organizational world is easily able to accommodate such potentially disruptive moral subjectivism. It does so by requiring of us a split-mindedness, which abandons the ideal of a whole human life characteristic of traditional social orders. Instead, the modern social world is “bifurcated” into a “realm of the organizational” and a “realm of the personal,” each of which obeys an entirely different moral calculus. In the former realm, “ends are taken to be given, and are not available for rational scrutiny”; in the latter, judgments and debates about “values” occur, but, rather like matters of taste, are not amenable to rational resolution.

MacIntyre’s point bears directly upon the tension between the Hipster and the Organization Man. There is constant discussion in bifurcated societies about “a supposed opposition between individualism and collectivism,” he observes, but in fact, such debates are superficial. The crucial fact is the one upon which the opposing parties agree: “[T]here are only two alternative modes of social life open to us, one in which the free and arbitrary choices of individuals are sovereign and one in which the bureaucracy is sovereign, precisely so that it may limit the free and arbitrary choices of individuals.”

In light of this agreement, the policy debates dominating the politics of modern societies tend to vacillate between “a freedom which is nothing but a lack of regulation of individual behavior” and “forms of collectivist control designed only to limit the anarchy of self-interest.” Such is the usual form taken by debates about “individualism,” and such is the reason why the discussion always remains so shallow and unedifying. In contemporary society, organizational bureaucracy and individualism are partners as well as antagonists. Indeed, MacIntyre agreed, it is in the cultural climate of organizational bureaucracy “that the emotivist self is naturally at home.” The Organization Man and the Hipster only think they are opposites. In a deeper sense, they are more like siblings: mutually defined, mutually enabling.

The fundamental problem is this: that large modern organizations are, by their nature, incapable of serving as genuine moral communities, able to embody moral authority and moral connectedness. For MacIntyre, this problem was especially pronounced in the case of the modern nation-state, which he assaulted directly, comparing the present situation to the declining years of the Roman Empire. In that earlier time, “men and women of good will turned aside from the task of shoring up the Roman imperium,” preferring instead to construct “new forms of community within which the moral life could be sustained” despite the surrounding darkness and barbarism. “[W]e too have reached that turning point,” he declares darkly, and so our task is “the construction of local forms of community within which civility and the intellectual and moral life can be sustained through the new dark ages which are already upon us.” The reconstruction of “community,” of the possibility of social units mediating between radical individuals and the vast megastructures in which they are embedded, units in which connectedness and moral responsibility were once again possible, had become paramount.

By recommending a turning away from the imperium, MacIntyre was asserting, in the tradition of Plato and Aristotle, the connection between a particular political regime or social order and a particular kind of soul. He was also implicitly rejecting the other time-honored answer to the individualist-collectivist quarrel: the idea of the national community, which had been a staple of progressive and liberal political rhetoric. In the United States, that idea’s political high-water mark came in the presidency of Lyndon Johnson, who was committed, both in rhetoric and policy, to the conception of the nation as a community, even as a family, very much in contrast to the imagery of self-reliant frontier individualism. Unfortunately, however, Johnson presided over the most thorough shattering of national unity since the Civil War, a development whose repercussions are still very much with us. The Vietnam War set off a profound weakening of the idea of national community, and undermined the authority of what Robert Wuthnow has called the “legitimating myths” of American national purpose and destiny; and subsequent events have not entirely restored these articles of national faith. It is notable that the symbolism of the Vietnam Veterans Memorial in Washington pointedly refuses any allusion to the nation, offering instead the engraved names of the thousands of dead or missing individuals. Visitors to the Vietnam Memorial do not come to gaze upon a grand monumental structure, but to pick out a familiar name on a collective tombstone. Like it or not, the Memorial fittingly represents our condition.

Even without Vietnam, the opposition to the idea of national community had been gathering force on its own in the 1950s and 1960s, from a variety of positions. The old Republican right, epitomized by the doughty Robert A. Taft of Ohio, had never accepted the understanding of the nation inherent in the New Deal. But this was not only a sentiment of the right. From their beginnings in the Port Huron Statement, the intellectuals of the New Left evinced a strong inclination towards decentralized, small-scale, participatory institutions, which would stand in stark contrast with the gigantic corporate, government, and academic bureaucracies that they saw dominating American life. The Black Power movement also advocated community self-governance and empowerment, and some mainstream politicians, notably Robert Kennedy, looked upon the idea with favor, seeing it as latter-day Jeffersonianism. The resurgence of ethnic sensibility in Northern cities, partly stemming from a proud or fearful distaste for the nation’s growing homogenization, partly stimulated by resentment of bureaucratic or judicial intrusions into settled neighborhoods, also had the effect of asserting particularism and ethnic identity over against the national identification. And on the level of domestic national politics, beginning with the administration of Johnson’s successor, Richard Nixon, through the successive administrations of Ford, Carter, Reagan, and Bush, the rhetoric of streamlining, decentralization, and New Federalism regularly took precedence over the appeal to a consolidated national community.

A notable exception to this last generalization, Jimmy Carter’s use of William James’ notion of a “moral equivalent of war” as the rhetorical frame for his comprehensive national energy policy, was not a notable success. In the era of Theodore Roosevelt, such appeals would surely have fallen upon more approving ears. In his 1979 “crisis of confidence” speech, Carter warned that the nation had embraced “a mistaken idea of freedom” and was heading down the path of “fragmentation and self-interest,” of “self-indulgence and consumption.” He urged that Americans instead “rebuild the unity and confidence of America,” because only by following the “path of common purpose” could we come into an experience of “true freedom.” Resonating with the time-honored language and imagery of national community, solidarity, and self-transcendence (“There is simply no way to avoid sacrifice”), Carter’s eloquent political jeremiad turned out to be dramatically out of step with its historical moment. Surely one reason for this response was the waning appeal of the particular ideal of national community (though not necessarily of nationalism per se) that he invoked. Mr. Clinton’s more ambitious health care reforms may meet a similar response from skeptical Americans who distinguish between sacrificing for their nation and surrendering to their government.

To be sure, a movement towards political decentralization has so far been more rhetorical than actual. The various New Federalisms have been notoriously shallow in practice, and the size of the Federal government has continued to grow rapidly during the very administrations that espoused reduction. Such may well continue to be the case; and the moral Hipsterism that seems to dominate our culture may also continue to flourish. Yet it seems possible that the social ideals capable of energizing Americans have begun to change, in roughly the same direction that MacIntyre’s words suggest: towards disaggregation, dispersal, decentralization, and at the same time towards more intense experiences of community—that is, toward identification with social forms intimate enough to bridge the gap between isolated, morally irresponsible selves and ubiquitous, morally irresponsible organizations. For better or worse it is the support group, not the nation-state, that looks to be the model of revitalized social organization now and in the near term.

Progressives like Herbert Croly and Theodore Roosevelt had assumed that there could be such a thing as a national community, guided by a strong sense of the public interest and common weal, which could substitute for the small-scale community rendered obsolete by the inexorable consolidating effects of the industrial age. But their dreams were thwarted, because, as political scientist Michael Sandel argues, the nation-state has simply “proved too vast a scale across which to cultivate the shared self-understandings necessary to community.” Since abandoning the Progressive notion of a common good, Sandel observes, we have limped along in a “procedural republic,” in which the nation-state, after endowing individuals with “rights and entitlements,” relies upon disembodied procedures, rather than on substantive and authoritative policies, to govern. This arrangement offers the individual little satisfaction or freedom. “It is,” Sandel explains, “as though the unencumbered self presupposed by the liberal ethic had begun to come true—less liberated than disempowered, entangled in a network of obligations and involvements unassociated with any act of will and yet unmediated by those common identifications or expansive self-definitions that would make them tolerable.” Such is the symbiosis of Hipster and Organization Man in action.

Are there other, more hopeful ways than withdrawal to deal with the problems engendered by the moral calculus of emotivism? Perhaps the most prominent work to grapple with this complex of issues in recent years was Habits of the Heart (1985), a collaborative exploration of contemporary American society whose principal author was the distinguished sociologist Robert Bellah. Although the book was a kind of successor in genre to The Lonely Crowd, it was in the end a far more unsparing analysis of contemporary America. The decline of the public realm, the disappearance of civic consciousness, the disintegration of marriage and family life, the increasingly tenuous and openly self-serving character of human relations, the near-disappearance of religious values from our shared existence: all were laid out in considerable detail, using extensive interviews and case studies of “representative” Americans. Bellah’s criticisms echoed, in many particulars, those embodied in the traditionalist-conservative critique of individualism, exemplified by the works of Russell Kirk and Robert Nisbet. Yet Bellah was anxious to offer ways of reconstituting community while remaining firmly situated within the liberal and pluralist tradition. The result of this balancing act, however, was a book that issued a resounding call for a restored “framework of values” upon which Americans can agree as a national community and upon which they can rebuild a common social life—but that was unfailingly nebulous in specifying what those values might be. A work that set out to address the disintegrative tendencies of contemporary American culture thus ended up capitulating to those tendencies at every crucial point.

At the heart of Habits’ argument was its call for a return to America’s “biblical” and “republican” traditions to counterbalance the dangerously amoral, selfish, emotivist, radical-individualist tendencies of unrestrained liberalism. Yet the book devoted little space to an explication of those traditions, and was vague as to the authority that these traditions were to possess. Habits preferred to frame the issue linguistically—as one of enhancing public discourse by bolstering Americans’ “second language” of republican and biblical forms, and by pointing out that language’s embeddedness in the shared “narratives” of “communities of memory,” as a counter to their “first language” of radical individualism. But one can be forgiven for asking: what authority do the authors believe should be given to, for example, the Ten Commandments, or to biblical teachings on sexuality? How much authority should the local community be permitted in establishing, for example, the boundaries of permissible expression, or regulating its own public schools? Or were those “biblical and republican” traditions to be appropriated piecemeal, purged of any elements that ran contrary to conventional wisdom? If so, how did such selectivity differ appreciably from the liberalism the authors were criticizing?

Habits took a similar approach to the two traditions it affirmed, hoping to gather the fruits of their cohesive effects without paying the price of accepting their authority. But surely the term “biblical tradition,” for example, demands far more than that. The authors of Habits seem to envision strong personal morality without the taint of discipline or intolerance, strong communal and civic values without insularity or particularism, strong commitments without sanctions against those who disdain them, strong national self-esteem without national pride, patriotism without chauvinism, and so on. But anyone who takes seriously the binding power of the most fundamental sociological concepts will have trouble seeing this wish list as more than insubstantial word-combinations, wholly without plausible historical precedent. Tradition without tears is likely to be no tradition at all.

Another interesting, and very different, challenge to individualism has emerged out of the postwar feminist movement, in the form of an assertion that the ideal of individual autonomy is characteristically male. Individualism, in this view, far from being universal, is highly gender-specific; and as American intellectual and cultural history is rewritten, and the constitution of present-day American society changes, that gender-specificity will be revealed, relativized, and transcended. But such a view, though undoubtedly not without historical support—since eighteenth-century notions of individualism generally did not include women within their purview—seems hard to credit without a good deal of qualification, once modern feminism’s own history and its characteristic liberatory agenda are taken into account.

To begin with, Betty Friedan’s The Feminine Mystique (1963), the critical text in the postwar intellectual and political revival of feminism, rested upon assumptions strikingly similar to those of The Lonely Crowd, The Organization Man, and other postwar “guardians of the self.” It consistently praised concepts of autonomy, independence, liberation, and emancipation, warned against social tyranny, and stressed the demolition of barriers to women’s equal economic opportunity and to full and separate human self-realization. Such a line of argument also reflected a longer tradition of American feminist thought, stretching back to Margaret Fuller’s stubborn call for self-sufficiency, and to the language and style of the 1848 Seneca Falls Declaration of Sentiments. It was this line that the postwar women’s movement brought so successfully into the political and intellectual environment of that era.

A single-minded emphasis upon equality, however, has never been the whole story with modern feminism. The struggle for political rights often warred with, but never entirely eclipsed, a struggle for distinctiveness, for a respectful recognition of women’s dramatically different perspectives and values; and the view of individualism as a gender-specific phenomenon has descended from this latter imperative. Central to that effort is the work of educational psychologist Carol Gilligan. Trained in Eriksonian developmental psychology, Gilligan was struck by the degree to which studies of moral development omitted women from their samples, and omitted a consideration of gender from their theoretical framework and research design. The Eriksonian Bildungsroman, built around the passage from infantile dependence to adult autonomy, reflected a conception of adulthood that favored the separateness of the individual self over its connection to others. But women, Gilligan proposed, showed a strikingly different trajectory of moral evolution and a different style of moral judgment, and researchers who habitually treated the male pattern as “normal” needed henceforth to take respectful account of women’s difference. For example, she argued, women’s perception of self is much more consistently embedded in relationship with others than was the case for autonomy-minded men; hence their moral decisions are made contextually, rather than by reference to an abstract and impersonal standard of justice.

Gilligan’s work found broad and immediate public resonance when it first appeared in the late 1970s; and its underlying theme—the effort to connect the feminist movement with the search for meaningful ways of recovering social connectedness in an autonomy-obsessed, individualistic American culture—surely had much to do with that success. Yet her work has also been sharply criticized as nothing more than a postfeminist social-scientific rationalization for a return to Victorian womanhood. Her case says something about the dilemmas facing a movement which strives to offer a model of relatedness that could supplant “male” models of individualism—even as it insists upon its own rightful participation in the very same thrust toward individual empowerment. Feminism finds itself caught between the desire for unrestricted latitude of action and the desire to retain the distinctive assets of women’s experience—between an aspiration to secure the kind of personal autonomy that had formerly been the exclusive province of men, and an insistence that women are crucially different from men and that women’s heightened sense of mutuality and relationship is the central feature of that difference.

The dilemmas of modern feminism, therefore, form a particular case within the more general dilemmas posed by individualism. The task facing writers like Gilligan, Mary Ann Glendon, Jean Bethke Elshtain, Elizabeth Fox-Genovese, and others who have approached the problem in this way is that of reconciling the cultural (and in some cases biological) implications of difference, which they feel committed to uphold, with the politics of equality and individual rights. Each has mapped some element of a route out of individualism; but it is not clear how consequential such initial forays can be unless an authoritative political and moral language, other than that of individual rights, can be admitted to the public arena. Fox-Genovese in particular has complained that feminism itself has increasingly become one of the most pernicious expressions of the individualistic ethos in American culture, offering an “atomized” view of society, a “celebration of egotism,” and a denial of “the just claims of the community.” She would hardly be the only one to feel defeated if feminism accomplished nothing more than to secure women’s equal right to be Hipsters. Yet that result seems likely so long as there is no authoritative way left of talking about obligations that transcend the self, and about the prerequisites of moral community.

It is perhaps encouraging, then, that so many in the recent generation of American social thinkers seem preoccupied by questions of community and solidarity, rather than individualism and autonomy. But the ultimate impact of such thinking appears uncertain. It is unlikely that individualism will be effectively challenged merely by an earnest effort to invest more moral energy in the massive social entities within which we move. If anything, the opposite effect seems more likely, at least in the short run. We live in an era in which nations and empires are experiencing, or facing the prospect of, disaggregation, disintegration, dissolution into their constituent elements. This is true not only in Eastern and Central Europe, or Canada: in the United States the devolution of the national community into an infinitude of subcommunities is reflected in the fracturing of American political life into special-interest blocs based upon race, class, ethnicity, gender, sexual practice, religion, or ideology. To be sure, there is much that is ominous in such developments; but if considered in light of MacIntyre’s and Sandel’s observations, such reorientation may also represent the beginnings of an historically inevitable, and perhaps not entirely unhealthy, reaction against the pathologies of the unencumbered self and the limitations of a centralized and nationalized social order.

But the negative possibilities that lurk in the disaggregation of nations and empires often seem to outweigh the positive ones, as the murderous conflicts left in the wake of the collapse of the Soviet Union make painfully clear. The Balkanization of American politics and social life along the lines of highly specific and exclusive identities may represent nothing more than a translation of the energies of radical individualism into group terms—making possible a more effective pursuit of the group’s interests within the context of a pluralistic, bureaucratic, and procedural political order, but leaving the public life of the nation impoverished, and perhaps that much further from the possibility of communities capable of sustaining healthy and enlivening differences. The debate over “multiculturalism” that rages in contemporary academe reflects a fear that the satisfactions achieved through the embrace of ethnic tribalism or other particularisms may come at an incalculable price to the common culture, upon whose cohesion all such self-conscious and hyphenate embraces depend. Do the homogeneous American “lifestyle enclaves” explored by Frances FitzGerald in her fascinating study Cities on a Hill, running the gamut from the retirement towns of the Sunbelt to the Castro District in San Francisco to the Rajneeshi religious commune in Oregon, represent a recovery of the vibrancy of community life through shared values and shared narratives? Or do they represent a ghastly final adaptation of individualism to the Weberian logic of bureaucratic specialization—a debased form of social association that is merely the unencumbered self writ large? Do they reflect the emptiness of the word “community” in contemporary American parlance, to the point that it has become an all-purpose noun whose sole purpose is to give the appearance of solidity to the adjective that precedes it?

Those questions are difficult to answer, but several observations may at least help sharpen them. First, as FitzGerald’s title implies, “lifestyle enclaves” have been a persistent feature of American history, from Massachusetts Bay to Sun City. They perhaps constitute a special case of Tocqueville’s more general observation that Americans have a propensity for forming themselves into voluntary associations, which generally combine a functional purpose with a social one, bringing men and women together for the sake of a self-identified, homogeneous community of interest. (Here again, one thinks of the phenomenal rise of the support group.) Such an association need not rely upon an authoritative person or institution to justify itself to its members, thanks to its functional character. In its ideal form, then, a voluntary association offers the prospect of a frictionless association of self and society, conceiving the latter as a receptacle into which the desires of the former may flow freely and be fulfilled, without the coercive hand of human authority intervening.

Yet no problem is more fundamental to American social character, and to the complex of issues here under examination, than the locus of authority; for a large, complex society cannot be governed by the premises of a voluntary association—and even voluntary associations must wrestle, in the end, with problems of authority, as the history of churches and reform movements amply indicates. A social order that is egalitarian, or, more important, that understands itself as such, grants little room for what is traditionally understood as authority, unless that authority is depersonalized and embodied in the social entity as such—as in the adage vox populi vox dei, or in the triumph of bureaucratic utility over patrimonialism, tradition, and charisma in the modern organization. “When the conditions of men are almost equal,” Tocqueville remarked, “they do not easily allow themselves to be persuaded by one another.” Equals, that is, do not take orders from one another. Indeed, there is no more characteristic American attitude than disdain for anyone who would arrogate the title of master.

So observed D. H. Lawrence in his idiosyncratic Studies in Classic American Literature, one of the enduringly valuable books about American life. America, he taunted, was in thrall to antiauthoritarian fanaticism; it was “a vast republic of escaped slaves,” of “masterless” souls who fancy that they can find freedom by “escaping to some wild west” wherein they can evade all constraint. But real freedom came not from such chain-rattling opposition but from inner obedience, from belonging to “a living, organic, believing community, active in fulfilling some unfulfilled, perhaps unrealized, purpose.” People are not free, he asserted, “when they are doing just what they like.” Indeed, “if one wants to be free, one has to give up the illusion of doing what one likes, and seek what IT wishes done.” Despite Lawrence’s eccentric paganism—what, after all, did he mean by IT?—and the fact that he was himself really something of a Hipster, the command is also rich with more venerable Christian overtones: who would save his life must lose it; who would be free must first submit. But submit to what?

Seek ye first the Kingdom of God, Christ had taught in the Sermon on the Mount; and it must have seemed that Lawrence was trying to ring his own psychodynamic change upon that imperative. Seek ye the authority not of the conscious ego-self, but of “the IT,” the transpersonal “deep self.” Yet if the practical meaning of that command seemed obscure and wide open to self-deception, its gravamen, particularly so far as an individualistic ethos was concerned, was not. There is a price for the possibility of authentic community, a price levied, so to speak, both up and down the scale of organization—both upon the would-be autonomous self, and upon the consolidated social order within which it presently lives, moves, and has its being. Such communities would prove unfavorable habitats for Hipsters or Organization Men, since those types depend upon a scale of organization that renders the individual both unencumbered and impotent.

But the willingness to surrender a significant portion of oneself to a social whole cannot reliably flow from a vague and uplifting desire for the warmth and supportiveness of “community-building”; it will be directly commensurate with the degree of authority invested in that entity; and that very thought tends to strike terror in the hearts of Americans. By investing authority in an obscure “IT”—in preference to, say, a Judeo-Christian God who has revealed Himself with disconcerting and inhibiting definiteness—Lawrence himself was evading moral authority in practice even as he was affirming it in theory. Such is the appeal today of so much New Age spirituality, which often serves merely to bless emotivism and dress it up in flowing robes.

Still, as a critique of American culture, Lawrence’s remarks are harder to dismiss. We tend to forget how often we allow the marketplace to function as our model for, or substitute for, moral authority. While such a model opens considerable possibilities, it proscribes others; it is a source of impersonal legitimizing mastery perfectly suited to a society of would-be autonomous, masterless men and women, in much the same sense that Sandel’s “procedural republic” can continue to function even in a vacuum of authority. We should not be deceived, therefore, into thinking that we can combine and shuffle and incorporate the virtues of various social arrangements and “traditions” at will, without having first inevitably refashioned them in the very image of our own present condition.

Every way of life, even a seemingly neutral and eclectic pluralism that tolerantly affirms a wide-open bazaar of ideas and values, has its built-in imperatives, its virtues, its vices, its codes, its taboos, its benefits, and its costs. Whether we wish it or not, we cannot avoid paying the price for our own. And part of that price, it seems, may be a reflexive, unvanquished individualism accompanied by a perpetual yearning for unrealizable forms of community—a dream of being both autonomous and connected that in the end often settles for being neither. Whether that tension is sustainable, or will instead turn out to have been transitional, giving way in due course to other forms of social organization, of course remains to be seen.

Wilfred M. McClay is Associate Professor of History at Tulane University and author of The Masterless: Self and Society in Modern America (University of North Carolina Press).
