
A well-known account of creativity sets the scene for a celebrated act of creation, and a bleak scene it was: “In the beginning … the earth was without form and void, and darkness covered the face of the abyss.” On that occasion, creativity consisted in bringing something out of nothing—a feat so remarkable that the Hebrew verb used to describe it (bara) refers exclusively to divine activity. In the English language, the words “create” and “creative” long had a similar connotation. It was not until the Renaissance that they began to be used in the way that we now take for granted, referring to the work of human minds or hands. That change in linguistic usage was just one sign of a new way of thinking about the place of man in the universe. And central to that new worldview was an exhilarating vision of human potency—a vision that we English speakers associate with Shakespeare’s ode to humanity in Hamlet: “How noble in reason! How infinite in faculty! In form and moving how express and admirable! In action how like an angel! In apprehension how like a God!”

With that heady estimation of human capacities came a certain sense of liberation from tradition, custom, and the group. Still, it was not until the Romantic era that creativity came to be identified in many quarters with individual originality, and later yet that disdain for tradition became a tradition in itself. At the extreme, in our time, the tradition of antitraditionalism would have it that homo sapiens sapiens can cancel its debts to the past, and that we too can bring wonders out of the void.

The idea that tradition is antithetical to creativity of the human sort is what I propose to examine and challenge—along with its usual underlying assumptions that tradition is necessarily static, and that the essence of creativity is originality. To anyone with a scientific bent, my project will seem to be an exercise in the obvious. For in the history of science, as Stephen Toulmin, Thomas Kuhn, and others have made clear, nearly every great advance has been made by persons (typically, groups of persons) who simultaneously possess two qualities: a thorough grounding in the normal science of their times, and the boldness to make a break with the reigning paradigm within which that normal science takes place. My concern, however, is the progress of antitraditionalism in the human sciences, where what counts as an advance, or creativity, is more contestable, and where many eminent thinkers now devote much of their energy to attacking the traditions that have nourished their various disciplines. Several of these literary legionnaires have crossed academic boundaries to join forces in a kind of holy war against “western civilization.”

I shall confine my attention here to the field with which I am most familiar, namely the law—though this may vex the spirit of Erasmus, who seems to have had a rather low opinion of lawyers, calling them “among the silliest and most ignorant of men.” Erasmus might have been amazed, however, if he could have known the degree to which lawyers themselves, and especially teachers of lawyers, would one day come to exhibit disdain for their own craft, and to disavow openly the ideals of their traditions. I say traditions, for, where Americans are concerned, there are three of them involved: the common law tradition that we inherited from England, the tradition of American constitutionalism, and the craft tradition of the profession.

The chief philosopher of the Critical Legal Studies movement, Roberto Unger, came uncomfortably close to the mark in his well-known description of the legal world he entered in the late 1960s. Most professors then, he wrote, “dallied in one more variant of the perennial effort to restate power and preconception as right. In and outside the law schools, most jurists looked with indifference or even disdain upon the legal theorists.… When we came, they were like a priesthood that had lost their faith and kept their jobs. They stood in tedious embarrassment before cold altars.” To the extent that Unger’s portrait captures something of the current self-understanding of lawyers, it points to a state of affairs that cannot help but have implications for our law-saturated society. Thus it seems worthwhile to inquire into the extent, causes, and consequences of a widespread loss of confidence among lawyers themselves concerning legal traditions that have been intimately bound up with our historic experiment in ordered liberty.

The legal tradition we inherited from England is almost a textbook example of what Alasdair MacIntyre has called a living tradition, one that is “historically extended” and “socially embodied,” whose development constantly points beyond itself.

In taking up the tradition of the common law, we are faced first with a matter of terminology. The term “common law” refers to the type of law, and lawmaking, that historically distinguished the English legal system from the legal systems of continental Europe. The common law is an evolving body of principles built by accretion from countless decisions in individual lawsuits. Because it emerges from practice rather than theory, its principles are highly fact-sensitive, and not too general. Continental law, by contrast, had, as they say, the smell of the lamp—it was developed by scholars, and was further rationalized and systematized by comprehensive legislative codifications.

What needs to be emphasized here are two remarkable features of the common law tradition. First, its continuity. Over centuries that saw the rise and fall of feudalism, the expansion of commerce, and the transition to constitutional monarchy, the common law of England adapted to each new circumstance without abrupt change or any root-and-branch reorganization of the sort represented by the European codifications. Statutes in England, for the most part, were like patches here and there against the background of the common law, and judges tried to construe them so as to blend them, as far as possible, into the fabric of the case law. Thus in 1894 the great English legal historian F. W. Maitland could look back on centuries of legal evolution and say: “When we speak of a body of law, we use a metaphor so apt that it is hardly a metaphor. We picture to ourselves a being that lives and grows, that preserves its identity while every atom of which it is composed is subject to a ceaseless process of change, decay, and renewal.” Lest the reader surmise that that sort of thinking was confined to armchair types, or to the other side of the Atlantic, consider that at just about the same time in Boston, a judge named Oliver Wendell Holmes, Jr. was likening the law to a “magic mirror” in which a suitably trained observer could see a “mighty princess” eternally weaving into her tapestry “dim figures of the ever-lengthening past.”

The second feature of the common law that must be stressed is the distinctive methodology that enabled it to adapt and grow while maintaining its continuity. To try to describe that method is a bit like trying to describe swimming or bicycle riding, for it consists of a set of habits and practices that are only acquired by doing. But the conventional understanding goes something like this: the common law judge is supposed to be a virtuoso of practical reason, weaving back and forth between facts and law, striving not only for a fair disposition of the dispute at hand, but to decide each case with reference to a principle that transcends the facts of that case, all with a view toward maintaining continuity with past decisions, deciding like cases alike, and providing guidance for parties similarly situated in the future. It was those sorts of operations that Lord Coke had in mind in the seventeenth century when he famously said: “Reason is the life of the law; nay, the common law itself is nothing else but reason.” To Coke, “reason” did not mean deductive reason (as it did for Descartes), nor self-interested calculation (as it did for Hobbes). It was, rather, an extended process of collaboration over time. Or, as he put it himself, it was a kind of group achievement, “gotten by long study, observation, and experience, … fined and refined over centuries by generations of grave and learned men.…”

To be a traditionalist in such a tradition seems pretty clearly not to be frozen in the past or mired in the status quo, but rather to participate, as MacIntyre puts it, in a community of intense discourse about what it is that gives the tradition in question its point and purpose.

It is a fair question whether one can really speak of creativity, as distinct from mere inventiveness or successful problem-solving, in the common law tradition. On this point, I would like to suggest that the American Founding was a classic instance of the kind of creativity that emerges from, and is deeply rooted in, the very tradition that it irrevocably transforms. The Declaration of Independence, the Constitution, the Federalist Papers, and the landmark early decisions of the Supreme Court could only have been produced by statesmen who were steeped in the English legal heritage. True, the Founders varied considerably in the degree of regard they had for that tradition. But all of them fully understood the role the common law had played in safeguarding the lives, liberties, and property of Englishmen through civil war and social upheaval. When the American colonists came into conflict with their royal governors, they claimed that same common law as their birthright. They broke with the mother country, but they did not reject their own past. Rather, their English legal inheritance was taken up, transformed, and made into the basis of a new order.

Another type of legal creativity—the everyday creativity of judges and lawyers—is of a more modest sort, but should not be underrated. It is the creativity of the artisan rather than the architect, but the same observation holds regarding the importance of being grounded in tradition for any significant advance. Think of the preventive law handed down through generations of practitioners—the well-wrought agreements, bylaws, contracts, deeds, leases, wills, and trusts that at their best aid human beings to carry on mutually beneficial relations with a minimum of friction, to make reliable plans for the future, and to avoid unnecessary disputes by anticipating and providing for contingencies. What lawyer could responsibly draw up such a document without consulting the experience of bench and bar embodied in the humble formbook? And when preventive law fails, creativity of a modest sort is involved in dispute settlement as well.

At this point, an inquisitive person might ask why, if the common law tradition had such a robust capacity for creative development, it has fallen into disrepute among many of its inheritors.

It seems relevant, to begin with, that the tradition is one whose participants were never much given to introspection. Legal knowledge for most of our history was, after all, passed on through apprenticeship. And because its methodology was latent, our continental friends, with their strong suit in theorizing, often assumed that we had none. Erasmus was characteristically blunt on this subject: “The study of English law,” he said, “is as far as can be from true learning.” And, in a similar vein, Tocqueville, on his visit to the United States in 1831, remarked on how “strange” it was that such a legalistic society had not yet produced “any great writers inquiring into the general principles of the laws.”

A mere decade after Tocqueville penned those words, however, the man who was to bring fancy theory to American law was born in Boston, Massachusetts. And as fate would have it, he devoted much of his prodigious talent to debunking the roles of tradition and reason in the law. It would be hard to exaggerate the impact of Oliver Wendell Holmes, Jr. on American legal culture. His life was literally the stuff of which legends are made. He was the gifted son of a famous father, and a genuine Civil War hero, wounded at Ball’s Bluff, Antietam, and Fredericksburg. By the time of his death in 1935, at age 93, he had distinguished himself in every role the legal profession had to offer. As a young practitioner in a busy Boston firm, he spent his evenings writing a treatise on the common law that revolutionized the way American lawyers write and speak about law. He then taught briefly at Harvard Law School before his appointment to the Massachusetts Supreme Judicial Court in 1882. After twenty years on that bench (three of them as Chief Justice), he was named, at the age of 61, to the United States Supreme Court, where he served for another 29 years. He left his stamp on nearly every area of law—as much through his fluent literary style as through his innovative theories or his decisions and dissents in landmark cases. Where legal theory is concerned, Holmes’ writings set the scholarly agenda for the entire twentieth century: legal realism, pragmatism, sociological jurisprudence, law and economics, and critical legal studies are all little more than elaborations of themes announced by Holmes. As a leading Legal Realist wrote in the 1930s, “Holmes was the daddy of us all.”

This “daddy,” though, may have inadvertently hindered his progeny from building on his own achievements. Consider how he taught them to view tradition. Wherever one looks in the law, Holmes complained (in 1897, in what is still the most widely quoted law review article ever published), one sees “tradition” getting in the way of “rational policy.” But the “tradition” against which he railed was just the dead hand of the past, as witness another well-known statement: “It is revolting to have no better reason for a rule of law than that it was laid down in the time of Henry IV.”

This reductionist assault on tradition was combined with an attack on traditional legal reasoning. The opening paragraph of his 1881 treatise, The Common Law, contained a thinly veiled challenge to Lord Coke’s dictum that “Reason is the life of the law.” Holmes wrote: “The life of the law has not been logic: it has been experience.” Now, one interesting fact about that famous aphorism is that even at the height of nineteenth-century legal formalism, American lawyers were not making such an extreme claim as that the life of the law was logic. As for Coke, he would have agreed wholeheartedly with Holmes on the priority of experience over logic. A second interesting fact about Holmes’ formulation is that it closely resembles a statement in a German legal treatise that we now know Holmes checked out of a Boston library in 1879. I mention this not to impugn Holmes’ originality but to call attention to a point that is more important: the criticism in question was directed by its German author at the rigid conceptualism of scholars who did believe that logic was the life of the law. The German academics who were then engaged in drafting the German Civil Code of 1896 envisioned that code as ideally possessing “logical closedness” (logische Geschlossenheit). It was highly inappropriate for Holmes to apply the same criticism to Coke, or even to more formalistic nineteenth-century writers on the judge-made, open-textured common law.

But Holmes was a man on the move. With the same fast shuffle that he used to convert “tradition” into fossilized “history,” he reduced “reason” to dry “logic”—a collection, as he put it, of syllogisms, axioms, and deductions. That, as they say, took the whiskey out of the highball. And there was more. For Holmes announced, as though it were a discovery, what lawyers have always known: that there are many times when the law is silent, obscure, or incomplete, and that judges often do not simply find and apply the law, but actually exercise a limited lawmaking function. In a passage meant to shock, Holmes offered his theory of what judges were making law from. Expediency, opinion, and even prejudice, he said, had all played a greater role “than the syllogism” in fashioning the rules by which we are governed. One might think that it would not take a Sherlock (or an Oliver Wendell) Holmes to discern that judges, being human, display the usual human flaws in the exercise of their rational faculties.

But Holmes pressed the point further. Revisiting the subject at the height of the blunt macho style that was his trademark, Holmes told a group of law students: “You will find some text writers telling you that [law] … is a system of reason.” That, according to the great man, was obfuscation. Laws were nothing more or less than “commands” backed up by the armed might of the state. The aim of legal study was simply the science of prediction, prediction of “where the axe will fall.” And in words that are still engraved on the mind of every law student, he said that if you want to know the law, “you must look at it as a bad man does.” He exhorted his youthful audience to use “cynical acid” to wash away all the moralizing language of right and wrong so that they could see the law as it truly is. But don’t think I mean any disrespect for the law, he hastened to add, “I venerate the law.” Why such piety toward mere command? Because “it has the final title to respect that it exists.”

In the same speech, later published under the title The Path of the Law, Holmes said that his “ideal” was to put the law on a more modern and scientific basis through clear thinking about means and ends, costs and benefits. “Reason” might be out, but what Holmes approvingly called “rationality” was in. Accordingly, in a passage that has become sacred scripture in some quarters, he advised that every lawyer should acquire a knowledge of economics, for the “man of the future is the man of statistics and the master of economics.”

By thus cabining tradition and reason, Holmes helped to prepare the way for the carnival of twentieth-century American legal theory. For many of his successors, tradition took on a pejorative sense—it became the debris of old errors, power relations, and prejudices. His critique of reason laid the foundation for the fact-skeptics and the rule-skeptics of the 1930s, who called themselves Realists, and for their successors, the critical theorists who came on the scene in the 1960s.

With hindsight, though, it is remarkable how deeply Holmes himself drew from the springs of the very tradition he was engaged in disparaging. In this respect, he bears a striking resemblance to those of his contemporaries whom Hilton Kramer calls the great “tradition-haunted” artists—Picasso and Matisse, Eliot and Yeats, Schoenberg and Stravinsky. With his vision of the “mighty princess,” with his mastery of his craft, and with his vaulting ambition, Holmes was a Picasso-like figure—larger than life, boldly iconoclastic, yet mindful of his lineage and of the continuity of culture. The Path of the Law was his Les Demoiselles d’Avignon.

Now what is very important to acknowledge at this point is that Holmes was far ahead of most of his contemporaries in his perception that the common law tradition was in a period of crisis. And he was right on the mark in his understanding of the way in which the American constitutional tradition represented an important break, as well as continuity, with the common law tradition. To borrow Thomas Kuhn’s terminology again, trouble spots had appeared in the reigning paradigm, and the normal legal science of the turn of the century was not handling them well.

What was the nature of the crisis? The Anglo-American legal system, not for the first time in its long history, was confronted with major social, economic, and political changes, but this time they were occurring with unprecedented rapidity. I will mention just three major sources of difficulty.

Consider first that when Holmes was a young lawyer in the 1870s, legislatures had begun producing a new type of statute—primitive regulatory legislation, much of it addressed to conditions in factories. Those whose interests were adversely affected by these laws took their complaints to the courts, with the result that the Supreme Court embarked on its first sustained adventure with the power of judicial review, a power that it had possessed for nearly a century, but which it had exercised sparingly. The behavior of the Supreme Court and other courts in that period (striking down much early social legislation as infringing on economic rights) is now frequently treated in law school classes as showing that the judiciary was in the service of the dominant classes. But there was another dimension to the story. When late-nineteenth-century judges entered the still relatively uncharted areas of statutory interpretation and constitutional review, they really did not know quite how to handle the new situation. It is helpful to keep in mind that as late as 1875, nearly half of the United States Supreme Court’s case load was still pure common law litigation. By 1925, however, statutes figured importantly in all but about 5 percent of the cases. Most judges during those years of transition tended to proceed in the way they knew best—by falling back on their habitual practice of construing enacted law (including the Constitution) in such a way as to blend in with, rather than displace, the common law background where, as it happened, freedom of contract was ensconced as a leading principle. In a series of famous dissents, Holmes, to his credit, tried to point out to his fellow judges that the rules of the game had changed in 1787. But that point seldom got across until the 1930s, and even then it was not fully absorbed.

A second source of strain on customary ways of doing things, down in the capillaries of the legal system, was the rapid development of industry and commerce. Pre-industrial property, tort, and contract principles were often poorly suited to the new problems generated by urbanization, industrialization, and the rise of the large business corporation.

Third, by the end of Holmes’ long life in the 1930s, the American experiment had moved into yet another new phase as the federal government vastly expanded its activities through legislation which the Court, in a stunning about-face, upheld against constitutional attack. Among the New Deal lawyers who had broken ground by producing the Social Security Act, the securities laws, and the national labor relations legislation, there were some eminent academics. They called attention to an urgent problem: the need to begin the study of legislative drafting and interpretation in the nation’s law schools. Remarkably, however, to this day that call has gone largely unheeded.

In sum, it seems fair to say that the stresses and strains on the legal system that appeared during Holmes’ lifetime called for a high degree of juristic skill, imagination, and creativity. How did the legal profession respond?

To begin with the obvious, lawsuits have to be decided. Judges on the front lines cannot decline to render a decision just because the problems brought to them are new and the law is unclear. So judges in state courts duly began to try to adapt the principles of pre-industrial personal injury law, face-to-face sales law, and agrarian land law to the circumstances of an industrialized and urbanized society. Often expressing their impatience with the failure of legislatures to resolve the larger policy conflicts implicit in that enterprise, some of these judges moved boldly and openly into rule-making and policy-making. The purported discovery that judges had always engaged in “creative” activity was taken by some as legitimating a more ambitious concept of the judicial role, and by a few as a sign that all bets were off. Contrary to what many people believe, the Warren Court of the 1950s did not initiate, but simply took to new heights, this movement by judges into areas where their legitimacy is weakest—finding broad social and economic facts (as opposed to the facts in the case before them) and adjudicating between conflicting and competing interests in society (as opposed to resolving the particular dispute at hand).

What about the call of scholarly New Dealers like James M. Landis and Felix Frankfurter for more attention to the study of legislation? Non-lawyers may be surprised to learn that, even now, American law schools devote practically no systematic attention to the drafting or interpretation of statutes, and very little to empirical study of how various regulatory arrangements work out in practice. In the spring of 1992, a committee appointed to review the curriculum at Harvard Law School reported that Harvard (like most other law schools) was still teaching the basic required first-year program “almost without regard to the coming of the regulatory state, and without recognition that statutes and regulations have become the predominant legal sources of our time.”

An interested citizen, learning of this state of affairs, might be forgiven for wondering what American law schools have been doing all this time. Part of the answer is that prosaic regulatory law never really had a chance once constitutional law became the “glamor subject” of legal education in the 1950s and 1960s. The constitutional law that took center stage, moreover, sloughed off its old preoccupations with federalism and the separation of powers to concentrate its main attention on the court-centered jurisprudence of the rights revolution. The old, instinctive preference for judge-made over enacted law that had shaped constitutional interpretation at the turn of the century enjoyed an Indian summer among majorities on the Warren and Burger Courts as they embarked on this second exciting adventure with judicial review. And that same preference persists today among many law professors who teach that the Constitution is just a “text” whose various provisions are mere starting points for freewheeling judicial development—as if the people of the United States had not established a regime that places important limits on both judicial and legislative lawmaking.

Meanwhile, a new generation of tradition- and reason-bashers appeared in the legal academy. The law-and-economics school took up Holmes’ “ideal” of instrumental rationality, while the critical theorists gave his skepticism a new postmodern twist with the aid of French literary criticism and neo-Marxist social thought. It is a big step, for example, from observing that there are certain leeways inherent in fact-finding and rule-application, to asserting that there is no such thing as a fact and that all rules are radically indeterminate and manipulable. It is another major leap from being realistic about the difficulties of disinterested decision-making to a condemnation of the entire legal system as fatally tainted with racism, sexism, and other forms of hegemony. And there is a world of difference between the lawyerly reformism of the New Dealers and the view that law is nothing more than concentrated politics—between, you might say, Felix Frankfurter and the Frankfurt School.

Noticeably less interested than Holmes and the Realists in law as such, the “crits” have borrowed the main elements of their critical theory from France, Italy, and Germany—countries that had never had experience with anything like the common law, where reason too often meant raison d’état, and which historically have had difficulty establishing regimes in which the rule of law is respected. But many of the critical scholars had absorbed so little “normal science” that they could not see the problems in transferring a critique of continental law to their own system. In the dark, all law looks the same.

The work of many of these legal avant-gardists thus reminds one of Marx’s dictum that history repeats itself, first as tragedy, then as farce. They have added only marginally to the critical insights of their “tradition-haunted” predecessors, for their wholesale rejection of the past has left them with a rather flimsy conceptual apparatus. By opting out of the ongoing argument embodied in living legal traditions, they are left to commune disjointedly with one another or with themselves. Many of them, like their counterparts in the art world, are attracted by violence and destruction. One legal Dadaist, for example, has written exuberantly in the Yale Law Journal that “trashing is fun.” “Context-smashing” is central to the program advocated by Unger, who criticizes “eighteenth-century constitutionalism” for the drag that its checks and balances impose on the pace of social change. He envisions future governments with “ministries of destabilization” to promote ferment and creative disarray.

After suggesting a comparison between avant-gardism in the law and in the arts, I must admit that I am more uneasy about the cultural effects of this phenomenon in law than in, say, painting. I also believe that, unfortunately for us Americans, bashing legal traditions is a graver matter for our society than it is for the continental countries from which American critical legal theorists have drawn their inspiration. More homogeneous nations have many sources of social cohesion other than their legal systems: in shared customs, history, religion, song, and poetry. But for a people as diverse as ours, the law has had to bear more cultural weight. In the United States, for better or worse, law has become a principal carrier of those few values that our heterogeneous citizenry holds in common—freedom, equality, and inclusiveness of the community for which we accept common responsibility. Law is one of the chief means through which we order our lives together. The attack on reason and tradition in law thus strikes at the heart of our version of the democratic experiment.

I would not wish to be understood to be saying that the legal profession has entirely capitulated to the tradition of antitraditionalism. The bar still contains a fair number of men and women who take pride in their craft, and who endeavor to follow Abraham Lincoln’s advice to lawyers to be peacemakers among neighbors. On the bench there are still many independent judges who are not afraid to exercise judgment, and who endeavor to do so modestly and objectively. Such judges, as Paul Carrington has observed, are in an important sense models for the functioning of the rest of government, beacons by which public servants of all sorts “are led in the direction of restrained adherence to principle.” Even in the legal academy, one can still find an occasional figure like Archibald Cox, who dared to speak recently of his “deep belief that judges and academic scholars have an obligation to put aside all private predilections, commitments, self-interest, using only the most disinterested and detached reason they can bring to bear.”

But the legal profession is changing. Many practitioners are now more accurately described as businessmen and women than as members of a liberal profession. And certain other developments in the practice cannot be entirely unrelated to the atmosphere in the academy. For the teaching that all rules are indeterminate includes the rules of ethics that were once thought to impose limits on what a lawyer might or might not do. And one may surmise too—though this would be hard to prove—that years of portraying legal reasoning as mere window dressing may have contributed to a certain neglect of craftsmanship.

All this is beginning to sound as though, in the legal world, darkness is once again creeping over the abyss, where our old enemy, chaos, lies waiting. Certainly our three American legal traditions—each in its way representing a human effort to hold the chaotic forces of power, passion, and self-interest in check—are not enjoying their finest hour. If I were to try to imagine one of today’s students updating Roberto Unger’s description of the elite academy, it might go something like this: “When we came, most professors had ceased even to pretend that law was anything other than power and preference. Their legal apostasy often coexisted, however, with a fervent devotion to this or that new cause or dogma. Disdaining their ancestors, they did not hesitate to live off the inheritance amassed by the labor of others. Turning their backs on the laity (who still naively looked to law for justice), they decked their altars with mirrors—into which they gazed long and attentively, self-made men worshipping their creators.”

The bleakness of this picture is mitigated, however, by the reminder that other legal dark ages have passed away. After a copy of the Corpus Juris Civilis was rediscovered at Amalfi in the eleventh century, it took only a few hundred years for Europeans to build up an impressive legal system.

Then, too, in theology—as Avery Dulles has pointed out—the sources continue to radiate great liberating power. They provide each new generation with “a platform from which we can see and judge the present differently from the way in which the present sees and judges itself.” There is an analogous platform in law, but a shakier one. What provides it, I believe, is not some particular source or sources, but the distinctive common law method of reasoning. The platform is shaky because common law reasoning has to start from premises that are doubtful or in dispute, and because it does not aim at certainty, but only at determining which of opposing positions supported by strong evidence and convincing reasons should be accepted. We may concede to Holmes & Co. that the open texture of this sort of reasoning does permit bias to creep in, and that its reliance on precedent can preserve not only the wisdom, but also the sins and ignorance and power relations of the past. We must admit as well that its conclusions are flawed, due to our own limitations and the limitations of those upon whose accomplishments we build.

But when all that is conceded, the fact remains, as Aristotle pointed out long ago, that dialectical reasoning is the only form of reasoning that is of much use in “the realm of human affairs,” where premises are uncertain, but where, though we can’t be sure of being right, it is crucial to keep trying to reach better rather than worse outcomes. It is time for lawyers and philosophers alike to recognize that common law reasoning is an operating model of that dialectical process, and that its modest capacity to guard against, and correct for, bias and arbitrariness is no small thing. Over time, the recurrent, cumulative, and potentially self-correcting processes of experiencing, understanding, and judging enable us to overcome some of our own errors and biases, the errors and biases of our culture, and the errors and biases embedded in the data we receive from those who have gone before us. As Benjamin N. Cardozo once put it, “In the endless process of testing and retesting, there is a constant rejection of the dross.”

And so, by a long and circuitous route, I come back to the proposition with which I began: that human creativity is inescapably dependent on what has gone before. This is not a very remarkable proposition. Perhaps what ought to seem remarkable is only the extent to which so many practitioners of the human sciences seem to want to ignore it.

As for what might emerge if legal theorists were to turn back to law, and to consciously appropriate their own tradition of open-ended, dialectical, probabilistic reasoning, who knows? Creativity is mysterious not only in its origins but also in its outcome. An Eskimo creation myth makes this point in a charming way. At the beginning of the world, the story goes, Raven the Creator planted a seed. From the seed came a vine, and on the vine in time there grew a pod. One day the pod ripened and fell to the ground and out of it climbed the first man. Raven, the great black bird, walked round and round the man, inspecting him from top to toe, and finally asked, “What are you?” “How should I know,” said the man, “you were here first.” “Well,” said Raven, “I planted that seed, I created that vine and that pod, but I never thought something like you would come out of it!”

Mary Ann Glendon, a member of the Editorial Advisory Board of First Things, is Professor of Law at Harvard University. Her eight books include Rights Talk (1991), The Transformation of Family Law (1989), and Abortion and Divorce in Western Law (1987), which won the Scribes Book Award of the American Society of Writers on Legal Subjects. In 1991 she was elected President of the International Association of Legal Science. This essay originated as the seventh annual Erasmus Lecture, sponsored by the Institute on Religion and Public Life.