
The most practical man of business is usually the slave of a defunct economist, John Maynard Keynes observed seventy-five years ago, in reference to the theories he proposed to overthrow; and this judgment applies with force to his continuing grip on the minds of the world’s policy makers. Public debate about responses to the economic crisis remains circumscribed by Keynes’ vocabulary. The Group of Twenty remains locked into Keynesian orthodoxy, debating whether governments should pursue “austerity,” cutting spending to reduce deficits and debts, or “stimulus,” offering a “stimulative” monetary policy while increasing spending. None of the leaders of the world’s top economies has uttered a peep about creating incentives to rebuild equity and revive the entrepreneurial spirits of their people, which is the only way to stop debts from compounding on public and private balance sheets. In the years following the publication, in 1936, of his General Theory of Employment, Interest, and Money, Keynes admitted that his new language was obscure and often misleading and that his analysis stood in dire need of rethinking. He even called his followers “fools.” His early death, in 1946, allowed that obscurity to ossify into academic orthodoxy, and it distorts public discourse to this day.

Why has his rule lasted so long? How did so many clever people become the slaves of this defunct economist? His language and analyses were entirely wrong, but during the Great Depression of the 1930s, his call for governments to step in quickly as financial intermediaries when private intermediaries were shutting down was the correct one. And he correctly fought a conventional wisdom that then favored balanced budgets as a response to the crisis. The crash of the equity market and the banking system had wiped out most of America’s wealth. Institutions and households lacked the means to invest and restart the economy, until the government provided it in the form of war spending after 1939. When individuals and corporations lack the ability to issue debt, government becomes, by default, the financial intermediary. An increase in government debt in such exceptional cases constitutes an increase in wealth and can pave the way back to healthy capital markets.

In 1790 Alexander Hamilton did just that when he persuaded the nascent United States to fund the Revolutionary War debt at par, restoring confidence in the country’s currency and establishing fiscal and monetary discipline. Later, future Nobelist Robert Mundell showed that an increase in government debt may sometimes represent wealth. It happens when a well-funded public debt (to borrow Hamilton’s term) is supported by future prosperity, which implies both more creation of assets and more tax revenues. Tax cuts stimulate growth and produce an increase in wealth when the rise in tax revenues exceeds the interest that the government pays on the bonds it issued to cover the initial loss in revenue. This insight underlay the “supply-side economics” of the Reagan administration, unfortunately reduced, even by some of its backers, to simplistic caricature.
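
Stated schematically (a simplified formalization of the sentence above, not a formula taken from Mundell or from the original text): if the static revenue loss of a tax cut is financed by issuing bonds of value $B$ at interest rate $r$, and faster growth lifts annual tax revenues by $\Delta T$, the cut adds to wealth when

$$\Delta T > rB,$$

that is, when the induced revenue gain more than covers the debt service on the bonds issued to bridge the initial shortfall.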

Yet, as Mundell observed, curing the stagflation of the 1970s required fighting inflation with tight monetary policy while promoting creation of assets through tax incentives. That is just what Paul Volcker’s Federal Reserve and the Reagan administration did during the early 1980s, launching a quarter century of noninflationary prosperity. Keynes taught that all forms of aggregate demand were equivalent, so that fiscal and monetary policy could do no more than increase or reduce short-term spending power; his framework had nothing to say about restoring the conditions for rebuilding a country’s assets.

Politicians, however, liked the Keynesians’ claim that governments could solve all the problems of the economy by taxing, spending, and manipulating interest rates, without raising difficult subjects such as the accountability of government bureaucracies or fluctuating exchange rates. Whether the politicians believed in the nonsense is irrelevant: It was convenient for them to behave as if they did. And economists were pleased to have a theory that elevated their status to the modern equivalent of court astrologers.

After the 2008 crash, a generation of academics learned in the minutiae of the 1930s Depression, including Federal Reserve Chairman Ben Bernanke and White House adviser Lawrence Summers, were entrusted with the execution of Keynes’ old prescriptions. The great economic crisis that began in 2007 and has not yet abated was supposed to have been Keynes’ decisive triumph—the crisis in which governments actually did what he urged them to do during the Great Depression, and the proof that an elite of puppeteers could make the innumerable actors in economic life dance into recovery.

After two years of the largest peacetime deficits the modern world has ever known, and the lowest global level of interest rates in history, Keynes’ remedy has clearly failed. The world economy has not recovered. Instead, the locus of crisis has shifted to the balance sheets of the governments whose spending powers were supposed to have been the solution to the crisis.

Keynes’ idea is simple. In fact, it is simple by construction, for it focuses on the very short term within a closed economy. If consumers won’t spend, the government will spend for them; if businesses won’t invest, the government will invest for them; and if investors won’t take risks, the central bank will reduce the yield on low-risk investments to almost nothing. The risk-taking of entrepreneurs, the cleverness of inventors, the skills and motivation of the workforce, the competitiveness of industries—all the granular reality of a dynamic society, chugging along through trial and error—vanish into Keynesian aggregates like gross domestic product, price indices, productivity, and so forth. Behind all the technical language stands the assumption that bureaucracies, with no business experience whatsoever, can somehow make wise decisions about allocating capital—and do so quickly. These gross simplifications take on the aura of academic theurgy when packaged into seemingly complex mathematical models that occult their ridiculous assumptions.

No economic recovery will ever occur until governments bury, once and for all, the Keynesian concept of aggregate-demand management and instead revive the incentives to rebuild risk-capital financing by mobilizing their countries’ entrepreneurial talent.

The policy debate is a blame game, but one played by blind men with an elephant. Some say that if the Federal Reserve had not kept interest rates so low for so long, there would have been less credit expansion and fewer defaults. Others say that if the Fed had paid more attention to the external value of the dollar than to price indices or GDP, it would have suppressed the developing bubble in home prices. Still others argue that without official support for subprime securitization, the vast subsidies provided by government-sponsored mortgage funders, and the monopoly position of the heavily conflicted rating agencies, the securitized debt bubble might have been contained. Yet others argue that the proprietary trading focus of deposit-taking institutions made the payments system vulnerable to panic.

All these observations are true, and all of them are misleading, for the crisis arose not from any of these errors as such but rather from the Keynesian mindset of policy makers and regulators that prevented them from identifying these problems before they combined to threaten the financial system and the long-term health of the economy.

A decisive issue was the role of “shadow banking”—that is, the expansion of unregulated derivatives such as credit default swaps and collateralized debt obligations (CDOs). Two trillion dollars of CDOs, which hold assets in trust and pay income to investors with varying degrees of seniority, were issued between 2004 and 2008. The tranche structure allowed rating agencies to classify most of the securities a CDO issued as virtually default-proof, which in turn allowed banks to hold them with drastically less capital. In short, derivatives vastly increased the leverage in the banking system.
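
As a stylized illustration of that capital arithmetic (a sketch only: the 8 percent charge and both risk weights below are assumptions chosen for the example, not figures from the text or the actual regulatory formula), compare the capital a bank must hold against $100 of mortgages kept on its books with the same $100 repackaged into a senior tranche rated near default-proof:

```python
# Stylized sketch, not the actual Basel formula: how re-rating the same assets
# shrinks required capital. The 8 percent charge and both risk weights are
# assumptions chosen for illustration.
CAPITAL_RATIO = 0.08  # capital required per dollar of risk-weighted assets

def required_capital(exposure: float, risk_weight: float) -> float:
    """Capital held against an exposure at a given risk weight."""
    return exposure * risk_weight * CAPITAL_RATIO

loans_on_books = required_capital(100.0, 1.00)   # $100 of mortgages held outright
senior_tranche = required_capital(100.0, 0.20)   # same $100 as a tranche rated near default-proof

print(f"capital against whole loans:     ${loans_on_books:.2f}")   # $8.00
print(f"capital against senior tranche:  ${senior_tranche:.2f}")   # $1.60
print(f"leverage freed by the re-rating: {loans_on_books / senior_tranche:.0f}x")  # 5x
```

Under these assumed weights the bank can carry five times the exposure on the same capital, which is the sense in which derivatives vastly increased the leverage in the banking system.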

“Shadow banking” made short work of the old Keynesian definition of monetary “aggregates.” It was no longer possible to speak of “aggregate” demand generated by a homogeneous sort of money controlled by the central bank through traditional open-market operations. Unregulated derivatives decoupled the real world of banking from the imagined one of aggregate monetary policy. Policy makers kept talking about the “aggregate,” homogeneous kind of money, devising policies in its terms, when they should have concentrated either on building the institutions needed to manage the different kinds of domestic moneys coming into existence or on prohibiting them outright. “Shadow banking” would never have gained the significant role it did had people not believed that the Federal Reserve would be there to supply the needed liquidity in the event of a run on the shadow-banking system, too, and that “every money was created equal.”

The panic of 2008 (in distinction from the still-ongoing crisis) occurred when people realized that this was not the case. The run on the shadow-banking system contaminated the payment system, and without a functioning payment system, no economy can survive. Economies can survive without capital markets financing long-term investments, although not very well—as communism did for most of the last century. Economies also can survive for a few months if capital markets that just replenish inventories suddenly freeze. But once the payment system collapses, panic ensues, and the economy shuts down. Hyperinflation is the usual result, although, as the events of 2008 showed, a collapse of the payment system also can result in a run against money-market funds and other deposit-taking institutions, and the effects can be just as fearful.

The panic happened, in part, because central bankers relied for decades on the Keynesian view that “money” is a homogenous aggregate that central bankers controlled. While they were acting as if this were the case, the financial industry in effect printed its own money. The problem was neither obscure nor insoluble. If the Federal Reserve had only paid attention to entrepreneurial changes in the financial system rather than operating under the illusion of “aggregates,” it could have imposed constraints on the shadow banking system with the same effect that constraints have on ordinary deposit banking.

The derivatives disaster offers one example of how poorly the Keynesian framework, with its dependence on aggregates, has served economic policy. Another example is the misleading measurement of the so-called inflation rate, which continues to lull the central bank into complacency about the long-term impact of zero interest rates.

The cost of housing constitutes nearly a third of the consumer price index (CPI). Measuring that cost has always been problematic. But the errors compounded during the last decade in particular, as houses ceased to be mere domiciles and instead became a new and putatively liquid asset class. This happened because laws, regulations, and government-backed entities such as the federal mortgage insurers subsidized the housing market directly or indirectly. These subsidies drastically altered the nature of this asset class as well as the meaning of “inflation rates.” Initially, scholars and politicians rationalized the subsidies, claiming that home ownership stabilizes communities, gives people a stake in the system, and provides bootstrap capital for business start-ups. These observations were accurate as long as the terms “ownership” and “equity” kept their meaning. In fact, the subsidies encouraged homeowners to reduce their equity by taking on as much leverage as possible. By 2007, homeowners enjoyed ownership only on paper, holding a small margin of equity against a great deal of debt.

These distortions in the housing market played hob with the inflation indices, as Americans bought homes in the expectation of flipping them. Rents meanwhile remained stable, and rent is what the Bureau of Labor Statistics measures to determine the cost of housing. As a result, the housing component in price indices—30 percent of the index—barely changed. To many observers, including at the Federal Reserve, the stability of this price index suggested that there was no excessive credit creation in spite of the rapid increase in home prices. The resulting false readings led to wild misjudgments on the part of the Greenspan and later the Bernanke Federal Reserve Board, neither of which believed that lax monetary policy was causing excess credit creation.
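
A minimal sketch of that index arithmetic (the weights and price changes are illustrative assumptions, not Bureau of Labor Statistics data): because the roughly 30 percent housing weight is filled with rents rather than home prices, even a double-digit rise in home prices leaves the measured index almost untouched.

```python
# Illustrative only: weights and price changes are assumptions, not BLS figures.
components = {
    # name: (index weight, assumed one-year price change)
    "housing, measured as rent": (0.30, 0.02),  # rents stayed roughly flat
    "everything else":           (0.70, 0.02),
}
home_price_change = 0.12  # assumed rise in actual home prices, invisible to the index

measured_inflation = sum(weight * change for weight, change in components.values())
print(f"measured CPI inflation:             {measured_inflation:.1%}")  # 2.0%
print(f"home-price inflation never counted: {home_price_change:.0%}")   # 12%
```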

The 35 percent fall in the trade-weighted dollar index and the quadrupling of the gold price between 2002 and mid-2009 should have made clear that something was woefully amiss, although the CPI failed to flash an alarm signal. Whatever one thinks about the gold price, that the market was willing to pay four times as much for a traditional inflation and devaluation hedge should not have gone unnoticed by the somnolent wizards of Constitution Avenue.

The collapse of the credit expansion raises the prospect of deflation, and the Keynesian elite now proposes to ward off this danger by returning to the inflationary policy that brought about the crisis in the first place. The International Monetary Fund’s chief economist, Olivier Blanchard, offered what he called a “bold innovation” in February 2010, proposing that central banks pursue 4 percent inflation. Evidently Blanchard thinks that people will happily accept a 22 percent reduction in their wages over five years and a 48 percent reduction over ten years. Professional deformation on this scale attests to the triumph of Keynes over common sense. The reasoning of proponents of such policies—Paul Krugman advocates even higher inflation rates—is that the fear of inflation would lead people to spend money before its purchasing power declined. The Keynesians did not stop to ask how Americans could begin a spending spree after the colossal wealth destruction of the past several years, just before the largest retirement wave in American history.
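
A back-of-the-envelope check of those figures, reading them as the compound rise in the price level under 4 percent annual inflation: $(1.04)^{5} \approx 1.22$ and $(1.04)^{10} \approx 1.48$, so prices rise by roughly 22 percent over five years and 48 percent over ten, which is the ground a fixed wage falls behind the price level over those horizons.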

Perhaps the wildest distortion of all occurs in the measurement of the central aggregate in the Keynesian model, the gross domestic product.

Earlier this year, the world was shocked, shocked to find that a series of Greek governments had lied outright about the size of the country’s GDP, national debt, and other aggregate measures. As early as 2006, however, Europe’s statistical bureau, Eurostat, had threatened Greece with lawsuits over its statistical fabrications. Hungary now is facing a similar problem, the just-elected government acknowledging that the official numbers were little more than statistical illusions. What does this imply about aggregate figures published with such solemnity by national statistics bureaus?

This is nothing new. Back in 1987, Italy announced the year of il sorpasso, that is, “the year we leapfrogged you.” The Italian National Institute of Statistics (ISTAT), the government’s statistics office, arbitrarily added an extra 18 percent to its estimate of Italy’s “real national income.” The adjustment was needed to show that Italy conformed to the evolving European Community requirements for debt and deficit ratios to national income. ISTAT simply asserted that Italy had a black market constituting between 20 percent and 30 percent of its economy, and offered the 18 percent boost as a “conservative” adjustment. Greek politicians made similar statements when Eurostat confronted them, putting the size of the country’s black market above 25 percent.

No outrage greeted these assertions. On the contrary, the statisticians went back to work churning out aggregate indices, and Keynesian economists plugged them into ever-more-recondite models. It might have seemed appropriate to ask whether it was true that certain countries have black markets encompassing a quarter or a third of economic activity, and, if it was true, why. But no one did, because the claim supplied the data needed to keep the Keynesian models running. The economics profession was content with government fabrications until early in 2010, when it seemed clear that Greece, as well as some other southern European countries, might not be able to service the debt that it had assumed on the strength of its aggregate economic measures. In politics, as in business, only the threat of bankruptcy forces lies into the open.

As an advocate of emergency measures during the 1930s, Keynes, as we have said, deserves some credit. His legacy in economic theory, though, has been malignant. It is easy to explain why he drew support during the 1930s. It is harder to explain why the Keynesian model, with its inherent tendency to drive off the road, has survived so long.

Part of the answer is that countries devoted to “Keynesian” policies had a run of good luck that covered up the systematic errors of economic policy. And the memory of this run of good luck still beguiles politicians and their advisers, who yet hope that the easy times will come back.

A major source of that good luck was the migration of capital and talent spurred by troubles elsewhere. Until 1989, most of the world suffered under communist or other dictatorial regimes prone to violent political upheaval. Whatever talent and capital was able to escape from the dictatorships arrived on the shores of a handful of Western countries, foremost among them the United States. The export of human and financial capital to the United States and a few other countries helped cover up accumulating mistakes. In politics as in business, competitors survive not because they are clever but because the competition is stupider.

The massive flow of talent and capital to Western shores until 1989 had consequences that economists have not acknowledged, let alone measured. Consider the present debate about taxation. Paul Krugman intoned in a recent New Yorker interview: “I don’t advocate a marginal tax rate of 70 percent, but it’s worth noting that we had rates in that range all through the 1960s and much of the 1970s, without much evidence that effort at top levels was being crippled. So that’s feasible.”

Well, it is not feasible—unless the rest of the world were to move to even higher marginal tax rates, which is not going to happen. The United States was able to sustain that marginal tax rate during the 1970s because it seemed a tax paradise compared with the rest of the world. When the United Kingdom imposed a 90 percent marginal tax rate, its best talent fled to the United States and Canada—the Beatles included. The flow of talent and capital covered for many domestic mistakes within the United States. When America enjoyed a monopoly as a destination for talent and capital, governments were able to impose high tax rates, for capital had few other viable destinations.

Observers such as Paul Krugman believe that the past can be replicated because they do not understand that past, particularly the sources of the Western nations’ good luck. His is the Keynesian mindset: a short-run view of a closed economy. But the American monopoly did not last. Communism collapsed in 1989, and the previously unstable and chaotic countries of Asia turned into economic tigers within a decade. Other parts of what we then called the Third World—a term that rings quaint today—developed political stability and economic success.

Western countries, whose workers are used to living far better than people with similar skills in the rest of the world, are in trouble. They find themselves with a generation of employees in the public and private sectors who have been pampered by the extraordinary and nonrepeatable circumstances of the past fifty years.

Greece is the industrial world’s first casualty, with Portugal, Italy, Spain, and perhaps even France not far behind. Some American political jurisdictions, notably California, are suffering similar crises. These casualty countries are all democracies. But political parties are slow to correct the accumulated mistakes of decades. Capital markets are forced to correct mistakes much faster; the markets are signaling that unless governments move swiftly, they risk default. Rather than learn their lesson and abandon Keynes, European governments have responded by throwing him into reverse, through fiscal austerity that suppresses growth and increases the real burden of public debt. What their countries need is to let their people leverage their entrepreneurial talents and build up assets.

Unless this is done, the world economy will remain locked into a stagnation punctuated by periodic crises. Lasting prosperity stems from innovation and entrepreneurship. It is easy to talk about this, but hard to bring it to life.

To match talent and capital, and sort out winning ideas from the failures, societies must build their risk capital and sustain the entire maze of institutions that maintain accountability among investors, entrepreneurs, and intermediaries. Accountability requires the continuous reevaluation of entrepreneurial experiments and the ability to intervene to shut them down through bankruptcy to prevent costly errors from compounding.

Government bureaucracies are the worst conceivable arbiters of failure and success because bureaucrats are rewarded not by the success of enterprise but by political constituencies that often have reason to avoid the consequences of failures. So-called speculators play an indispensable role, for they provide the liquidity needed during the long years that elapse between the floating of an idea and its successful execution.

Matching talent and capital and sorting winning ideas from failures do not sound difficult on paper. But a political culture that fosters the entrepreneurial spirit, tolerating trial and error and allowing the innovative newcomer a shot against entrenched interests, has been a relative rarity even in Western history. Western societies developed the institutions that support entrepreneurship only through a long and fitful process of trial and error. Stock and commodity exchanges, investment banks, mutual funds, deposit banking, securitization, and other markets have their roots in the Dutch innovations of the seventeenth century but reached maturity, in many cases, only during the past quarter of a century. Among the Western nations, the United States has been the venue most friendly to entrepreneurship, although the present administration—which has expanded the government’s lock on financial markets to an extent never before seen in peacetime—has taken the country in a different, and troubling, direction.

Washington has turned the financial industry into the equivalent of a quasi-governmental public utility. The federal government provides emergency capital and cheap money to the banks, and the banking system in turn finances the federal deficit. The present administration has turned American capital markets into a state-influenced oligopoly, antithetical to entrepreneurial innovation. It is not surprising that venture capital, business start-ups, and other indicators of entrepreneurial activity remain depressed, and the economy shows little sign of recovery.

In this situation it is not a good sign that the same cast of characters who led the financial system over the cliff—including Federal Reserve Chairman Bernanke and former New York Federal Reserve Bank president (now Treasury Secretary) Timothy Geithner, both of whom have only bureaucratic and academic experience—continue to direct policy. The public officials and leaders of financial institutions who managed their way into the crisis have little incentive to ask tough questions about what went wrong, much less to change the way that things are done.

It is time to drive out the Keynesians and adopt a different benchmark altogether. The goal of economic policy should be to help individuals and firms sustain entrepreneurial risk-taking over an extended period of time. It is not difficult to identify what should be done, and in what sequence.

Maintaining confidence in the currency and the payments system is the first order of business: If systemic risk overwhelms portfolio decisions, investors will not take risks on innovation. The Federal Reserve’s reliance on the CPI led to errors in monetary policy, which some Keynesians now propose to repeat. There is a better way to guide monetary policy toward a stable currency. In a recent essay (“Why America’s Economic Recovery Needs the Global South,” First Things, December 2009) we argued that global currency stabilization, starting with a fixed-parity arrangement between the U.S. dollar and a convertible Chinese yuan, would provide a new yardstick for monetary stability.

The governments of the G20 misuse tax policy as an instrument to manage aggregate demand, oscillating between “stimulus” and “austerity.” We have already seen, in the sad case of Western Europe, that reshuffling debt from private to public balance sheets only infects public credit with systemic risk and ultimately leads to the collapse of both public and private credit.

The choice is not between stimulus through government spending and austerity but between stagnation and innovation. Government policy must foster independent sources of risk capital. In the 1790s, or the 1930s, or the 1980s, increasing government debt helped to do this, but it is very hard to argue that this is the case in 2010. The policy lever that can best influence risk-taking today is fiscal. Governments should reduce taxes on capital income and eliminate taxes on capital gains entirely.

Spending cuts must be second in the sequence, after tax reductions have generated more risk capital and employment. Cutting spending first will cause a sharp drop in employment, with no collateral benefits through additional deployment of risk capital.

The worst course of action would be for the federal government to arrogate to itself the role of the private institutions slowly established by the efforts of tens of thousands of talented individuals. Restoring conditions for long-lasting prosperity requires risk investments that create assets, which in turn help create seed capital for new entrepreneurs, in a virtuous cycle. Government job creation produces at best temporary incomes, diverting capital from creating assets.

Finally, reform of immigration policy must ease the entry of skilled, entrepreneurial immigrants. Without attracting the talent that has long contributed so critically to the country’s prosperity, America cannot regain its glory days (see Reuven Brenner, “Our Muddled Masses,” First Things, January 2010).

The United States is now at a critical juncture. The economy has failed to respond to the administration’s Keynesian patent medicine. Washington can either start this “virtuous cycle,” adjusting capital markets, taxes, and regulations accordingly, or doom the country to decades of mediocre performance.

Reuven Brenner holds the Repap Chair at McGill University’s Desautels Faculty of Management and is author of Labyrinths of Prosperity and Force of Finance. David P. Goldman is a senior editor at First Things. He was previously head of fixed-income research at Bank of America and managed research groups at other investment firms.