Students in my history of architecture course are amused to discover that the final exam offers a choice of questions. Some are bone dry (“discuss the development of the monumental staircase from the Renaissance to the nineteenth century, citing examples”) and others deliberately open-ended (“General Meade overslept at Gettysburg and the South has won the Civil War; you are commissioner for the new national capital and must tell us which architects you will choose and what instructions you will give them.”) In offering this whimsical range of options, I do nothing original; my own professors at Haverford College did much the same in their day.

But a peculiar thing has happened. When I began teaching twenty-five years ago, almost all students chose the imaginative question, but year in, year out, their numbers dwindled, until now almost all take the dry and dutiful one. Baffled, I tried varying the questions, but still the pattern held: given the choice, each successive cohort preferred to recite tangible facts rather than to arrange them in a speculative and potentially risky structure. In other respects, today’s students are stronger than their predecessors; they are conspicuously more socialized, more personally obliging, and considerably more self-disciplined. To teach them is a joy, but they will risk nothing, not even for one facetious question on a minor exam.

I am hardly the only one to notice the risk-avoidance. William Deresiewicz gave a harrowing account of the problem in a widely noted New Republic essay with the incendiary title “Don’t Send Your Kids to the Ivy League.”

So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk. You have no margin for error, so you avoid the possibility that you will ever make an error.

Deresiewicz’s analysis begins with the college admissions process itself but says little about the habits and behavior patterns that these students acquired on the way to college, in early childhood. For some reason, my students had come to view playful questions as inherently risky, as if by collective instinct. Was it possible that they never learned to play in the first place?

Now if one goes by the strict dictionary definition of play as “to occupy oneself in amusement,” these young men and women have played a great deal indeed. But while thirty minutes in front of television or atop the elliptical trainer may be recreation or entertainment, it is not play. Certainly not that special kind of play that is the gleeful anarchy of children left to their own devices. This summer a woman was arrested in South Carolina on the charge of letting her nine-year-old daughter play unsupervised, something incomprehensible to those born in the 1950s or 1960s. For us, unsupervised play constituted the entirety of our childhood. Launched from the house and banished till mealtime, we roamed our allotted territory, from this house to that driveway, and not a step farther (fifty years later the electric charge of those invisible barriers still tingles). Each year the boundaries would expand, but even in the nutshell of six front yards, the child was a king of infinite space, with room aplenty for tag, hide and go seek, or relieveo.

In the last generation this sort of free and unsupervised play lost ground, along with those institutions that sustained it: platoon-sized families, stay-at-home moms, and multiple “eyes on the street.” Its place has been taken by the play date, negotiated in advance with the kind of deliberation required by the marriage of a Hapsburg and a Tudor. No longer the posse of shrieking kids, hurtling around the block, but instead the purposefully organized activities of contemporary childhood: tee-ball and soccer camp, swim class and 5k runs—the interstices filled with the distractions of the DVD and Nintendo 3DS.

For children who know only supervised play, there is no conflict that is not resolved by an adult; one never learns to negotiate and resolve conflicts with one’s peers. This was not always an amiable or tear-free process; playground justice was as harsh and swift as medieval justice. But it was justice, and even that most brutal aspect of playground life in the 1960s, the afterschool fistfight, was regulated by the standing circle of classmates who yelled out encouragement or insults, and who stopped the proceedings when they went too far. In all of this was a restless testing of the limits of freedom, with little feints and modest rebellions. These often ended unhappily, especially when the offending instrument was a stick, stone, or pack of matches, but here were those first lessons in overstepping the bounds that seem essential for the development of an individual conscience.

More and more, parents feel obliged to steer their children toward those activities that might have a future payoff, already thinking ahead to that harrowing Ivy League gauntlet that Deresiewicz describes. Such is the instrumental view: play as a means to an end and not an end in itself. But as any cultivator of plants knows, to promote one trait can cause others inadvertently to atrophy. One thinks of the modern tomato, indestructible yet flavorless, or the modern rose, exquisite and almost completely devoid of scent. So too the process of producing the well-socialized, well-tempered contemporary child has inadvertently blunted some of those qualities that can only be acquired, as it were, when no one is looking. Chief among these is initiative—the capacity to size up a situation and take quick, decisive action. Only those children who play under minimal supervision—“free-range kids,” in the happy phrase of Lenore Skenazy—get the chance to develop this sense of dash or pluck. They do so in the process of deciding what to play, establishing the rules, choosing sides, and resolving the inevitable disputes—in short, by acting as miniature citizens with autonomy rather than as passive subjects to be directed.

There is an extraordinary scene in Abel Gance’s 1927 silent classic Napoléon, which shows the future emperor as a ten-year-old schoolboy. Persecuted by older boys, Napoléon organizes an epic snowball fight and leads his small group to victory over a much larger party. In all of cinema there is no more spirited depiction of childhood play, and the moment of joyous discovery of skills and capabilities—in this case independent leadership—that will form the indispensable toolkit of the adult to follow.

Michael J. Lewis is Faison-Pierson-Stoddard professor of art at Williams College.

