
The age-old distinction between schoolchildren and university students is fast losing its meaning. On many campuses, the infantilization of university students has become institutionalized. College administrators treat students as if they were biologically mature children rather than young men and women. Administrators assume that millennials are so fragile that they require therapeutic support to make the transition from high school to the university.

In some instances, the infantilization of university students has become a caricature of itself. Many universities provide anxious undergraduates facing exams with soft toys and pets to stroke in designated chill-out rooms. Harvard Medical School and Yale Law School both have resident therapy dogs in their libraries. At the University of Canberra in Australia, according to a 2015 report in The Australian, pre-exam stress-relief activities include “a petting zoo, bubble wrap popping, balloon bursting, and a session called ‘How can you be stressed when you pat a goat?’”

Most explanations for the emotional fragility of university students point to new social and economic factors, such as the rapid pace of change or the economic insecurity undergraduates face. Such accounts overlook one of the main drivers of the infantilization of young people, one that transcends social and economic conditions: the inability of contemporary society to educate young people in the values of the past.

Education needs to conserve the past. Political philosopher Hannah Arendt was unequivocal on this point. “To avoid misunderstanding: it seems to me that conservatism, in the sense of conservation, is of the essence of the educational activity,” she argued. Arendt’s objective was to conserve not for the sake of nostalgia but because the conservation of the old provided the foundation for renewal and innovation. Indeed, she went so far as to argue that “education must be conservative” in order to create the conditions wherein children can feel secure to reform and improve their world. It is only in relation to the world as it has been preserved that young people develop their potential for creating something new.

The characterization of conservation as the essence of educational activity can be easily misunderstood as a backward or reactionary political agenda. But the argument for conservation is based on the understanding that, in a generational transaction, adults must assume responsibility for the world as it is and pass on its cultural and intellectual legacy to young people. An attitude of conservation is called for specifically in the context of intergenerational transmission. Until recently, leading thinkers from across the ideological divide understood the significance of transmitting the knowledge and the values of the past to young people. Antonio Gramsci, the Italian Marxist thinker, wrote in one of his Prison Notebooks, “It is imagined that the child’s mind is like a ball of string which the teacher helps to unwind. In reality each generation educates the new generation, that is, forms it.” He assumed that young people’s experience of life is insufficient to grasp the workings of the world. They require the assistance of the older generations to gain their bearings.

Writing from a conservative perspective, the English philosopher Michael Oakeshott concluded that “education in its most general significance may be recognised as a specific transaction which may go on between the generations of human beings in which newcomers to the scene are initiated into the world they inhabit.” Oakeshott went on to call it a “moral transaction,” one “upon which a recognizably human life depends for its continuance.”

The socialization of young people through the intergenerational transmission of the legacy of the past forges connections between members of society. It provides young people with the cultural and moral resources necessary to make their way in the world and gain strength from the experience of their elders. A sixteen-year-old boy whose uncle and grandfather served in the navy has a model of duty available to him even if he doesn’t join up when he comes of age. A girl whose mother committed herself to environmental activism all her life grows up with a set of values that orient her to the planet. This is more than school-acquired knowledge. It is fundamental to the adulthood that children and teenagers envision as they get older. The stories that children hear from their parents, relatives, and neighbors help them to understand who they are, how they are expected to behave and to respond to the challenges of everyday life. Through this intergenerational dialogue, the experience of the past is both tested and revitalized.

But during the past century, this natural process has been stymied. Western societies have found it increasingly difficult to socialize young people into the values of the previous generations. In the face of extraordinary technological and social change, older generations have lost confidence in the values into which they were acculturated. As things now stand, Western society is estranged from the values that inspired it in the past. It no longer provides adults with a compelling narrative for socialization.

The estrangement of society from the legacy of its past acquired an explicit form among progressive educators in the interwar era. As R. J. W. Selleck noted in his study English Primary Education and the Progressives: 1914–1939, this group of educators was “distressed and alienated” by the values that prevailed at the time. They “shied away from imprinting the future generation with the marks of the present.” They regarded Britain’s Empire, especially its political and class culture, as a source of embarrassment. J. H. Nicholson, a professor of education at Newcastle University, explicitly voiced this outlook. He worried that “we are an uneasy generation, most of us to some extent ill-adjusted to present conditions.” The logical course of action, then, was that we “should therefore beware of passing on our own prejudices and maladjustments to those we educate.”

It was a compelling precaution for an adult cohort that had lived through World War I and was witnessing the rise of fascism and totalitarianism across Europe. At first, though, the trend toward less authoritarian educational institutions was confined to early schooling. Educators took the view that play was an ideal vehicle for learning since it helped children relax and gain confidence. The teacher should be less a stern taskmaster and more a facilitator of exploration and creativity. This rationale deepened, however, when the approach spread from the school to the home, and parenting experts of the time began to warn mothers and fathers about the potentially damaging effect of traditional educational pressures on their children. During the 1930s and 1940s, for instance, some experts claimed that premature attempts to teach children to read would undermine their capacity in the long run. In order to avoid the harm caused by premature reading, many American children were raised on a diet of picture books.

Critics of this therapeutic turn of education, sensing that these arguments could be extended up the age ladder, urged that its influence be confined to the schooling of infants. The educational theorist Michael Demiashkevich, who escaped the Soviets and immigrated to America in 1923, warned that making mud pies and dressing dolls were great kindergarten activities but “the kindergarten must not be permitted to extend its domination over the secondary school and college.” While progressive educators of his time insisted that schools “should abandon formal discipline” and infuse classrooms with “the spirit of play,” Demiashkevich emphasized the necessity of role models: “Each generation of men is much stronger against the various adversities of life when the young people preserve the results of the efforts expended by their elders.” He might have agreed to lighten this remembrance of past efforts for five-year-olds, but certainly not for high school and college students. Though he clearly anticipated the subsequent growth in the influence of childish pedagogic techniques in the classroom, he would be shocked by the therapeutic use of pets and soft toys in the twenty-first-century university setting.

Obviously, Demiashkevich’s admonitions had no influence on American public education in the following decades. The representation of the values of the past as an outdated form of prejudice became more than just a dogma of teacher-training programs. It swept through American society at large in powerful movements of personal liberation: counterculture poetry, pop psychology, feminism. As the American political scientist C. J. Friedrich stated in 1972, “in the twentieth century tradition [became] a pejorative term.” Since the end of World War II, and especially since the 1960s, the sentiment of personal freedom has had its negative form, too: an anti-traditionalist scorn directed at those who refuse to move with the times and who still take seriously the values of the past. This deep-seated mistrust of tradition goes so far as to warn mothers and fathers to be wary of the child-rearing practices used by parents in previous times. Instead, they are advised to heed today’s child-rearing experts.

In Western societies, the silent crusade against the past directs its energy most destructively toward altering the way that the adult world socializes young people. The advice and views of grandparents are frequently castigated as irrelevant and possibly prejudicial to the development of the child. For example, a report published in The International Journal of Obesity claimed that the risk of child obesity is 34 percent higher if grandparents care for children full time. The implication of this argument is that it would be better for children to spend less time with their grandparents. As a result of the institutionalization of these attitudes, children are distanced from the values held by their grandparents and certainly by their more distant ancestors.

This outlook puts adults in a helpless position. It’s not that parents and grandparents don’t love or provide for their children enough. Cultural relevance and competence are the issues, not devotion. Indeed, parental love could even be harmful to children if it isn’t informed by the empirical knowledge the experts possess. As child psychologist William Kessen put it in a 1979 article in American Psychologist, the displacement of parental intuition by expert authority is cast by the field as “the mighty brush of scientific method” clearing away “the eccentricities of folk knowledge, and the superstitions of grandmothers.” In truth, he countered, a lot of pseudoscience lay behind the research: “there have been unsettling occasions in which scraps of knowledge, gathered by whatever procedures were held to be proper science at the time, were given inordinate weight against poor old defenseless folk knowledge.” That was the strategy: to marginalize the family’s folk knowledge, with parenting experts standing ready to provide technical solutions to the problem of growing up.

Needless to say, skepticism about the values and authority of the older generations had important implications for intergenerational relationships. Once adults have become suspicious of the values into which they were socialized, their capacity to educate children becomes compromised. Instead of confronting the question of how to conduct essential intergenerational transactions both within and outside education, people have tended to evade the duty altogether. Educators and experts justify this act of evasion by claiming that in a fast-changing world, the values and customs of the past become irrelevant. What children need is an education that provides them with the skills that are necessary for them to be confident, flexible, and emotionally intelligent. If society is changing so rapidly, then the moral, artistic, and political content of the past is less valuable than the ability to adapt to constantly changing conditions. From this standpoint, the past and its values are not a legacy worth transmitting to young people.

We are now at the point at which we can assess this progressive attitude toward the past and the generational breakdown it creates. As David Walsh noted in The Growth of the Liberal Soul (1997), “the inability of liberal societies to develop any institutional means of transmitting its own virtues” has precipitated a cultural crisis. The difficulty Hannah Arendt warned of, that of socializing the young without transmitting the values of the past, has been fully realized. “Since the world is old,” she observed, “always older than [children] themselves, learning inevitably turns towards the past, no matter how much living will spend itself in the present.” When the turn doesn’t take place, living does indeed spend itself in the present—but without the necessary equipment. The phrase “learning from the past” is often taken as a platitude, yet it is impossible to engage with the future unless people draw on the insights and knowledge gained through centuries of human experience. Individuals gain an understanding of themselves through familiarity with the prior unfolding of humanity.

This is not to say that educators have entirely abandoned the duty of socialization. That’s impossible. Lack of clarity about the transmission of values has led to a search for alternatives. In the past, those devoted to the socialization of children were mainly interested in the transmission of moral and cultural attitudes. These concerns still retain their relevance. But there is a striking difference between the way that schools see the role of socialization now and the way it was understood a century ago. Socialization is increasingly perceived as a form of behavior management. It is less about introducing pupils to an established way of life or familiarizing them with a community’s moral code than it is about instructing them in how to manage their emotions and conduct relationships with others. Now, we train them in so-called life skills. The role of parents is not so much to transmit values but to validate the feelings, attitudes, and accomplishments of their children. In the twenty-first century, responsible parents are portrayed as accomplished managers of their children’s emotions.

Experts and educators frequently claim that the socialization of young people relies increasingly on therapeutic techniques because of recent discoveries about hitherto unknown deficits of childhood, deficits that call for therapeutic guidance and intervention in the lives of children. These “discoveries” about the vulnerability of the child, however, can be interpreted quite differently. They are not discoveries about the condition of children, but an expression of the difficulty that parents and other adults experience in attempting to socialize young people.

Affirming children and raising their self-esteem are projects that are actively promoted by parents as well as schools. This emphasis on validation has run in tandem with a risk-averse regime of child-rearing. But while validation aims to empower youths—to give them the confidence and flexibility to be successful adults—the opposite is the case. The actual consequence of this approach to parenting has been to limit opportunities for the cultivation of independence and to extend the phase of dependence of young people on adults. The extension of dependence is reinforced by the considerable difficulties that contemporary society has in providing young people with a persuasive account of what it means to be an adult. In popular culture, adulthood is rarely associated with positive connotations. It is often portrayed as the gradual loss of options. That is one reason why social scientists invented the term “emerging adulthood.” The term encompasses those who consider themselves too old to be adolescents but too young to be fully fledged adults.

In other words, we have a perverse situation wherein parents and educators attend to children’s needs and then get out of the way so that they may grow up independently—only to find that the children become more needy as young adults. The system of behavior management encourages youths to interpret their existential problems as psychological ones. Exposed to therapeutic pedagogy during their schooling, the recent cohort of undergraduates is educated to understand the ordinary challenges of growing up through the language of mental health. Not surprisingly, they don’t easily make the transition to conduct associated with maturity and the exercise of autonomy.

The socialization of children through validation invariably communicates the idea that self-esteem is a core value of education. Young people expect to be affirmed and believe that they possess the right to be esteemed regardless of their accomplishments. It’s not their fault—this is what they have been taught. Validation is a prerequisite for mental health. That is why so many of the demands of the recent generation of campus protesters rest on the claim that students’ emotional well-being is at stake.

Trigger warnings, safe spaces, the policing of offensive language, and the crusade against microaggressions and cultural appropriation are all justified on the ground that they protect students from trauma and other mental health problems. Take the case of an article titled “Classroom Censorship Can Improve Learning Environment,” published in Oberlin College’s online student paper. The student author combines his appeal for a trigger warning on the Greek tragedy Antigone with an account of his personal struggle with the idea that “suicide is the way out.”

The current cohort of high school graduates arriving on campus has been thoroughly socialized into this therapeutic worldview. From their standpoint, just about any experience that makes students feel uncomfortable or distressed—a poor grade, classroom criticism, an act of miscommunication—is an act of invalidation.

There is another perverse irony here in that the demand by students to be validated constitutes an invitation to paternalism, which, if one is an adult, is a form of disrespect. But universities have been more than ready to embrace the task. Indeed, they have become hyperactive moral guardians of their students. They play an aggressive role in providing students with guidelines for etiquette and behavior. University speech codes, guidelines on microaggressions, and workshops on bias and consent are examples of practices and initiatives designed to guide students’ outlook and behavior. And many students are more than willing to cooperate. At UC Santa Cruz, black students recently demanded, among other things, that all new students undergo “diversity competency training”—a form of paternalism that falls nicely in line with the administration’s therapeutic guardianship.

Outwardly, such codes and catechisms that touch on the minutiae of everyday campus life resemble premodern religious texts and manuals of etiquette. They differ, however, in one important respect. Today’s codes of conduct self-consciously avoid a moral tone. Concepts such as microaggression and language policing are the invention of psychologists and advocates of identity politics who emphasize sensitivity and offense, not of moral philosophers and theologians who appeal to the true and the good.

As I noted in What’s Happened to the University?: A Sociological Exploration of Its Infantilisation, this extension of the role of socialization into the domain of higher education was well observed by the sociologist Alvin Gouldner as early as the late 1970s. Gouldner described colleges and universities as “finishing schools” for inculcating the values promoted in higher education. Since then, the socializing ambitions of universities have become increasingly unrestrained. They do not simply attempt to promote a distinct outlook on the world, but actively seek to delegitimate values associated with traditions that they regard as outdated. Take the statement made by Chancellor Ronnie Green of the University of Nebraska welcoming new students to the 2016–2017 academic year. He praised diversity and inclusion as the principal values of his institution. As far as he was concerned, students arriving on campus had no option but to embrace these values. As he put it, “our beliefs on diversity and inclusion . . . are not negotiable.” His call to conform or else echoed the authoritarian temper associated with illiberal institutions.

Gouldner noted also that universities often socialize students into values that are “divergent from those of their family.” One of the ways that this objective is accomplished is by encouraging students to distance themselves from “out-dated traditional language” and customary forms of behavior. In other words, the university is a place of re-education. Family lore must be discarded.

But it takes no great insight to recognize the poverty of the university’s ultimate aims. Though the project of changing language and behavior has succeeded in marginalizing traditional relationships and attitudes, it fails to give deep meaning to human experience. That is why moral policing on campuses is continually in search of new ways of “raising awareness.” Awareness has little to do with a moral sensibility. Raising awareness need not have a particular objective, either. It is represented as a value in its own right, and in a morally disoriented world it helps socialize young people to believe that those who possess raised awareness are better people than those with traditional views. But since awareness is a caricature of a virtue, its pursuit merely evades genuine answers to the big questions of our time.

It is unlikely that Green and other campus leaders are inclined to see that the socialization of university students is systematically framed through the medium of therapeutics. They present their moral policing as precisely the type of adult authority youths need right now. But their techniques of behavior management offer a poor substitute for the intergenerational guidance provided through the traditional exercise of adult authority. Today’s version, emphasizing tolerance and safety, cannot provide young people with a legacy that could serve as the foundation for cultivating habits of independence and maturity. Without such a foundation, the younger generation is deprived of the moral resources that can give youths the strength to make their way in the world. In such circumstances, many will feel fragile and disoriented and forever disposed to interpret existential problems through the prism of mental health. Socialization through validation, not through tradition and lore, leaves the young without a ground of meaning through which they can make proper sense of their experience.

Frank Furedi is emeritus professor of sociology at the University of Kent in Canterbury, England. His What’s Happened to the University?: A Sociological Exploration of Its Infantilisation is published by Routledge.
