On November 4, 2014, sixteen-year-old Cameron Lee, a popular, athletic, straight-A student at Henry M. Gunn High School in Palo Alto, California, leapt in front of a commuter train. His suicide note provided no clear reason for his act; there were no apparent signs of mental illness, and he was not a bullied misfit. His death followed two other student suicides just three weeks prior, one from the same school and another from a nearby private high school. Three months later, another senior at Gunn, by then known to local students as “the suicide school,” jumped to his death from the roof of his family’s home.

Gunn High School is located in one of the wealthiest school districts in the country and has some of the nation’s highest test scores. Its students succeed brilliantly in the meritocratic game of standardized tests and college admissions. But the pressure to perform has left them susceptible to feelings of worthlessness. If one can’t measure up and make the grade—what then?

Gunn saw a similar cluster of suicides in 2009. In separate incidents, three current students, an incoming freshman, and a recent graduate all jumped in front of the local Caltrain. That year, another recent graduate of the school died by hanging himself. Following these suicide clusters, a 2014 survey of Palo Alto high school students revealed that 12 percent of them had very seriously contemplated suicide in the past year. Another recent report summarizing national and state-level surveys of American high school students put this number at 17 percent.

The largest school district in California, Los Angeles Unified, recorded more than five thousand incidents of suicidal behavior or deliberate self-harm (such as cutting) last year. When this district began tracking these issues in the 2010–2011 school year, there were only 255 incidents. Angus Deaton, a Princeton economist who won the Nobel Prize for his work on the intricacies of measuring human well-being, has been following what is now a national epidemic of suicide and depression. In a recent study, he found that since 1999 there has been an alarming national increase in deaths from drugs, alcohol abuse, and suicide—a trend that is especially pronounced among white Americans born since 1975. Deaton calls these “deaths of despair.”

Due to this epidemic of premature deaths, the overall life expectancy in the U.S. has begun to decline for the first time since the 1930s. In the year 2000, the outbreak of deaths of despair was concentrated in the Southwest (Nevada, Arizona, New Mexico). By 2007, the trend had spread to Appalachia, Florida, and the West Coast. By 2014, the epidemic was countrywide, found in both rural and urban areas in every region of the U.S. Add to this the drug overdose epidemic of the past few years—the worst drug crisis in U.S. history in terms of mortality—and these deaths of despair show no signs of slowing.

Depression is now the most common serious medical or mental health disorder in the United States. According to the World Health Organization, depression is the leading cause of disability worldwide. Sixteen percent of Americans will have an episode of major depression at some time in their lives, and six percent of all Americans—14 million—have suffered from major depression in the past year. Furthermore, rates of disabling depression have markedly increased over the past several decades, particularly among young people. According to data from the Department of Health and Human Services, more than three million adolescents reported at least one major depressive episode in the past year, and more than two million reported severe depression that impeded their daily functioning. A recent national study found that the share of twelve- to twenty-year-olds who had suffered major depression in the last year increased by 37 percent from 2005 to 2014. We are witnessing a rising plague of melancholy.

Most people who die by suicide are suffering some form of depression, whether major depressive disorder, the depressive phase of bipolar disorder, or alcohol- and drug-induced depressive states. The most recent data from the Centers for Disease Control indicates that, between 1999 and 2014, suicide in the U.S. rose dramatically for both men and women in every age bracket up to age seventy-five. Social scientists have been particularly baffled by the fact that the suicide rate among girls ages ten to fourteen has tripled. We should let these numbers sink in: Suicide is now the second leading cause of death among adolescents and young adults, and the tenth leading cause of death overall in the United States.

Rising rates of suicide, drug abuse, and depression can all be traced to increased social fragmentation. Since the 1980s, reported loneliness among adults in the U.S. increased from 20 percent to 40 percent. The recently retired surgeon general announced last year that social isolation is a major public health crisis, on par with heart disease or cancer. He noted that loneliness is associated with increased risk of heart disease, stroke, premature death, and violence. It works in a way comparable to smoking or obesity: increasing a whole host of health risks and decreasing life expectancy. It is no accident that one of the most severe punishments we inflict on prisoners is solitary confinement—a condition that eventually leads to sensory disintegration and psychosis. It is not good for man to be alone.

Even where familial or other social connections remain intact, these ties are often weaker and the mutual obligations less binding today than in decades past. I recall one young adult patient who had given his depressed mother explicit permission to kill herself if she someday chose to do so. “I don’t want her to do it, but who am I to tell her she needs to continue living? It’s her decision.” If she was dying of despair, he was not going to get in the way.

Economic explanations alone cannot account for the rise in depression and suicide. Adolescent suicide, for example, is equally common among the very wealthy and the very poor. According to Deaton, the rise in suicides depends “on family, on spiritual fulfillment, and on how people perceive meaning and satisfaction in their lives in a way that goes beyond material success.” Family is the first society in which we gain social identity and security, and its declining fortunes have left many Americans vulnerable to despair. While overall divorce rates have declined modestly since a peak in the 1980s, divorce rates remain high for those without a college degree, and more Americans are simply opting out of marriage entirely.

Sociologists have documented the close connection between the retreat from marriage and declining religious participation, especially among the working class. As a consequence of these changes, many Americans have “lost the narratives of their lives,” as Deaton puts it. This leads to a loss of meaning and hope. In a survey of 35,000 people from all fifty states, the Pew Research Center found that the percentage of Americans who believe in God, attend religious services, and pray daily declined significantly from 2007 to 2014. This drop is more pronounced among whites than blacks, and is largely attributable to the “nones”—the growing cohort of Americans, particularly among the millennial generation, who say they do not belong to any organized religion. The religiously unaffiliated now account for 23 percent of the adult population, up from 16 percent in 2007.

What is behind these trends? There are doubtless complex factors in play, including economic problems. Predictably, liberals are calling for a stronger safety net and a single-payer health-care system, while conservatives are calling for a deregulated free market that will spur economic growth and lift all boats. Neither solution addresses the deeper cultural dynamics.

In 1897, Émile Durkheim published Suicide, an early attempt to understand the connection between culture and suicide. Noting the difference in suicide rates between Catholic and Protestant Germans, Durkheim argued that higher levels of social integration in Catholic societies helped reduce suicide, while greater individual autonomy and social isolation in Protestant societies tended to increase it. He identified two typical forms of suicide: There is egoistic suicide, stemming from a lack of integration into a community and leading over time to a sense of meaninglessness and ennui. Then there is anomic suicide, which increases during periods of social and economic upheaval—times at which people lose their communal moorings and drift toward despair. 

In recent times, America has experienced both a weakening of social connections and rapid forms of cultural change. Robert Putnam of Harvard has documented a dramatic decline in social capital—the fabric of connections to family, friends, neighbors, and mediating institutions of society—over the past several decades. There has been a loss of blue-collar jobs (with an attendant loss of responsibility and social esteem for men), changing roles and expectations for women, increasingly unstable family structures, isolated suburban living, and absorption in television and the Internet. 

What is lost with this decline of social capital? Thick social networks (the real, not virtual, variety) facilitate the exchange of ideas and information, as well as norms of mutual aid and reciprocity, collective action and solidarity. These help form our identities and give our lives a strong sense of purpose and belonging.

Too many people today have lost these moorings. Social bonds are weakening, and the social fabric is fraying. We are at risk of losing a solid identity, a clear orientation, and the coherent narratives that give meaning to our individual and shared lives. In a world stripped of universally binding truths, the sense that we are losing solid foundations leads to free-floating angst. This is a condition that cannot be tolerated for long. 

William Styron’s memoir of melancholy is aptly titled Darkness Visible, a phrase taken from Milton’s description of hell in Paradise Lost. Styron recounts that his depression was a condition so mysteriously painful and elusive as to exceed description. The inability of others to understand this experience is part of what makes depression so isolating. Preferring the older term melancholia, Styron lodges a protest against the very word depression, a term used indiscriminately to describe an economic downturn or a rut in the road—a truly wimpy word for such a serious illness.

The medical and psychological sciences have taught us a lot about this affliction, but the full story of depression is more complex. Innate biological and genetic factors contribute, but social and cultural factors also play a role. In short, while depression does indeed involve a “chemical imbalance in the brain,” this does not mean that it is nothing but a chemical imbalance. Your serotonin and dopamine levels may be out of kilter, but you may still have a problem with your Tinder compulsion and dinners alone in front of the television. 

We now have a sizable body of medical research which suggests that prayer, religious faith, participation in a religious community, and practices like cultivating gratitude, forgiveness, and other virtues can reduce the risk of depression, lower the risk of suicide, diminish drug abuse, and aid in recovery. To cite just one finding: Tyler VanderWeele of Harvard’s T. H. Chan School of Public Health recently published a study of suicide and religious participation among women in the U.S. Against the grim backdrop of increasing suicide rates, this study of 89,000 participants found that some groups remain protected from the rising tide of despair and self-harm. Between 1996 and 2010, those who attended any religious service once a week or more were about one-fifth as likely to commit suicide. Those who identified as either Catholic or Protestant had a suicide rate about half that of U.S. women in general. Of the 6,999 Catholic women who said they attended Mass more than once a week, none committed suicide. Religious practice turned out to be more important than mere affiliation; self-identified Catholics who did not attend Mass had suicide rates comparable to those of other women who were not active worshipers.

There are straightforward reasons why religious practice protects against suicide. Church attendance is a social activity that protects people against loneliness and isolation. While this is not of course a unique benefit of religion, certain things are. Judaism, Christianity, and (in most cases) Islam have strong moral prohibitions against suicide. In Hinduism and Buddhism, suicide is considered bad karma. When these moral prohibitions are internalized, they reduce the risk of deliberate self-destruction. Furthermore, religious faith can instill a sense of meaning and purpose that transcends present exigencies; this helps people not only survive periods of intense anguish, but even to find meaning in suffering. As a patient of mine once put it, “If not for my relationship with Jesus, I would have killed myself a long time ago.”

Finally, long-term studies of individuals at high risk for suicide—patients who have been hospitalized for suicidal ideation or a suicide attempt—are telling. To investigate the differences between high-risk patients who survive and those who die by suicide, researchers have analyzed medical and mental health diagnoses, symptoms, physical pain, social and economic factors, and so forth. Over a ten-year span, it turns out that the one factor most strongly predictive of suicide is not how sick the person is, nor how many symptoms he exhibits, nor how much physical pain he is suffering, nor whether he is rich or poor. The most dangerous factor is a person’s sense of hopelessness. The man without hope is the likeliest candidate for suicide.

Hope cannot be delivered by a medical prescription. Yet we know it is essential for mental health. Hope allows us to live today, here, now, even as it orients us toward the future. Those who survived the Nazi concentration camps later recalled that death camp prisoners knew whenever a fellow prisoner had abandoned the last vestiges of hope. The despair could be seen in his eyes and countenance, in the very way that he carried himself. In time, the prisoners developed a name for such people: “the walking dead.” Before long, the person who had lost hope would stop eating or drinking, would come down with a terminal infection, or would straggle and be shot. We cannot live without hope. 

Contrary to popular myths about lemmings, suicide is a uniquely human behavior. Man is the only animal that deliberately takes his own life. Suicide is an act that requires rational self-reflection and awareness of one’s future. And it is influenced by one’s philosophical outlook and social context. Behavioral scientists describe depression as a response to toxic environments. Like the pain a child feels when he places his hand on a burner, depression can be a sign that an environment has become dangerous to the human organism. What are the toxic elements of contemporary culture that have led so many to withdraw into depression?

In a meritocratic age, we are valued for our usefulness. Whether in the rich precincts of Palo Alto, where children face high pressure to perform, or the forgotten stretches of West Virginia, Americans are increasingly told that they are valuable only insofar as they contribute to a productive economy. Old sources of meaning—fatherhood, fraternity, civic involvement, church membership—have receded in significance before the SAT and future earning power. When the useful replaces the good and efficiency becomes the highest value, human beings are instrumentalized. This happens at a personal level when freedom is seen as doing what you want, making life a mere means of gaining pleasure. Rather than opening up new vistas of freedom, economic and social liberation has made men subject to a logic of utility. Among the dreary death works produced by today’s culture industry, there are T-shirts that proclaim, “I’m not saying I hate you, but I would unplug your life support to charge my phone.”

The law is a teacher, and American law increasingly teaches indifference to life when it runs up against respect for radical autonomy. California and Colorado recently joined four other states in permitting doctors to assist terminally ill patients in taking their own lives. In the same week that Gov. Brown signed the California bill, two British scholars published a study showing that laws permitting assisted suicide in Oregon and Washington have led to a rise in overall suicide rates in those states.

These findings should not surprise us. We know that publicized cases of suicide tend to produce copycat cases, often disproportionately among young people. Recall the recent spate of adolescent suicides in Silicon Valley. Social scientists call this “the Werther effect,” from Goethe’s eighteenth-century novel The Sorrows of Young Werther, in which the protagonist, thwarted in his romantic pursuits, takes his own life with a pistol. After the book’s publication, a rash of suicides among young men using the same means alarmed authorities in Germany. 

A related phenomenon influences suicide trends in the opposite direction. Portrayals of people with suicidal ideation who do not attempt suicide, but instead find strategies to cope with adversity, are associated with decreased suicide rates. The so-called “Papageno effect” is named after a lovesick character in Mozart’s opera The Magic Flute whose planned suicide is averted by three child spirits who remind him of alternatives to death.

The case of fourteen-year-old Valentina Maureira, a Chilean girl who suffered from cystic fibrosis, illustrates both effects while highlighting the power of social influences. Maureira made a YouTube video begging her government to legalize assisted suicide. She admitted that the idea to end her life began after she heard about the case of Brittany Maynard, the twenty-nine-year-old woman who campaigned for the legalization of assisted suicide before ending her own life. Maureira, however, later changed her mind after meeting another young woman suffering from cystic fibrosis who encouraged her to persevere in the face of adversity. Her father complained that the media were only interested in her story when she wanted to die.

Besides the impact of publicized cases, we have evidence that suicidal behavior tends to spread person to person through social networks. These effects are measurable and reach up to three degrees of separation. My decision to take my own life raises not just my friends’ suicide risk; it raises that risk for my friends’ friends’ friends. No man is an island. Living as though we are self-creating, self-determining, atomized entities is dangerous to ourselves and to others.

As solidarity and mutual affection disappear from our public spaces, as the horizon darkens and loneliness grows, the small lights emanating from cohesive communities—grounded in faith and motivated by charity—will shine more brightly. Connections between one lonely individual and another will become all the more precious in a society that can only value individuals for their utility.

A few years ago, a man in his thirties took his own life by jumping off the Golden Gate Bridge (as more than fifteen hundred other people have done since the bridge was built). After his death, his psychiatrist went with the medical examiner to the man’s apartment, where they found his diary. The last entry, written just hours before he died, said, “I’m going to walk to the bridge. If one person smiles at me on the way, I will not jump.”

Aaron Kheriaty is associate professor of psychiatry and director of the Medical Ethics Program at the University of California Irvine School of Medicine.