
The following is an official response from the Pew Research Center to Robert Wuthnow's article, “In Polls We Trust,” from the August/September issue of First Things. Read Wuthnow's response to Pew here.
—Ed.

In his essay in your August/September issue, “In Polls We Trust,” Robert Wuthnow highlights several important challenges facing researchers who conduct surveys about religion. Chief among them is the steep decline in response rates to telephone polls. This is indeed a serious issue: the response rate in a typical Pew Research Center poll fell from 36 percent in 1997 to 9 percent today.

Consumers of public opinion polls should always consider methodology and response rates when interpreting survey findings. To the extent that his essay inspires readers to pay attention to the details of data collection, Wuthnow offers important advice.

But readers should not conclude that declining response rates make polls meaningless. An extensive body of academic research has examined the impact of non-response in surveys, and the basic conclusion is this: Non-response can produce biases but does not always do so, and it can affect some measures in a survey but not others. This research ultimately shows that response rate is not a reliable indicator of survey quality.

The not-for-profit Pew Research Center has conducted three major studies of the impact of declining response rates. The most recent study, from 2012, demonstrates that telephone surveys that include landlines and cell phones and are weighted to match the demographic composition of the population continue to provide accurate data on most political, social and economic measures. In most ways, the people interviewed in telephone polls with relatively low response rates closely resemble those who participate in government surveys with high response rates.

Our studies also show, however, that declining response rates are not without consequence. As we stated in the aforementioned study, “One significant area of potential non-response bias . . . is that survey participants tend to be significantly more engaged in civic activity than those who do not participate.” The report acknowledges that this may lead to overestimates of behaviors such as contacting elected officials, attending campaign events and attending worship services, since these are closely related to civic activity.

Wuthnow finds survey-based estimates of religious attendance particularly problematic, arguing that they overstate weekly attendance. But self-reported worship attendance has long exceeded actual attendance; the best recent research shows that attendance appears to be over-reported even in long-established academic surveys with high response rates. The religious-attendance measure is nevertheless important and revealing. It tells us about the characteristics of Americans who think of themselves as regular churchgoers, making it a valuable indicator of religious commitment that correlates with many other traits and attitudes.

Other major trends in American religion that we have reported in recent years—such as the rise of the religiously unaffiliated and the decline in the share of Americans who identify as Protestants—are also seen in virtually every survey that consistently looks at these issues. The point estimates may vary because of differences in question wording and sampling methods, but the General Social Survey, Gallup polls, Pew Research surveys and others show the same major trends.

It is also worth noting that if non-response bias in religion surveys exists, it would inflate measures of religious affiliation and observance. As our studies show and as Wuthnow stipulates, civically engaged citizens are more likely to participate in telephone polls and more likely to attend religious services regularly. It follows, then, that the move away from organized religion that we and others have reported may be more dramatic than we document.

Pollsters certainly need to continue monitoring response rates and studying their implications for the representativeness of public opinion polls. We also need to explore new forms of data collection by, for example, considering what “big data” can teach us about trends in American society, including religion.

And those interested in understanding religion should look for information from many sources. For example, deep, qualitative methods that allow individuals to share the details of their own experiences and practices are vital for understanding how religion matters in a person’s life. Religion's role in American society cannot be fully appreciated without an understanding of religious history. And the tenets and teachings of religious organizations are rooted in their respective theologies.

But we should not ignore poll findings. Surveys are uniquely positioned to inform the public about the broad contours of the U.S. religious landscape and the religious trends shaping American society. How are Catholics in the U.S. responding to Pope Francis? What might explain the nationwide growth of those who say they have no particular religion? What do Americans think of recent historic Supreme Court rulings? These and other important questions can be answered, on a macro scale, only by polling. Religion and society cannot be completely understood by deep qualitative methods or public opinion research alone. Both are critical.

Alan Cooperman is Director of Religion Research and Greg Smith is Associate Director of Research at the Pew Research Center.
