
Why Did Christianity Thrive in the U.S.?

Between 1870 and 1960, Christianity declined dramatically across much of Europe. Not in America. One historian explains why.

The U.S. has always been religiously diverse, home to people of many different faiths as well as those with no religion at all. And yet, it’s common to hear the country labeled a “Christian nation.” The historian Jon Butler looked at some of the factors that made Christianity such a strong force in U.S. public life, particularly compared with many European nations.

Between 1870 and 1960, Butler writes, public Christianity declined dramatically across much of Europe. That might seem like a natural outgrowth of urbanization and scientific advancement, with people leaving the social constraints of their tight-knit communities behind. But the U.S. did not follow suit. Instead, Christianity grew in power.

Butler suggests that one reason Christianity didn’t decline in the U.S. is that both Protestant and Catholic denominations organized to keep it strong. They held conferences about the “problem” of religion in cities, proselytized to new immigrants, and used emerging sociological tools to their own advantage. In the 1890s and 1900s, the Federation of Churches and Christian Workers surveyed almost 100,000 households in Manhattan and Brooklyn and used the data to organize congregations suited to residents’ religious needs.

U.S. churches were also different from their European counterparts in their embrace of “therapeutic theology”—an idea that tied religious belief to worldly success. While many Christians criticized this line of thinking as shallow and self-serving, Butler writes that “therapeutic theology, together with the American congregational emphasis on social services from education to child care, probably kept more Christians in America’s pews and brought more inquirers inside its churches than any other lure or attraction in the twentieth century.”

In Butler’s view, American Christianity’s success was also a product of gender and racial dynamics. Despite being excluded from most formal leadership positions, women played a much larger role in Christian institutions in the U.S. than in Europe. That included running Sunday schools, organizing temperance groups, and helping to build the growing Catholic school system.