Monday, in his address to the Turkish Parliament, President Obama made a statement guaranteed to spark controversy: "America is not a Christian nation." Like clockwork, conservatives voiced their complaints. Though people clearly have vastly different interpretations of what the founding fathers intended for the religiosity of America, from a theological standpoint we cannot assert that America is a Christian nation.
This struck me particularly after watching Fox and Friends Wednesday morning. The hosts were asking a panel whether President Obama was attacking or disregarding American Christians by making this statement. To their credit, they mostly agreed that President Obama did not put down Christians in America. Once the show cut to commercial, however, it aired an advertisement for, what else, Hooters, the restaurant that respects the image of God in women about as much as ExxonMobil respects the call for stewardship of God's creation.
How much more offended should we have been if President Obama had declared America to be a Christian nation? Would we really want our faith and our savior associated with a country that gave birth to a trashy restaurant that objectifies women, or worse, a country that legalized slavery for 200 years and now tolerates a wide gap between rich and poor? No. At the same time, we cannot deny that our country has done many great things under the influence of Christianity, such as the abolition of slavery, the passage of civil rights legislation, and the creation of PEPFAR.
This dichotomy is reminiscent of St. Augustine's teachings in The City of God. There Augustine argues that no human institution, not even the institution that calls itself the church, can fully embody the teachings of Christ, but that within these institutions are committed Christians who do God's will. The same applies to America. America is not a Christian nation, but there are followers of Christ within the country pushing the government and the nation to do the will of God. The only state, nation, principality, or country that can call itself a Christian "nation" is the kingdom of God, fully ushered in by the second coming of Christ.
America is not a Christian nation because no nation is a Christian nation.
Paul Hartge is the communications and media intern for Sojourners.