Is America a Christian Nation?
It's not uncommon for me to hear someone say that 'America needs to get back to its roots as a Christian nation,' or some similar sentiment, usually in response to a Leftist political view. This seems to be a desire to return to some former glory days of the past, though to which time exactly I'm not sure. The reason I don't believe this is a correct or healthy expression is that, no, I don't believe we are a Christian nation. I also don't believe that we ever were, nor that we were ever intended to be one.

I identify as a conservative evangelical Christian, which happens to be the demographic most likely to hold the belief I am offering a kind rebuttal to. Although I self-identify this way, I must also acknowledge that I am much less politically staunch than many of my comrades. Even in light of this, I don't believe my thoughts here are politically or emotionally driven; rather, they are grounded in facts. What is a Christia...