this blog is devoted to the stuff american evangelical culture likes
Friday, August 15, 2008
#16 Believing That America Is A Christian Nation
Christian culture is adamant that America is a Christian nation. The Constitution never mentions God, let alone Christ (the Declaration of Independence invokes a Creator, but that's as specific as it gets), yet Christian culture very much wants the country itself to be Christian. If God exists and Jesus is who he said he is, he is far more concerned with individuals and their hearts than with an entire nation conforming to a standard. But Christian culture clings to the idea that America is a Christian nation. It is a rather curious thing.