In my last post, I hinted at the idea that there is a problem with the mentality that says America is a “Christian” nation. Today, I want to discuss that idea in further detail.
Because of the First Amendment, the statement that America is a “Christian” nation is blatantly false. A case might be made that it was founded on Christian principles, but even that is a stretch. America was founded as a representative republic and a land where the settlers could be free from the tyranny of their previous rulers. It had nothing, really, to do with being Christian versus being pagan. England was a Christian nation in the sense that church and state were enmeshed with each other. America was to be something vastly different.
America was founded as a place where religious freedom would be respected. It was a place where one could be a Hindu, Muslim, Jew, Buddhist, or Christian, and no one could declare it illegal or ostracize you for your religious preference. Sadly, this vision died quickly in the history of our nation.
By the early to mid-1800s, there was a new idea: “Manifest Destiny,” the notion that God wanted us to expand all across the continent, that God had given us this land and that it was our duty to settle it. And who seemed to lead the way? The church.
As the United States expanded to include more territory, the church established schools to assimilate Native peoples into the American way. These schools essentially kidnapped children from their families, punished them for speaking their own languages, and forced them to learn English and American ways of doing things. All the while, they sought to indoctrinate these children and convert them to the Christian religion. While it wasn’t illegal to be of another religious persuasion, it was discouraged, if not in word, then in deed.
In the midst of this grand expansion, groups like the Latter-Day Saints and Jehovah’s Witnesses sprang up, groups that revel in brainwashing and control of their adherents. And this was all done, of course, in the name of God and the name of religion, and, to lesser or greater degrees depending on the sect, in the name of America.
This is not to say that God did not destine European settlers to come here. In a way, He did give them this land, but I don’t believe that God intended for them to abuse it and the indigenous people as they did. God did, and does, have a plan for America, but it was not to be a “city on a hill” in any political sense. If anything, God was calling people to the same kind of inclusion of “outsiders” that Jesus did.
But God did not call on the settlers to create a “Christian” nation. And the framers of our Constitution knew this. They did indeed acknowledge that God had endowed mankind with certain rights, but those rights were not formulated by a necessarily Christian worldview and were not limited to those who held to the tenets of the religion. All men were granted those rights, regardless of religion.
To claim that America is a Christian nation is to tell a lie. After all, if it were true, then wouldn’t we look a lot more like Jesus?