A Christian Nation?

Is the USA a Christian nation? No, it is not. If every citizen of the United States were a Christian, the statement might hold. But this country is home to many faiths besides Christianity, and calling ourselves a Christian nation belittles and denies those of other faiths while making a mockery of the First Amendment.

While most of the founding fathers were Christians and Christianity was the majority religion, they included nothing in the Constitution declaring this to be a Christian nation. With the ratification of the Bill of Rights, the founders guaranteed that all citizens would be free to follow whatever faith they chose.

Here’s a thought. Let’s just be a nation and start taking responsibility for our actions and treating other nations with respect. It’s time to leave our adolescence behind and become a mature, responsible nation, worthy of respect and emulation.

