But don't take it from me, take it from the very first Amendment in our Bill of Rights. Why didn't they make the US officially a Christian country if it was really made by Christians exclusively for Christians?
Discussion
Not only did they not do that, they explicitly forbade it. Probably because they were smart enough to realize that, although they may have been Christian, a theocracy of any kind would be just as bad a tyranny as the one they were escaping.
And I'm guessing it was also a self-interested decision: it protected them from being persecuted themselves if their religion ever stopped being dominant. Yet another reason not to govern religiously at all.
America was founded as a Christian nation. From the Pilgrims at Plymouth Rock to Washington, Adams, and Lincoln, our laws, morals, and culture come from Christianity. It's not controversial: most people were Christian, and the nation's soul is Christian.
The First Amendment doesn't erase the fact that America was founded by Christians, for a Christian people. It simply prevented the federal government from picking which Christian denomination would rule over the others. The Founders assumed a Christian moral order; they didn't think the nation could function without it. Removing Christianity from America's foundation isn't what the First Amendment was about. That reading is a modern misinterpretation.