A Nation Under God

Christian Nationalism is currently prominent in the United States. Has this always been the case? The belief that America was founded upon Christian principles dates all the way back to the drafting of the Constitution. But how was the United States truly founded? A close examination reveals that the United States was founded as a republic in which the separation of church and state is absolute. Why, then, has there been such a persistent push to unite church and state, particularly after the Civil War, to bring America back to God?
