Was America Ever Christian?
Founding, Awakening, and a Common Myth
Jan 9, 2023
A careful history of the United States reveals a kind of “Christian nation,” but not the one many expect: not a political faith enshrined from above, but a cultural one spread from below.