It doesn’t seem like this was always the case - obviously there’s a lot of myth-making around the “founding fathers”, but it does seem that many of them were genuine Enlightenment men.

I’m not under any illusions that the USA was ever a secular nation, but it seems like the phenomenon we see now, where right-wingers equate America with Christianity and Christianity with America in their worldview, wasn’t always there.

Is it just the result of Cold War propaganda, juxtaposing the American empire of Christendom with the evil atheist Soviets?

  • kaka [he/him,they/them] · 4 years ago

Don't forget that most Christians around the world aren't the evangelical kind you have in the USA, which means they believe the Earth is older than 6,000 years.