https://twitter.com/thesonofbran/status/1756833073734275489

  • LGOrcStreetSamurai [he/him]
    ·
    edit-2
    11 months ago

    I legitimately don’t even understand what the “American Christian™®©” even remotely believe anymore. It’s been on the decline for a while, but I can’t even name what they pretend to believe in anymore. Everything that is “woke” has like a 99.99% chance Jesus would be cool with it. Especially “love your neighbor as yourself” (Mark 12:31), which is pretty basic and not that complex (and also rad: love yourself and love everyone). He’s saying “be cool,” and that’s “woke” now.

    I am genuinely curious what they claim their “faith” means. Are there any principles, virtues, or morals they would say they believe with their whole chest? What even is the “American Church ⛪️”? Is it just pure raw prosperity gospel? Is Jesus my CEO now?

    • kot
      ·
      edit-2
      5 months ago

      deleted by creator

    • RyanGosling [none/use name]
      ·
      edit-2
      11 months ago

      American Christianity is white Jesus being born from a virgin teenager from East Palestine, Ohio. He supported small business owners and did a citizen’s arrest on a lazy bum who smashed their tables out of jealousy (likely a socialist). He was executed by illegal immigrants and Jews, but before he died, he prayed to The Father to kill two shoplifters to show how serious humanity’s sins are.

    • Frank [he/him]
      ·
      11 months ago
      • Material wealth is good

      • the voice in my head is god so every thought i have is divinely inspired

      • all my hatreds and petty bigotry are divinely mandated

      • i am a good person

      • anyone who is different from me is not only a bad person but literally a satan worshipper

      There's no meaningful theology, no real dogma. Just tens of millions of people running on pure libidinal id they think is God's will, with all the horror that comes with that.