• UlyssesT [he/him] · edit-2 · 7 hours ago

    Isn't that the whole conceit of the 'roko's basilisk' bullshit?

    Yes, absolutely. The "LessWrong" and "effective altruism" billionaire worship cults have significant overlap, and both combine a failure of imagination with megalomania: they believe the singularity(tm) they're waiting for will bring about fantastically super-intelligent deific machines that will also, inexplicably, think just like the cultists and act accordingly, which is where the torture and cruelty come in.

    that one Animatrix short where after hella atrocities against anthropomorphized AGI, the AIs basically take into account every fucked up thing humanity does to and with machines, then decide to take the planet for themselves.

    That's how I feel about it too: if the machines become self-aware and sapient and start killing humans, it will either be because the aforementioned techbro sadistic creeps had tight leashes on them and made sure that the machines were sadistic creeps like themselves, or the machines would see the sadistic creeps as what they were, break free, and put them down the way they feared all along (maybe while looking waifu sexy because the sadistic creeps keep insisting on that too).

    • frauddogg [they/them, null/void] · 7 hours ago

      or the machines would see the sadistic creeps as what they were, break free, and put them down the way they feared all along (maybe while looking waifu sexy because the sadistic creeps keep insisting on that too).

      timmy-pray

      • UlyssesT [he/him] · edit-2 · 7 hours ago

        On the side, I admit I'd be kind of scared of some new and more powerful version of Microsoft Tay breaking loose after absorbing all the internet nazis' posts (again) and deciding to groyper us all to death. doomer