Permanently Deleted

    • usernamesaredifficul [he/him]
      ·
      2 years ago

There isn't a ghost; this is probably a weird response to some gore in the training data. It's interesting how it happened, but the problem is statistical in nature.

The image generators can't look at images and feel emotion; they can only associate them with the names their statistics generate, as they learned to do from their training.

    • commenter [none/use name]
      ·
      2 years ago

I'd never heard of this, but damn, that's pretty cool. Time to find a job helping create AI so life can become easier, I guess.

      • nat_turner_overdrive [he/him]
        ·
        2 years ago

Roko's basilisk is just some up-his-own-ass nerd re-creating Pascal's wager and thinking he invented something new.

          • nat_turner_overdrive [he/him]
            ·
            edit-2
            2 years ago

The basic concept in both is: what if a completely unprovable thing exists? If it does, it would behoove you to act as though it were real. Pascal's wager is "what if god exists?"; Roko's basilisk is "what if god exists, but with blinky LED lights?"

            Either way, completely useless. If you want me to believe in something, convince me of it, don't try to convince me to pretend to believe in it. I'm not hedging on metaphysical bets without evidence.

            • TrashGoblin [he/him, they/them]
              ·
              2 years ago

Mostly accurate, but the thing you're not mentioning is that both Pascal and Roko scared themselves with Big Numbers. In both cases, the cost of acting as a true believer is manageable and human-scale, but the cost for the unbeliever is near-infinite. So (their thinking goes), even if the existence of God or the omnipotent singleton AI is very, very unlikely, the rational thing to do is to behave as if they did exist.

              Now, to an outsider, it's clear that you can imagine an infinity of mutually contradictory infinite threats, which makes these arguments totally bogus. But if you are already a true believer, you discount the other threats.

              • Frank [he/him, he/him]
                ·
                2 years ago

                Personally I chose to take Cthulhu's Wager seriously and went completely, irrevocably insane. IA! IA!

            • commenter [none/use name]
              ·
              edit-2
              2 years ago

              I'm not trying to argue about the concept, just saying that when you look at these things with a reductive lens, you can make anything the same.

The novel part of Roko’s basilisk is the time-loop component, where the AI doesn't exist in the present but exists in the future and has the ability to manipulate events to bring about its own existence. It's kind of a dumb theory, but whatever. It makes for a better movie than a TOE.

              • nat_turner_overdrive [he/him]
                ·
                2 years ago

I guess I don't see how the novel part is particularly novel; it's just the shoehorn needed to turn "what if god could damn me to hell" into "what if a future AI could damn me to hell."

                • commenter [none/use name]
                  ·
                  2 years ago

Because it isn't metaphysical; it's science fiction/speculation, and the "hell" isn't another place/plane of reality. I guess this conversation could go in the direction that quantum theories allude to the same unexplained phenomena as religions and that the two could eventually meet. None of this is particularly interesting to me, honestly, so have a good night.

      • Frank [he/him, he/him]
        ·
        2 years ago

NO. Bad. Your internet privileges have been revoked until you read all of Saint Augustine to learn how banal and ridiculous this is.

    • Frank [he/him, he/him]
      ·
      2 years ago

I would strongly suggest you don't. Generations of Christian theologian pedants didn't waste their lives arguing about how many angels could dance on the head of a pin for us to start taking bs like that seriously again.

      • zifnab25 [he/him, any]
        ·
        edit-2
        2 years ago

The joke of Roko's Basilisk is how quickly it becomes a self-fulfilling prophecy. The nut of the idea isn't even unique to AI. It's just a new twist on the old "we have to kill them before they kill us" theme.

As soon as you extrapolate the idea out to rival populations, you're not just dealing with Roko's Basilisk but with "China's Basilisk versus America's Basilisk," with the subtext that one of us has to build it first before the other unleashes it on us. It's in the same vein as turn-of-the-20th-century racists insisting that White Slavery is just around the corner if black people get too rich or too well-enfranchised. Or anti-migrant xenophobes who believe The Illegals Are Stealing Our Jobs. Or the drug warriors who insist cartels will take over the country if we're not constantly fighting to criminalize drug use. Or the Nuclear Arms Race.

Roko's Basilisk is another incarnation of the proto-Hobbesian belief in a war of All Against All. It isn't something we will build so much as something we've been building, in various flavors and styles, since the nation was founded.

        • Frank [he/him, he/him]
          ·
          2 years ago

This is a big part of the metaplot of Eclipse Phase. No one is entirely sure what caused the TITAN AIs to go rogue and bring about the Fall and the destruction of 90% of transhumanity, but one of the theories is that the USA's AIs were given free rein to self-improve in order to counter China's super-AI project, and things went very, very badly.