Permanently Deleted

  • Frank [he/him] · 4 months ago

    These systems aren't intelligent because they're not trying to develop a Langford basilisk to put us out of our collective misery.

    Edit: the "Langford basilisk" is a concept from science fiction: an image that for whatever reason causes damage to the human mind. Usually the conceit is that it encodes information the mind can't process, resulting in a severe seizure or similar outcome. David Langford explored the idea in some depth starting with a short story called "BLIT", which is a meditation on terrorism, weapons proliferation, hate, the dangers of rapid scientific discovery, and also a Nazi gets pwned.

    • UlyssesT · 2 months ago

      deleted by creator

      • buckykat [none/use name] · 4 months ago

        Roko's basilisk is very funny because it's just a version of Pascal's Wager where, if you think it's bullshit, god just goes "understandable, have a nice day" and only punishes you if you believe in it but don't sufficiently obsess about it.

        • UlyssesT · 2 months ago

          deleted by creator

            • UlyssesT · 2 months ago

              deleted by creator

              • buckykat [none/use name] · 4 months ago

                It does have that deistic element to it, but it's primarily solipsistic: they don't want to live in the simulated universe and accept its programmed natural laws; they want to escape the simulation because they believe it's all fundamentally unreal.

                • UlyssesT · 2 months ago

                  deleted by creator

                  • buckykat [none/use name] · 4 months ago

                    That is the underlying ideology of capitalism: unlimited growth no matter what.

        • Frank [he/him] · 4 months ago

          It's the most "what reading literally no philosophy at all and scoffing at the entire liberal arts your whole life does to an mf" thing possible.

      • UmbraVivi [he/him, she/her] · 4 months ago

        What if Roko's Basilisk has already happened and generative AI is its revenge against humanity?

        • UlyssesT · 2 months ago

          deleted by creator

      • Frank [he/him] · 4 months ago

        Very different concept. Lovecraft stuff is "ooh these cosmic higher dimensional beings are so weird they drive men mad!"

        A Langford Basilisk is based on the idea that your mind is analogous to a computer and the Basilisk image is visual data that causes an unrecoverable hard crash. There's nothing magical about the image, the problem happens when your brain tries to make sense of what it is seeing.

        • keepcarrot [she/her] · 4 months ago

          Oh, I meant the "reading something so perverse and otherworldly you go mad" thing, the source of the text notwithstanding.