Permanently Deleted

    • 0karin728 [any]
      ·
      3 years ago

      Basically, but I think it's even dumber. It's like Pascal's wager if humans programmed God first.

      The idea is that an AI will be created and given the directive to "maximize human well-being", whatever the fuck that means, without any caveats or elaboration. According to these people, such an AI would be so effective at improving human quality of life that the most moral thing anyone could do before its construction is anything in their power to ensure that it is constructed as quickly as possible.

      To incentivise this, the AI tortures everyone who knew about Roko's Basilisk but didn't help build it, since the threat only works as motivation to help build the AI if you know about it.

      This is dumb as fuck because no one would ever build an AGI that sophisticated and then only give it a single one-sentence command that could easily be interpreted in ways we wouldn't like. Also, even if an AI like that somehow DID manage to exist, it makes no sense for it to actually torture anyone, because whether it does or not doesn't affect the past and can't get it built any sooner.