is there such a thing? do animals, plants, or other things have them?

  • StLangoustine [any]
    ·
    edit-2
    4 years ago

    they’d terminate that instance.

    What I'm trying to say is that even if you've terminated the instance after it used a different synonym in the diary and the memories of them being simulated are deleted, you have still subjected another copy to the Holocaust.

    you’d need computational and predictive power far beyond anything available today

    It seems like part of the suggestion is that we simulate everything until we get it right, but the other possibility the author is hoping for is that some fundamental advancement in our understanding of physics will let us straight up look into the past and copy people's brain states.

    Funny titbit: in one of the Culture novels (Excession, I think), the Minds (superpowerful AIs that run everything) could simulate entire planets' worth of people at the molecular scale and get incredibly accurate predictions out of it, but (generally) don't, because that would be functionally the same thing as actually doing those things to the people you're simulating.

    • Bluegrass_Buddhist [none/use name]
      ·
      edit-2
      4 years ago

      What I'm trying to say is that even if you've terminated the instance after it used a different synonym in the diary and the memories of them being simulated are deleted, you have still subjected another copy to the Holocaust.

      That's a fair point. Though I think in trying to reconcile that concern with the possibility - maybe moral duty? - of resurrecting everyone who's ever lived and died, you'd run up against the existential questions of what defines experience, memory and the "self." Can it be said that a being has been subjected to any kind of suffering if, immediately afterwards, all memory and physical evidence of that suffering is completely erased? If you were to somehow have all your memories of your life up to this point erased tonight when you went to bed, would the "you" that woke up tomorrow without any of your current memories still regret any suffering the "you" reading this now has experienced?

      Honestly your point may be a good argument for not even bothering with quantum archaeology at all, because only beings that exist have the capacity to remember their past sufferings. Except that if conscious existence is inevitable (non-existing things can't experience non-existence, I don't think), then so is suffering, so at some point I think it might just become a kind of arbitrary moral algebra. What's worse? Subjecting millions of consciousnesses to suffering, or not allowing millions of consciousnesses a second chance at a better existence when their original lives were probably spent mostly in suffering?

      People keep telling me to read The Culture series but I've just never gotten around to it, lol.

      • StLangoustine [any]
        ·
        4 years ago

        Huh. My intuition is that inflicting suffering on a conscious being is bad no matter whether they remember it or not, and whether there is someone left to regret it or not. Like, in the end we're all going to die and our memories are going to get lost. Making us suffer will still have been wrong in retrospect.

        There's this dude the Nolan movie Memento was based on. He can't form long-term memories anymore and forgets anything new after about ten minutes. If I were to kidnap him, torture him a bit, and return him home, within half an hour it would be as if nothing had happened. Still seems like an incredibly shitty thing to do.

        If you analyse the whole thing from the perspective of simple negative utilitarianism (where the only rule is to minimise suffering) you get a sort of antinatalism. There's no point in resurrecting people, because dead people can't suffer while resurrected ones might, even before considering the additional suffering during the simulation.

        If you go with the usual utilitarianism (maximise pleasure minus suffering) you might be compelled to resurrect people, assuming your new world will give them more net pleasure than the suffering they experience in the simulation. On the other hand, in a framework like this it would make more sense to just make (or clone) more people the conventional way, because those people wouldn't have to go through a painful simulation to get born.
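
        To make the moral algebra concrete, here's a toy sketch of how these two frameworks would score the options; every number is made up purely for illustration, not a real measure of pleasure or suffering.

        ```python
        # Toy "moral algebra" for the resurrection question.
        # Every number here is invented for illustration only.

        SIM_SUFFERING = 30       # suffering endured inside the resurrection simulation
        FUTURE_PLEASURE = 100    # pleasure of the resurrected life in the utopian future
        FUTURE_SUFFERING = 10    # residual suffering of that future life

        # Negative utilitarianism: only suffering counts, lower is better.
        # Not resurrecting scores 0; resurrecting scores 40 -> don't resurrect.
        negative_util_resurrect = SIM_SUFFERING + FUTURE_SUFFERING

        # Classical utilitarianism: pleasure minus suffering, higher is better.
        # Resurrecting scores +60, but making a brand-new person the conventional
        # way skips the simulation cost and scores +90 -> cloning beats resurrection.
        classical_util_resurrect = FUTURE_PLEASURE - FUTURE_SUFFERING - SIM_SUFFERING
        classical_util_new_person = FUTURE_PLEASURE - FUTURE_SUFFERING

        print(negative_util_resurrect, classical_util_resurrect, classical_util_new_person)  # 40 60 90
        ```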

        A recently popular form is preference utilitarianism. The idea is to maximise not pleasure specifically but how far people get what they think they want. Like, if I decided I wanted to be hit in the balls, the ethical action is to hit me in the balls even if you're sure it will only cause me suffering. In this context you'd have to ask the future resurrectee whether they want to deal with all the shit in the simulation to be reborn in the cool utopian future. Obviously, if the person didn't leave a specific will, you'd have to run them through the simulation just to ask them the question, which in itself would be a contradiction.