A lot of rationalism is just an intense fear of death. Simulation hypothesis? Means that maybe you can live forever if you're lucky. Superintelligence? Means that your robot god might grant you immortality someday.
Cryonics? Means there's some microscopic chance that even if you die, you could be revived someday. Longtermism? Nothing besides maybe, someday, possibly making me immortal could possibly matter.
I mean, don't get me wrong, I'd give a lot for immortality, but I try to, uhh... stay grounded in reality.
That reminds me. If the world is about to FOOM into a kill-all-humans doomscape, why is he wasting time worrying about seed oils?