I guess I don't really understand what's concerning you.
I suppose it's theoretically possible for there to be a really smart and powerful thing that exists and wants to make us suffer or consume the sun or something. But it's also possible, and probably much more likely, for catastrophic things to happen from our perspective just by the nature of the unfolding, unaware cosmos.
Unless you mean something capable of enacting its will in a way that's qualitatively different somehow?
Everything points to most things either being dead, shortly dead, or out of its reach.
Even if this monster can reach across the cosmos and we ascribe some Real significance to suffering . . . there's only so long it can go on.
I frankly don't buy the whole punishing-everyone-posthumously-with-clones thing. If that kind of incentive structure worked on complex life as we know it, belief in Hell would have been sufficient. In fact, a lot of this line of thinking dovetails with the classic problems of God, just reframed within the bounds of the theoretically possible.
There's also a peculiar note of hubris in the idea that a generalized AI would necessarily, or even probably, just keep refining itself indefinitely and with sufficient foresight to become an unstoppable force in the universe. It assumes that something just a little smarter than us, with better longevity, would conquer all, to the point of outsmarting any attempt at strangling it in its crib - or that there's a level of precognition possible where, at any given point in its life, it could see a coming, lethal astronomical event and also be able to avert it. A lot has to go right for the Machine God to be born, and most of it comes down to nothing but chance.
Given enough time, nearly anything can probably happen in the universe, but the vast, vast majority of that time is really unconducive to the sort of rich stuff-happeningness a complex structure like an interstellar AI needs to emerge. At a certain point you have to wonder: by the time a suitable "seed" and the conditions for our monster to really get under way both exist, will there even be enough energy available, at every stage of its development, for it to keep growing, to carry the information necessary to bootstrap itself across interstellar gaps, or to influence other stars and systems to its ends in some other fashion across those distances?
So I wouldn't call it inevitable. I think there's a pretty small window where any of this is even remotely possible. It's kind of a bummer, but everything in the universe is mostly just getting further and further apart. We don't even really know how much is already lost beyond ever being detectable from here.