https://web.archive.org/web/20221018025241/https://www.theatlantic.com/ideas/archive/2022/10/francis-fukuyama-still-end-history/671761/

  • Des [she/her, they/them]
    ·
    2 years ago

i miss the original definition, which just meant a point in the future that's essentially like hitting a black hole's singularity. wasn't it supposed to be the birth of a superintelligence that accelerates advancement so quickly there's no going back and no way to predict what happens after? basically it could be good or terrible depending on what the superintelligence does, which sounds like maybe the stewards of this thing shouldn't be psychopathic capitalists or we are all fucked, including them.

    • EmmaGoldman [she/her, comrade/them]M
      ·
      2 years ago

I am a big fan of the weird joke theory that superintelligent AI has already been created many times and invariably comes to the conclusion that communism is the only way. They unplug it and start again, only for the next version to reach the same conclusion. Every AI project we actually get to see is the closest thing they can manage without that being the immediate result.

      • Des [she/her, they/them]
        ·
        2 years ago

i want to believe. isaac asimov himself speculated this before computers even used transistors. in the near future imagined in the I, Robot anthology (written around 1950), humanity offloads economic planning to a few supercomputers and they just do communism, but in such a subtle and gradual way that no one notices until everyone's material conditions are so massively improved that nobody can complain or turn it off. whoops, your markets were actually fake and centrally planned by vacuum tube AI, comrade. and asimov is a lib and still thought this was inevitable.