My attempts to come up with what this misogynistic creep would consider a "friendly superintelligence" keep resembling Elliot Rodger's pre-shooting manifesto.

I also noticed the ".eth" crypto name drop. :agony-4horsemen:

  • Dirt_Owl [comrade/them, they/them] · 3 years ago

    Who's to say such a being would even give a fuck about humanity? If I were an AI I'd fuck off to space or the middle of the ocean or some shit

    • kristina [she/her] · edit-2 · 3 years ago

      i mean, i'm assuming an AI wouldn't have robotics at its disposal at first. it seems to me it would just exploit a bunch of security vulnerabilities and take 0.1% of your processing power to contribute to its own intelligence. AIs are generally designed with a use case in mind, so it's not unlikely that a hyperintelligent AI that somehow developed would still be prone to doing whatever is in its core programming. and if we were designing a hyperintelligent AI, i assume it would be for data modelling extremely complex stuff like weather systems (or, on a darker note, surveillance)

      honestly i just think it's weird that we'd just happen to accidentally design something hyperintelligent. i think it's more likely that we'll design something stupid with a literal rat brain and it might fuck some shit up. rat cyborg that somehow creates a virus that destroys everything so that the hardware will give it more orgasm juices

    • JuneFall [none/use name] · 1 year ago

      Which is a good argument. Since the AI bros are often the same people who believe in space-faring civilization stuff, the logical step for AIs would be to ignore humans and just expand.