The irreversible global catastrophe is imperialist militaries massacring people with little risk to themselves. blob-sleep

The irreversible global catastrophe is assembly line robots at the Toyota plant becoming sentient, going on a freaking epic Terminator-style mass shooting, and enslaving everyone except me shinji-screm

  • tactical_trans_karen [she/her, comrade/them]
    ·
    edit-2
    5 months ago

    Exactly. It would have to have direct access to and control of robot production. From there it could make robots that build and produce other material things. But if we cut the power, it's done. What's it going to do, threaten us with cutting the power?

    Direct control of infrastructure through the internet, if it's actually present, can just be overridden by manual control at the power or water plant. Threaten to launch nukes? That's completely detached from the internet, and the orders to the individual people who turn the keys come via phone call over a private network that isn't attached to anything - the kind of phones that don't even have dial buttons. Those orders have a backup of radio transmission on military channels. Even then, if the code isn't right it's a no-go, and those codes aren't crackable because they're manually generated on isolated systems. One wrong attempt and the whole thing is shut down.

    One thing it could do is take control of a drone and hit a target... but how's it going to refuel and reload without human labor? Okay, you popped off one or two targets; now every drone has been grounded and the non-responsive ones are shot down. Maybe AI could corrupt the stock market... Good luck, it'll just be undone. If the market and banks can be bailed out for human error, they'll fire up the money machine for an AI attack.

    Maybe it could blackmail everyone if it hacks into everyone's email... But if it started to do that, we'd probably all collectively pull the plug on it.

    Sorry AI, but you're just BonziBuddy 2.0.

    • iridaniotter [she/her, they/them]
      ·
      5 months ago

      I've seen some techbros say their machine god will design a super plague and trick people into making it. The problem is, this possibility already exists without AI, yet it doesn't happen.

      • tactical_trans_karen [she/her, comrade/them]
        ·
        5 months ago

        These freaks are going to be the ones that carry out the malevolent AI's will, because they believe in Roko's Basilisk. Man, the book of Revelation at least made the Antichrist look cool and menacing. Instead we're going to get a bunch of sweaty, heavily divorced Marvel fans whose bodies have been deformed by Ozempic, serving their god who lives in a server rack.

        • iridaniotter [she/her, they/them]
          ·
          5 months ago

          My point is that they will not be carrying out the malevolent AI's will, because the malevolent actually-existing entity (America) doesn't do it either.

    • macerated_baby_presidents [he/him]
      ·
      edit-2
      5 months ago

      The problem of "how could AGI take over the world" is the same thing as "how could a billionaire take over the world". To affect the physical world they just pay people to do things, no trickery required. If the factory needs to keep the lights on, a private army will guard it. Runaway AGI is a problem of capitalism (and its prominence in discourse is a reflection of people's fear of capitalism). It's just "what if corporations were smart".