• The US is among countries arguing against new laws to regulate AI-controlled killer drones.
  • The US, China, and others are developing so-called "killer robots."
  • Critics are concerned about the development of machines that can decide to take human lives.

In a speech in August, US Deputy Secretary of Defense Kathleen Hicks said technology like AI-controlled drone swarms would enable the US to offset the numerical advantage in weapons and personnel held by China's People's Liberation Army (PLA).

"We'll counter the PLA's mass with mass of our own, but ours will be harder to plan for, harder to hit, harder to beat," she said, reported Reuters.

Frank Kendall, the Air Force secretary, told The Times that AI drones will need to be capable of making lethal decisions while under human supervision.

"Individual decisions versus not doing individual decisions is the difference between winning and losing — and you're not going to lose," he said.

"I don't think people we would be up against would do that, and it would give them a huge advantage if we put that limitation on ourselves."

  • WayeeCool [comrade/them] · 7 months ago

    sounds good, no issue here, might as well equip a few with nukes, it'll be fine

    I know you are joking... but this is what Northrop Grumman originally proposed for the B-21 Raider: an unmanned, nuclear-armed, long-range stealth bomber. Luckily someone in the US Air Force was able to restore sanity before the B-21 Raider program was finalized. Instead we are going to get so-called Loyal Wingmen like the MQ-28 Ghost Bat and XQ-58 Valkyrie, fully automated multirole strike fighters with the flight characteristics and armament of an F-16.

    so a drone takes off from some base somewhere, loses wifi for a second, then starts dropping ordnance on whoever is closest?

    Not far off tbh, which is why I don't like that this is where the US is headed. It won't be randomly firing and dropping munitions, but it will be doing so with the decision-making accuracy of "artificial intelligence". So, something like 96% accuracy at differentiating a civilian airliner from a hostile enemy jet, or an elementary school from a military base. When cut off from the remote human operators, the mission will go on.

    Militaries have a thing for building fail-deadly into their doctrines.