• Jrockwar@feddit.uk
    ·
    13 hours ago

    Easy. Waymo specifically, because over 22 million miles they have demonstrated about 7 times fewer injury crashes than human drivers (0.4 collisions with injuries per million miles, compared to 2.78 for human drivers).
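
    That ratio is just the two rates divided; here's a minimal sanity check in Python, using only the figures quoted above (the variable names are my own):

    ```python
    # Sanity check on the "about 7 times fewer" claim, using the
    # per-million-mile injury collision rates quoted above.
    waymo_rate = 0.4    # injury collisions per million miles (Waymo)
    human_rate = 2.78   # injury collisions per million miles (human benchmark)

    ratio = human_rate / waymo_rate
    print(f"Human-to-Waymo injury crash ratio: {ratio:.2f}x")  # ~6.95, i.e. about 7x

    # Implied injury crash counts over the 22 million miles mentioned above
    miles = 22  # million miles
    print(f"Over {miles}M miles: ~{waymo_rate * miles:.0f} (Waymo) vs ~{human_rate * miles:.0f} (human)")
    ```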

    I don't work for Waymo and I have no particular interest in them "succeeding", but I have no reason to believe this data is fabricated - NHTSA keeps a tight grip on the reporting of every single incident, and even of disengagements of the autonomous system.

    These things are not a Tesla with an inexpensive sensor suite and a deceitfully marketed advanced cruise control.

    • underisk [none/use name]
      ·
      6 hours ago

      So if they have confirmed cases of people being injured by these things, who gets held legally responsible when one of them kills or injures someone, and how? If I unleash a murderous robot that only kills and injures one seventh of the people cars do, am I legally in the clear as long as he picks up a few fares during his rampage?

      • Jrockwar@feddit.uk
        ·
        4 hours ago

        I wish I could give you a solid answer, but I'm not a lawyer. My suspicion is that if Boeing could kill 346 people with their 737 MAX negligence without anyone going to jail, then

        • The people responsible for the self driving car software
        • The people in Waymo who collate the evidence for why it can go on-road
        • The legislators reviewing that evidence and approving it

        would face little to no legal consequences unless there was very obvious negligence, like what happened with Uber. In that case, it resulted in:

        • a negligent homicide charge for the person legally driving
        • suspension of autonomous vehicle testing, which led to the shuttering of the whole division

        But that was a different situation, as there was a driver "monitoring" the system. I don't know how it would pan out in a fully driverless case, to be honest.