The company left out some key details regarding the incident involving one of its robotaxis and a pedestrian.


On October 2, 2023, a woman was run over and pinned to the ground by a Cruise robotaxi. Given the recent string of very public malfunctions the robotaxis have been experiencing in San Francisco, it was only a matter of time until a pedestrian was hurt by the self-driving cars. New reports, though, suggest that Cruise held back one of the most horrifying pieces of information: that the woman was dragged 20 feet by the robotaxi after being pushed into its path.

The LA Times reports:

A car with a human behind the wheel hit a woman who was crossing the street against a red light at the intersection of 5th and Market Streets. The pedestrian slid over the hood and into the path of a Cruise robotaxi, with no human driver. She was pinned under the car, and was taken to a hospital.

But this is what Cruise left out:

What Cruise did not say, and what the DMV revealed Tuesday, is that after sitting still for an unspecified period of time, the robotaxi began moving forward at about 7 mph, dragging the woman with it for 20 feet.

read more: https://jalopnik.com/woman-hit-by-cruise-robotaxi-was-dragged-20-feet-1850963884

archive link: https://archive.ph/8ENHu

  • usernamesaredifficul [he/him]
    ·
    8 months ago

    Self-driving cars are the answer to the question: what if public transport was expensive and dangerous?

  • Frogmanfromlake [none/use name]
    ·
    8 months ago

    Reddit techbros insisted that this was the future and that I was being a Luddite for saying that something like this would happen all too often.

  • glibg10b@lemmy.ml
    ·
    8 months ago

    The pedestrian slid over the hood and into the path of a Cruise robotaxi, with no human driver. She was pinned under the car, and was taken to a hospital.

    That's one way to do it.

  • barrbaric [he/him]
    ·
    8 months ago

    Anyone who legalizes fully driverless vehicles should be forced to use them for the rest of their life.

  • TheEgoBot@lemmygrad.ml
    ·
    8 months ago

    I worked in this industry as a safety driver and in various other positions for over 6 years in AZ, first with a company that also made national headlines, and then with the company that has connections with a certain search engine. From the inside it's easy to see these outcomes happening more and more frequently. These companies are concerned with getting as many driverless miles as they can, because that's where their data comes from; the data is where the money comes from, and that's all that matters. Drivers are typically subcontracted and forced to work long, stressful hours with few breaks. Safety is emphasized, but even dealing with fatigue in the appropriate ways can lead to disciplinary action if you're fatigued too often, so it goes unreported. I left the company specifically over safety concerns, and despite making double there what I make now, I won't be going back.

  • HowMany@lemmy.ml
    ·
    8 months ago

    Still working out some of the bugs. Not to worry. Not many of you will have to die in order for us to get the software right.

  • IvanOverdrive@lemm.ee
    ·
    8 months ago

    My vision of self driving cars was of an integrated system where all the parts weave together to create a safer and faster environment. But self driving cars are just not able to deal with the edge cases that will pop up. Even that would be okay, but GM tried to cover up this horrific accident. That inspires the opposite of trust. I gotta wonder how many other incidents have been covered up. GM is a company with limited resources. Alphabet, the parent company of Waymo, has a virtually infinite budget. How many incidents have they hidden from the public eye?

    • Omegamint [comrade/them, doe/deer]
      ·
      edit-2
      8 months ago

      Anyone with even a hobby level of coding knowledge knows that solving the edge cases is the real issue in fixing software problems. In this case I wouldn't be surprised if automation reaches similar or lower rates of traffic incidents, but the really shitty part is gonna be how much harder it is to get justice from a large corporate entity owning these robotaxi fleets versus nailing the little guy driving his Uber/taxi.

  • selokichtli@lemmy.ml
    ·
    8 months ago

    These cars should be monitored by human beings until their AI evolves enough to be actually safer than law-abiding professional human drivers. If a human had been monitoring the car, they probably could have stopped it immediately, or even held it back before it started dragging that poor woman.

    Only if these cars can do the same or better than the humans overseeing their activity will they be safe enough to offer a public service. Also, as shameful as it may be, this incident should get the widest publicity possible, because other competitors should test their AI against this specific situation as soon as possible.