• drinkinglakewater [he/him]
    ·
    3 years ago

    Out of morbid curiosity, are there any confirmed instances of fatalities or injuries due to people using FSD?

    • Tomboys_are_Cute [he/him, comrade/them]
      ·
      3 years ago

      https://www.tesladeaths.com/

      Literally the best website I've found for it. Everyone who dies in one of these cars eventually gets on this spreadsheet.

      • drinkinglakewater [he/him]
        ·
        3 years ago

        Christ, it's very bleak seeing all those people tracked in a spreadsheet. Looks like a good resource though, thank you for sharing

        • Tomboys_are_Cute [he/him, comrade/them]
          ·
          3 years ago

          It is bleak. They even comply with Tesla's extremely strict definition of "Autopilot-at-Fault": no hands on the wheel or gear selector, no feet on the pedals, no contact with the controls at all. Despite that 100% bullshit definition of Autopilot-at-Fault, it's still that many.

          • invalidusernamelol [he/him]M
            ·
            3 years ago

            Autopilot is definitely an issue, but I think the biggest issue is the number of "vehicle erupts into flames" incidents

      • TankieTanuki [he/him]
        ·
        edit-2
        3 years ago

        Has Elon cried and talked smack about that website yet, or does he ignore it? Someone has got to have tweeted it at him.

      • ClimateChangeAnxiety [he/him, they/them]
        ·
        3 years ago

        I’m curious, what’s the ratio of autopilot deaths to miles driven? Because as much as we rightfully talk shit about self-driving cars, especially Teslas, humans are notoriously fucking awful at driving cars, and even if the autopilot fucks up and kills people sometimes, it’s good as long as it kills fewer people per mile driven.

        But also train good car bad. :train-shining:

        • invalidusernamelol [he/him]M
          ·
          3 years ago

          Not really, because at least the mistakes are at the hands of the people. Moving it over to machines will open the door for actuarial calculations and insurance companies controlling the ethical decisions an automated driver makes.

          • ClimateChangeAnxiety [he/him, they/them]
            ·
            3 years ago

            I’ll be honest, I don’t super care about the ultimate decisions behind why people are being killed by cars, I just care that fewer people get killed by cars.

              • ClimateChangeAnxiety [he/him, they/them]
                ·
                3 years ago

                The best way to do that is with fewer cars. If we’re stuck on having the same number of cars, which unfortunately we seem to be, and the self-driving ones kill fewer people, I want to be using those ones.

                But again, train good, car bad. If I had unilateral power it would be Climate Stalin, and we’re banning new cars day 1.

    • invalidusernamelol [he/him]M
      ·
      3 years ago

      I'm not totally sure. There are a ton of Tesla fatalities, but obviously Tesla keeps the data hush-hush when it can. There have been dozens of tests people have done showing common situations that trip it up and can put everyone at risk.

      Construction signage, bad paint, and damaged road signs can all cause it to just suddenly decide to kill you and everyone around it.

      • drinkinglakewater [he/him]
        ·
        edit-2
        3 years ago

        Yeah, I've seen a few of those videos, and it definitely doesn't seem safe. I guess to the credit of the Tesla freaks testing it, they do know to intervene, while people who aren't Tesla freaks are probably more likely to be caught unaware by the shortcomings of the software.

        I remember seeing something about Tesla offering to cover issues not covered by the warranty in exchange for people signing NDAs, which is probably happening for FSD-related incidents too.