• invalidusernamelol [he/him]M
    ·
    3 years ago

    It's not self driving though, you see him consistently interfering with it throughout the whole video. This is a simple "stay between the lines" machine and I'd be surprised if Tesla didn't explicitly use this road as a testing case because multiple people have taken their cars on it since the self driving feature came out.

    We've seen that the FSD is anything but, it's full of issues and will happily kill both the passenger and pedestrians. This video fails to carefully explain the difference between marketing and reality and chooses to entirely focus on parroting the marketing because doing so will bump you up in YouTube's algorithms.

    • drinkinglakewater [he/him]
      ·
      3 years ago

      Out of morbid curiosity, are there any confirmed instances of fatalities or injuries due to people using FSD?

      • Tomboys_are_Cute [he/him, comrade/them]
        ·
        3 years ago

        https://www.tesladeaths.com/

        Literally the best website I've found for it. Everyone who dies in one of these cars eventually gets on this spreadsheet.

        • drinkinglakewater [he/him]
          ·
          3 years ago

          Christ, it's very bleak seeing all those people tracked in a spreadsheet. Looks like a good resource though, thank you for sharing

          • Tomboys_are_Cute [he/him, comrade/them]
            ·
            3 years ago

            It is bleak. They even comply with Tesla's extremely strict definition of "Autopilot-at-Fault": no hands on the wheel or gears, no feet on pedals, no contact with controls at all. Despite their 100% bullshit definition of Autopilot-at-Fault, it's still that many.

            • invalidusernamelol [he/him]M
              ·
              3 years ago

              Autopilot is definitely an issue, but I think the biggest issue is the number of "vehicle erupts into flames" incidents

        • TankieTanuki [he/him]
          ·
          edit-2
          3 years ago

          Has Elon cried and talked smack about that website yet, or does he ignore it? Someone has got to have tweeted it at him.

        • ClimateChangeAnxiety [he/him, they/them]
          ·
          3 years ago

          I’m curious, what’s the ratio of autopilot deaths to miles driven? Because as much as we rightfully talk shit about self driving cars, especially teslas, humans are notoriously fucking awful at driving cars, and even if the autopilot fucks up and kills people sometimes, it’s good as long as it kills less people per amount of driving.

          But also train good car bad. :train-shining:
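
          The per-mile comparison being asked about can be sketched like this. Note that all the numbers below are made up for illustration (they are not real Tesla or NHTSA statistics), and a raw comparison like this is confounded anyway, since Autopilot miles skew toward easy highway driving:

          ```python
          def deaths_per_100m_miles(deaths: int, miles: float) -> float:
              """Normalize fatality counts to deaths per 100 million miles driven,
              the unit commonly used for road-safety statistics."""
              return deaths / miles * 100_000_000

          # Hypothetical inputs, for illustration only:
          human_rate = deaths_per_100m_miles(deaths=36_000, miles=3_000_000_000_000)
          autopilot_rate = deaths_per_100m_miles(deaths=30, miles=2_000_000_000)

          print(f"human:     {human_rate:.2f} deaths per 100M miles")
          print(f"autopilot: {autopilot_rate:.2f} deaths per 100M miles")
          ```

          With these invented figures the autopilot rate comes out higher, but flipping the inputs flips the conclusion; the point is only that the comparison has to be made per mile, not in absolute death counts.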

          • invalidusernamelol [he/him]M
            ·
            3 years ago

            Not really, because at least the mistakes are at the hands of the people. Moving it over to machines will open the door for actuarial calculations and insurance companies controlling the ethical decisions an automated driver makes.

            • ClimateChangeAnxiety [he/him, they/them]
              ·
              3 years ago

              I’ll be honest I don’t super care about the ultimate decisions behind why people are being killed by cars, I just care that less people get killed by cars.

                • ClimateChangeAnxiety [he/him, they/them]
                  ·
                  3 years ago

                  The best way to do that is with less cars. If we’re stuck on having the same amount of cars, which unfortunately we seem to be, if the self driving ones kill less people I want to be using those ones.

                  But again, train good, car bad. If I had unilateral power it would be Climate Stalin, and we’re banning new cars day 1.

      • invalidusernamelol [he/him]M
        ·
        3 years ago

        I'm not totally sure. There are a ton of Tesla fatalities, but obviously Tesla keeps the data hush hush when they can. There have been dozens of tests people have done showing situations, often very common ones, that trip it up and can cause it to put everyone at risk.

        Construction signage, bad paint, damaged road signs, all can cause it to just suddenly decide to kill you and others around it.

        • drinkinglakewater [he/him]
          ·
          edit-2
          3 years ago

          Yeah I've seen a few of those videos, it definitely doesn't seem safe. I guess to the credit of the Tesla freaks testing it, they do know to intervene, while people who aren't Tesla freaks are probably more likely to be caught unaware of the shortcomings of the software.

          I remember seeing something about Tesla offering to cover issues not covered by the warranty in exchange for people signing NDAs which is probably happening for FSD related incidents too.

    • BynarsAreOk [none/use name]
      ·
      edit-2
      3 years ago

      It’s not self driving though, you see him consistently interfering with it throughout the whole video.

      Are you referring to his hand position? If you are really scared you should definitely have your hands on the wheel anyway so you can react faster.

      He didn't disengage, though. You can see the blue autopilot icon at the top of the screen, and it stays on all the way.

      This is a simple “stay between the lines” machine and I’d be surprised if Tesla didn’t explicitly use this road as a testing case because multiple people have taken their cars on it since the self driving feature came out.

      This is correct, and the demonstration isn't that impressive. It was on a clear day with perfect visibility and perfect road conditions, with no requirement to read signs, cross intersections, perform overtakes, etc.

      Also, this particular road seems to be relatively well maintained, with very visible lines. Despite the challenging turns, this is the perfect use case Tesla dreams about.