Not to defend Elon, but how many is that really? Cars are ridiculously unsafe and kill thousands of people every year; 736 crashes since 2019 is nothing.
Just wanted to add to the discussion: Tesla's definition of a "crash" is different from what everybody else uses. Tesla only counts incidents where an airbag deployed as a crash, while most other agencies count any incident where a police report or insurance claim was made. This article talks about all of the problems with Tesla's self-reported safety data - those numbers are absolutely cherry-picked to paint the rosiest picture possible.
:my-hero: "Umm, actually, it was not the crash that killed the victim, but the resulting electrical fire. Calling them a 'crash victim' seems a little unfair, let's put this one on the fire department."
Yeah, what I really want is miles driven per crash, compared between Teslas and other cars.
It's not really the frequency (even though it's not really a tiny number by any means) but it often fucks up in situations where an average sober driver wouldn't fuck up in a million years, like straight up running into the backs of trucks. It's not like it only fucks up in weird corner cases that happen once in a million, it fucks up in really obvious situations.
The number of such crashes has surged over the past four years, the data shows, reflecting the hazards associated with increasingly widespread use of Tesla’s futuristic driver-assistance technology as well as the growing presence of the cars on the nation’s roadways.
The number of deaths and serious injuries associated with Autopilot also has grown significantly, the data shows. When authorities first released a partial accounting of accidents involving Autopilot in June 2022, they counted only three deaths definitively linked to the technology. The most recent data includes at least 17 fatal incidents, 11 of them since last May, and five serious injuries.
Well yeah, if the technology becomes more widespread, there will be more crashes. I don't see how this indicates that Tesla autopilot specifically is more dangerous than the average 5 tons of steel piloted by drunk drivers texting at 80 miles per hour.
It's not replacing regular driving, it's massacring people while they test it out briefly.
After the engineer leaked footage of Teslas driving outside the core training region, there were plenty of problems. The rate of accidents and dangerous situations outside that core training region was quite high. That region excludes pretty much every city center on Earth, public transportation, greenery near roads, bikes, kids, and road work zones, as well as the different kinds of bollards in use around the world.
for a single car brand that only rich people can afford, that has waiting lists to be able to even buy one?
if it was like ford, fiat, or renault then yeah, that would be low, but tesla?

There were 43,000 deaths caused by car accidents in the US in 2021 alone. 17 fatalities since 2019 is nothing.
if you only sell a thousand cars per year, that's a ridiculous failure rate

edit: huh, they sold a lot more cars in the last few years than i thought, disregard
Tesla sold 1.2 million cars in 2022, 900,000 in 2021, and 440,000 in 2020.
I believe that's worldwide. I can't find statistics for the US for 2022 and stuff, but for 2021 about 300k were sold in America, so all those stats are probably around a third of that.
Also, afaik most Tesla drivers don't use Autopilot constantly, and Tesla tells you it's not supposed to replace driving so they don't get sued; there's no fully self-driving car on the market.
Yeah, that's the thing, I don't have the data either and neither does this article. Elon and Tesla can both get thrown into the blackest pits for all I care, but this headline is acting like I should be shocked at these numbers while giving 0 context for us to figure out what they mean.
Is this really because of the Tesla autopilot, or is it just another day of the car industry demanding its regularly scheduled human sacrifices? What is actually the problem here? I don't wanna blindly get mad at the wrong thing just because I don't like the guy behind it.
You don't need to convince me that Teslas suck. I'm willing to buy that Elon being a fraudster and Tesla being one entire grift are strong indicators that safety and reliability likely weren't great concerns when creating the autopilot.
My issue is how this is all being argued, because there should be proof readily available. We should have numbers on how many people are using the autopilot relative to how many people aren't, and we should be able to draw statistical conclusions from that. We shouldn't have to say "Tesla's manufacturing is bad so their autopilot is probably also bad", there should be more concrete evidence in this article of the autopilot sucking independently.
No, my default assumption is that the Tesla autopilot probably sucks because everything Elon touches sucks. That was my initial assumption and is still my current assumption as well. But that's all it is right now, and this article doesn't confirm my assumption in a satisfactory way, that is my issue with it. I would like to read this article and go "Ha, I knew it!" but with what little conclusive evidence it provides, it would be disingenuous of me to do so.
It is worth noting that Tesla was #1 at 736 crashes since 2019 and Subaru was #2 with 23 in the same time period. To your overall point this is still fairly formless since we don't have information on how many cars were using self-driving / who was at fault / total hours used or w/e, but it still seems indicative.
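To make the "we'd need to normalize these numbers" point concrete, here's a back-of-the-envelope sketch using only figures mentioned in this thread (736 Tesla crashes, 23 Subaru crashes, Tesla's 2020–2022 sales). The Subaru fleet size is a made-up placeholder, and this ignores miles driven, hours of driver-assist use, and fault, none of which we have, so it's illustrative, not conclusive:

```python
# Back-of-the-envelope crash-rate comparison using figures from this thread.
# Caveats: we don't know how many cars actually had driver assistance
# enabled, who was at fault, or miles driven -- illustrative only.

tesla_crashes = 736   # reported driver-assist crashes since 2019 (from thread)
subaru_crashes = 23   # reported in the same period (from thread)

# Tesla worldwide sales mentioned upthread (2020-2022), as a rough fleet proxy.
tesla_fleet = 440_000 + 900_000 + 1_200_000

# Hypothetical Subaru fleet size -- NOT from the article; substitute a real
# figure to make the comparison meaningful.
subaru_fleet = 2_500_000

def crashes_per_100k(crashes: int, fleet: int) -> float:
    """Crashes per 100,000 vehicles, ignoring usage hours and miles."""
    return crashes / fleet * 100_000

print(f"Tesla:  {crashes_per_100k(tesla_crashes, tesla_fleet):.2f} per 100k cars")
print(f"Subaru: {crashes_per_100k(subaru_crashes, subaru_fleet):.2f} per 100k cars")
```

Even this crude per-vehicle rate would change the headline number a lot, which is exactly the context the article doesn't give us.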
Definitely possible, Tesla is well into "too big to fail" territory. Trust me that when the guy in the driver's seat tells me he's gonna activate the self-driving feature I'm jumping out of a moving Tesla :big-cool:
Yeah if it actually is a low crash rate that's great, but Tesla needs to be held liable for every crash and death
Tesla’s 17 fatal crashes reveal distinct patterns, The Post found: Four involved a motorcycle. Another involved an emergency vehicle.
They don't elaborate on these "patterns" at all after this. J o u r n a l i s m
If only :my-hero: was liable for and convicted for all this shit. :sicko-wistful:
From what I understand, they built the self-driving system with as close to pure "AI" as they could, in places where simpler solutions using redundant off-the-shelf sensors would have worked way better.
A proper society would employ automated light metros and automated guideway transit instead of this trash :sicko-wistful:
This is one of the most cyberpunk news stories I think I've ever read. Computers taking over aspects of life unimaginable not long ago, major corporations acting without regard for consequences, totally ineffectual government unable to do anything about the problem (in this case, practically refusing to: "drivers are fully liable" says the NHTSA). All the themes of cyberpunk with none of the cool gadgets.
I was expecting it to be way more, to be honest. For a car that's held together with duct tape, this is pretty good. Probably safer than letting me drive.