"Self-driving-ish" 😂
cruise control with a higher fatality rate
Booze cruise control.
I'm stealing this... way too funny
There's literally one single death linked to FSD and even in that case the driver was drunk.
Only one death where FSD was active. But FSD disables itself automatically as soon as it predicts an imminent collision. It literally just goes “Jesus take the wheel” and turns itself off. So Musk fanboys like you can make that exact “only one death related to FSD” argument, because Tesla absolves itself of responsibility as soon as the collision is expected.
But like… what’s the real number?
Because on the flip side of the Elon fanbois are those going full monkey brain and treating any risk at all in the new thing as proof that the new thing is worse than the old thing.
Cuz there’s a possibility that “FSD” may do some stupid shit from time to time (plenty of evidence on YouTube) but still be overall safer than a human driver. It’s just that monkey brain says we should spend trillions on fighting terrorism when heart disease is orders of magnitude more likely to kill you.
The real numbers are that in 2023, Teslas had more crashes per car than any other automaker: 23.54 crashes per 1,000 cars. The next highest was Ram, at 22.76 per 1,000, and third was Subaru, at 20.90. Those were the only three automakers over 20, and Tesla was clearly above all the others by a fairly large margin.
Then when we look at incident rates, Tesla comes in second. “Incidents” also include things like speeding, DUIs, reckless driving, and other citations. Ram came in first at 32.90 per 1,000, with Tesla second at 31.13. So Ram owners are more likely to get pulled over and cited, but less likely to crash.
In 2023, 322 frontal wrecks (that is, wrecks where the Tesla ran directly into something) were known to have happened immediately after the Tesla disengaged its self-driving.
… and immediately slows down. I tried it during this month’s free trial and did have one disengagement: it warned that a camera might be obscured, disengaged, and immediately slowed within the lane. I didn’t let it run its course, but I fully believe it would have stopped the car. I’m not sure what else you’d want it to do if it got confused.
I don't see why that was necessary
That's what happens: the system disengages before the accident, so then it's on the driver, not FSD.
lmao that's hilarious. Oh shit, I'm about to run that guy over, I abdicate all responsibility and it's your fault! Criminals everywhere are taking notes.
Yeah.
It feels like something from Trailer Park Boys. Ricky is driving drunk, but right before he gets pulled over he makes Jacob or Corey take the wheel because then he's not driving so he won't get in trouble.
I thought Congress reacted to this by passing a law that states a self-driving system is at least partially responsible if it was in use up to something like 30 seconds before a crash…
That's interesting if it really is so and it's skewing the numbers. That, however, doesn't mean it's less safe than the average human driver. I think there's a good chance it's already much better, or at the very least soon will be. Not perfect, but really good.
Bro, it's ACC + lane centering, fucking Hyundai has a better system, it's not full self driving.
Maybe it's gonna be better than American drivers, not European ones
You should perhaps watch some videos of FSD V12 in action and read reviews from users. You might be basing your opinion on outdated info.
From this month’s free trial, it does seem very good where there are lane lines. It does a great job of staying in lane, following the GPS route, and switching lanes to pass and exit. It’s great for highway use, and I could believe it may be better than humans.
Local roads around here usually only have the center line painted, and FSD gets closer to the edge than I’m comfortable with, although I don’t give it the chance to find out whether it’s actually dangerous. Similar story crossing intersections: on my way to work there’s one where the other side of the intersection is offset and there are no lane lines, so it changes lanes in the intersection. Oops, but also bad road design. Plus I’ve had it make a couple of wrong choices when it can’t see but I can.
Full self crashing
Honestly that's the best possible name for it.