this post was submitted on 28 Aug 2023
Technology
Autopilot is not safe.
https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
Isn't it a glorified cruise control/lane guidance system, rather than an actual automated driving system? So it would be about as safe as those are, rather than being something that you can just leave alone to handle its own business, like a robotic vacuum cleaner.
The main issue is that they market it like a fully autonomous system, and made it just good enough to lull people into a false sense of security that they don't need to pay attention, while also having no way to verify that they are paying attention, unlike other systems from BMW, GM, or Ford.
Other systems have their capabilities intentionally hampered to ensure that you're not going to feel it's okay to hop in the passenger seat and let your dog drive.
They are hands-on driver assists, and so they are generally calibrated in a way that they'll guide you in the lane, but will drift/sway just a bit if you completely take your hands off the wheel, which is intended to keep you, y'know, actually driving.
Tesla didn't want to do that. They wanted to be the "best" system, with zero safety considerations at any step other than what was basically forced upon them by the supplier so they wouldn't completely back out. The company is so insanely reckless that I feel shame for ever wanting to work for them at one point, until I saw and heard many stories about just how bad they were.
I got to experience it firsthand too, working at a supplier where production numbers were prioritized over key safety equipment. While everyone else was willing to suck it up for a couple of bad quarters, they pushed it, and I'm sure it has indirectly resulted in further injuries and potentially deaths.
What does this remind me of... Oh yeah right, OceanGate
It is just a shitload of if-then-else statements. If the inputs don't match a corresponding condition, it just defaults to doing nothing.
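A toy sketch of the caricature in that comment: a rule-based controller that falls through to a do-nothing default. All condition and action names here are made up for illustration; this is not how Tesla's software actually works.

```python
def autopilot_step(inputs: dict) -> str:
    """Toy caricature of a rule-based controller: a pile of
    if/then/else checks with a do-nothing default.
    All condition and action names are invented for illustration."""
    if inputs.get("lane_drift_left"):
        return "steer_right"
    elif inputs.get("lane_drift_right"):
        return "steer_left"
    elif inputs.get("obstacle_ahead"):
        return "brake"
    else:
        # No matching rule: default to doing nothing.
        return "no_action"

print(autopilot_step({"obstacle_ahead": True}))      # brake
print(autopilot_step({"something_unexpected": True}))  # no_action
```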
Driving a car is not safe. 40,000 people die in car crashes every year in the US alone. Nothing in that article indicates that Autopilot/FSD is more dangerous than a human driver, just that they're flawed systems, as is expected. It's good to keep in mind that a 99.99% safety rating still means 33,000 accidents a year in the US alone.
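The 33,000 figure works out if the base is taken as roughly the US population, one "trial" per person per year. That base is my assumption, not stated in the comment; a quick back-of-the-envelope:

```python
# Back-of-the-envelope for the 99.99% claim above.
# Assumption (mine, not the comment's): the base is roughly the
# US population of ~330 million, one "trial" per person per year.
population = 330_000_000
safety_rate = 0.9999
accidents = population * (1 - safety_rate)
print(f"{accidents:,.0f} accidents/year")  # 33,000
```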
You can't just put something on the streets without first verifying it's safe and working as intended. This is missing for Autopilot. And the data that's piling up is showing that Autopilot is deadly.
You can say the exact same thing for people.
Exactly, you can't just drive without verifying that you're a safe driver. That's why we have a process to get a driver's license. Has Autopilot passed licensing?
So if an autonomous car can drive around the block and parallel park it's licensed?
Humans have a lower accident rate than Tesla's autopilot, it says so in the article itself.
I don't see that claim anywhere in the article.
In fact, any comparisons I've found show Tesla's Autopilot performing better than humans: one crash per 4.41 million miles driven on Autopilot in a Tesla, vs. one crash per 1.2 million miles in a Tesla without Autopilot, while NHTSA's most recent data shows that in the United States there is an automobile crash every 484,000 miles.
https://cleantechnica.com/2021/12/07/tesla-1-crash-per-4-41-million-miles-traveled-on-autopilot/
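Converting the figures quoted in that comment into a common unit (crashes per million miles) makes the comparison easier to read. All three numbers are from the comment, not mine:

```python
# Crashes per million miles, from the figures quoted above.
rates = {
    "Tesla on Autopilot": 1 / 4.41,      # 1 crash per 4.41M miles
    "Tesla without Autopilot": 1 / 1.2,  # 1 crash per 1.2M miles
    "US average (NHTSA)": 1 / 0.484,     # 1 crash per 484k miles
}
for name, rate in rates.items():
    print(f"{name}: {rate:.2f} crashes per million miles")
```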
Yeah, that's why we make drivers get licensed.
First of all, what is it that you consider safe? I'm sure you realize that a 100% safety rating is just fantasy, so what is the acceptable rate of accidents for you?
Secondly, would you mind sharing the data "that's piling up showing that Autopilot is deadly"? Reports of individual incidents are not what I'm asking for because, as I stated above, you're not going to get 100% safety, so there will always be individual incidents to talk about.
You also seem to be talking about the FSD beta and Autopilot interchangeably, though they're different things. Hope you realize this.
There are very strict regulations around what is allowed on the streets and what isn't. This is what protects us from sloppy companies releasing unsafe stuff onto the streets.
Driver-assist features like Autopilot operate in a regulatory grey zone. Regulation has not caught up with the technology, and this allows companies like Tesla to release unsafe software onto the streets, killing people.
This would indicate that FSD is more dangerous than a human driver, would it not?
That still doesn't tell us whether those accidents happen more often compared to normal cars. If you have good driver-assist systems that are able to prevent the majority of minor crashes but not the severe ones, then the total number of crashes goes down, but the kinds that remain are the bad ones.
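A made-up numeric example of that severity-mix effect. Every count and percentage below is invented purely to illustrate the point, not real crash data:

```python
# Invented numbers illustrating the severity-mix effect described above.
# Baseline: 100 crashes per some fixed mileage, mostly minor.
baseline_minor, baseline_severe = 90, 10

# Suppose driver assist prevents 80% of minor crashes but none of the
# severe ones (both percentages are made up for this example).
assist_minor = baseline_minor // 5   # 80% prevented, 18 remain
assist_severe = baseline_severe      # all 10 remain

total_before = baseline_minor + baseline_severe  # 100
total_after = assist_minor + assist_severe       # 28

severe_share_before = baseline_severe / total_before  # 10%
severe_share_after = assist_severe / total_after      # ~36%
print(total_after, f"{severe_share_after:.0%}")
```

Total crashes drop from 100 to 28, yet the share that are severe more than triples, which is why a falling crash count alone doesn't settle the comparison.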
Humans, my friend. We can hold humans accountable. We can't hold hunks of semi-sentient sand and nebulous transient configurations of electrons liable for anything. So it has to be better than humans, which it is not. If it isn't better than humans, then we'd rather just have a human in control, because we can argue with the human and hold them accountable for their actions and decisions.