I first thought this article was about their self-driving cars, and I was like, who tf gets in a self-driving car with their baby? It's not. It's about Tesla cars in general. Scary stuff.
What kind of engineers work at Tesla? I feel like normal people get anxiety over deleting databases or deploying secrets to production. Accidentally taking a service down.
But there you have all kinds of terrible things happening, and it's purely because your company knows how to work policymakers. A dad dies in a fireball and what, is it an emergency meeting? Or something you look into first thing Monday morning?
Working in the aerospace industry has given me a lot of insight into the different ways engineers rationalize the potential for harm that they cause. The most common is wilful ignorance or straight-up denial: "No, the products I work on can never hurt anyone, it's just xyz." I personally know engineers who work on weaponry and fall heavily into that camp, and it blows my mind.
The guilty don't feel guilty; they learn not to. It's easy to sleep at night when you can stuff your pillow with hundred-dollar bills.
It sounds like it tracked who was driving and who was in the car to decide whether they were worth crashing.
You know, to maximize the evil done to the world.
Tesla’s garbage quality is sadly hurting the entire EV and self-driving industry. Self-driving cars will always have accidents. But a good self-driving company will use every single accident to make sure it never happens again with their system. Humans can make the same error over and over, but once self-driving has been around a while, the rate of accidents caused by self-driving will drop more and more every year.
We'll never have self-driving cars en masse, because for some reason society has accepted that humans make mistakes and sometimes people die, but it can't accept the same from robots, even when they make far fewer mistakes.
It's just that we as humans need someone we can blame for our misfortune. Which gets complicated with an artificial intelligence or even simpler algorithms behind the wheel. There's no-one in particular you can scream at.
Several passersby tried to open the doors and rescue the driver, but they couldn’t unlock the car.
Even the firefighters, who arrived 20 minutes later, could do nothing but watch the Tesla burn.
Did no one think to break open the windows?
Yes, that must be it... they didn't think to break a window.
Many modern cars use laminated glass on their side windows now, and as far as I'm aware this model has doors that won't open from the outside without power, making them very difficult to break open without tools even when the vehicle isn't on fire. Twenty minutes into the Tesla burning, when it was already sitting on top of a bomb of a battery... you're beyond fucked at that point. It's hard for responders even to put the fire out; any chance of a rescue was over about 15 minutes earlier.
Thanks for serving a side of snark while teaching others.
I fucking hate cars, so this is a shit design feature (coming from a design engineer myself).
All the best lessons come with some sass, but on a serious note, I'd hate to think of how someone who had powerlessly watched a person burn to death would feel about seeing people second guess their actions. You would feel awful enough already.
Laminated windows are great for a lot of things (e.g. sound dampening); getting into or out of the vehicle rapidly is definitely not one of them. The inability to unlock without power is just the chef's kiss, though, obviously.
If we lived in any sort of reasonable or responsible world then these cars would be banned from public roads all over the globe.
And Tesla would be fined and sued into oblivion.
Tesla would be ordered to dissolve, or forced to forfeit assets to the government.
And the people who knowingly put profits before lives would individually serve time for manslaughter.
Call me a Luddite, but I won't ride in a "self-driving" car. I don't even trust lane assist, although I've never had a car with that feature.
I think my sweet spot is 2014 for vehicles. It's about 50/50 with the tracking garbage and the "advanced features" on those models but anything past 2015 seems to be fully fly-by-wire and that doesn't sit right with me.
I'm old though, and honestly, if I bought a 2014 right now and babied it as my non-commuter car, I could probably keep it until I should give up my keys. You younger people are going to have to work around all this crap.
As a pedestrian, I trust Waymos more than human drivers.
The article does not actually answer why Tesla vehicles crash as much as they do, or how their crash frequency compares to other vehicles. It's more about how scummy Tesla is as a company and how it withholds data from the public when that data could incriminate them.
In some ways, that is the answer. Crashes keep happening because Tesla is not being held accountable by regulators: it is not reporting these incidents, and no one is exercising oversight to be sure the reporting matches reality.
I think that over the years manufacturers have generally reported accurately because they do not want to be known as the car company that killed a child when it could have been prevented with a 50-cent bolt. As a result, regulators have been less hawkish. Of course, there are probably political donations in the US helping to keep the wheels turning.
Just scanning the article, it seems to sum it up as: no one knows why yet, not even Tesla.
Tesla tried to do it all at once instead of perfecting the electric tech first and then incrementally adding on advances. They also made change for change’s sake. There’s absolutely no reason mechanical door locks could not have been engineered to work on this car as the default method of opening and closing the door. It’s killing people.
There's absolutely a reason to not engineer something you're not required to. It's called capitalism. Tesla cut every corner they could.
No, the problem is they engineered something they didn't need to, because Musk thinks everything should be electric because it's cool. They then had to engineer a mechanical release because it was required by law (for good reason).
Mechanical door locks would have been cheaper. The fly-by-wire steering in the Cybertruck is far more expensive, heavier, and far more dangerous than the very well-polished power steering systems every other car uses.
Maybe it's something like they wanted to make more money on repairs or something... But even then, they could've done better by starting from very common, cheap technology.
Let's be clear... the real problem here is that Elon Musk, opinion-having idiot that he is, made decisions from on high with very little understanding of engineering.
Also, the fact that they refuse to use lidar (and even removed radar) and base their self-driving on cameras alone is plainly stupid.
This is the kind of shit that makes me worried even seeing someone else driving one of these deathtraps near me while I'm driving. They could explode or decide to turn into me on the highway or something. I think about this more than I think about Final Destination when I see a logging truck these days.
It's one of those rules you make for yourself when you drive...
Like no driving next to people with dents...
Or
Stay away from trucks with random shit in the back not strapped down ...
No driving near new-looking cars; either they're actually new or they just came back from accident repairs, so best to play it safe...
So
No driving near a Tesla...
Wait, I might know the answer. Is it because they don't use LIDAR and they're made by a company headed by some piece of shit who likes to cut costs? Haha, I was just guessing, but ok.
You can choose not to drive bleeding edge technology, but sadly you have no choice in whether to share the road with it.
I drive a BMW i4 and one of the reasons I prefer it is because it still uses a number of mechanical options like physical buttons and an actual door handle. I never trusted that flush handle from Tesla, even back when I liked Tesla.
I have never ridden in a Tesla, and from now on I plan on requesting a non-Tesla car when I have to take a taxi.
Cars in general, and Teslas in particular, should have a standardized black-box data recorder whose logs third parties can open and access; we have had this kind of tech on aircraft for many decades.
It is terrifying that Tesla can just say that there was no relevant data and the investigative agency will just accept that.
I remember watching an episode of Air Crash Investigations where a plane crashed and they could not find an immediate cause, but the flight data recorder could be analysed far back, well before the accident flight, and they noticed that a mount for the APU turbine had broken many flights earlier; the APU had broken free during the flight, causing the crash.
It is not Tesla's job to tell the investigators what is relevant and what is not; it is Tesla's job to unlock all the data they have and send it to the investigators. If they can't or won't, then Tesla should lose the right to sell cars in Europe.
Cars do have that in what amounts to a TCU or Telematics Control Unit. The main problem here isn't whether or not cars have that technology. It's about the relevant government agency forcing companies like Tesla (and other automakers) to produce that data not just when there's a crash, but as a matter of course.
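To make the "standardized black box" idea concrete, here's a toy sketch of what a third-party-readable event record could look like. Every field name and the format are made up for illustration; this is not any real EDR/TCU standard and not anything Tesla actually logs.

```python
# Hypothetical sketch of a standardized, third-party-readable event record.
# Field names and structure are invented for illustration only; no real
# black-box standard or manufacturer format is being described here.
import json
from dataclasses import dataclass, asdict

@dataclass
class BlackBoxRecord:
    timestamp_utc: str         # e.g. "2024-10-24T18:03:12Z"
    speed_kph: float
    throttle_pct: float
    brake_pct: float
    steering_angle_deg: float
    autopilot_engaged: bool
    doors_locked: bool
    airbag_deployed: bool

    def to_json(self) -> str:
        """Serialize to a plain, documented format any investigator can parse."""
        return json.dumps(asdict(self))

# Example: the last second before a crash, readable without the manufacturer's help.
record = BlackBoxRecord("2024-10-24T18:03:12Z", 92.0, 0.0, 100.0, -4.5, True, True, True)
print(record.to_json())
```

The point is just that if the format were open and documented like this, investigators wouldn't need the manufacturer to decide what counts as "relevant data".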
I have a lot of questions about why Teslas are allowed on public roads when some of the models haven't been crash tested. I have a lot of questions about why a company wouldn't hand over data in the event of a crash without the requirement of a court order. I don't necessarily agree that cars should be able to track us (if I buy it, I own it, and nobody should have that kind of data without my say-so). But since we already have cars that do phone this data home, local, state, and federal governments should have access to it, especially when insurance companies are happy to use it to assign blame in a crash so they don't have to pay out a policy.
Seems like a lot of this technology is barely tested, and there are too many variables; it should not be out on the roads.
Move fast and break things, but it's a passenger vehicle on a public road.
It's been a nightmare seeing tech companies move into the utility space and act like they're the smartest people in the room and the experts that have been doing it for 100 years are morons. Move fast and break things isn't viable when you're operating power infrastructure either. There's a reason designs require the seal of a licensed engineer before they can be constructed. Applying a software-development mentality to any kind of engineering is asking for fatalities.
Bad code. Owners treated as guinea pigs. Cars not communicating with each other. Relying on just the car’s vision and location is stupid.
Also, not only do they rely on "just vision", crucially they rely on real-time processing without any memory or persistent mapping.
This, more than anything else is what bewilders me most.
They could map an area, and when they observe a construction hazard, save that data and share it with other vehicles so they can account for it when planning routes or anticipate the obstacle. But they don't. If the car drives past a hazard and goes around the block, it has to figure out how to navigate the hazard again with no familiarity. That's so foolish.
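For what it's worth, the idea above isn't exotic. Here's a minimal toy sketch of a persistent, shareable hazard map; it's pure illustration and has nothing to do with how any real fleet software works.

```python
# Minimal sketch of the idea in the comment above: remember hazards you've seen,
# keyed by location, so a later pass (or another vehicle) doesn't have to
# rediscover them. Purely illustrative.
import time

class HazardMap:
    def __init__(self, cell_size_deg: float = 0.0001):
        self.cell_size = cell_size_deg   # ~11 m of latitude per grid cell
        self.hazards = {}                # (cell_lat, cell_lon) -> hazard info

    def _cell(self, lat: float, lon: float):
        # Snap coordinates to a coarse grid so nearby observations share a key.
        return (round(lat / self.cell_size), round(lon / self.cell_size))

    def report(self, lat: float, lon: float, kind: str):
        """Called when a vehicle observes a hazard (e.g. a construction barrier)."""
        self.hazards[self._cell(lat, lon)] = {"kind": kind, "seen_at": time.time()}

    def check(self, lat: float, lon: float):
        """Called during route planning or on approach to a remembered location."""
        return self.hazards.get(self._cell(lat, lon))

    def merge(self, other: "HazardMap"):
        """Share observations between vehicles (or with a fleet backend)."""
        self.hazards.update(other.hazards)

# One car sees the construction zone...
car_a = HazardMap()
car_a.report(37.7749, -122.4194, "construction")

# ...and after sharing, another car already knows about it before arriving.
car_b = HazardMap()
car_b.merge(car_a)
print(car_b.check(37.7749, -122.4194))   # {'kind': 'construction', 'seen_at': ...}
```

With something like this, the second pass around the block, or the second car in the fleet, anticipates the hazard instead of rediscovering it from scratch.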
The truth? Because Elon is the CEO, errrr, Technoking.