this post was submitted on 20 Dec 2023
191 points (94.8% liked)

Technology


Tesla's recall of 2 million cars relies on a fix that may not even work::Tesla agreed to the recall last week after a federal investigation found that the system to monitor drivers was defective and required a fix.

top 24 comments
[–] Voroxpete@sh.itjust.works 67 points 11 months ago* (last edited 11 months ago) (3 children)

Tesla’s website says that Autopilot and more sophisticated “Full Self Driving” software cannot drive themselves

Full self driving

Cannot drive themselves

Christ I can smell the bullshit all the way from Canada.

[–] DreadPotato@sopuli.xyz 15 points 11 months ago* (last edited 11 months ago)

The "we can do it by end of this year" he's been toting since 2016 wasn't a giveaway?

[–] poopkins@lemmy.world 1 points 11 months ago

Dude, sit tight. Full self driving is coming at the end of 2017!

[–] Fapper_McFapper@lemmy.world 35 points 11 months ago (1 children)

Here I am hoping that Tesla, Twitter, SpaceX, and any other brand associated with Elon Musk burn to the fucking ground. Burn baby burn; show this wannabe emperor that he's wearing nothing at all.

[–] ChicoSuave@lemmy.world 6 points 11 months ago (3 children)

Poor Boring Company never had a chance...

[–] Beetschnapps@lemmy.world 20 points 11 months ago

It had a chance to cancel a cross-state rail project, because Elon would rather crowd cars into a tunnel with no fire exit than let people take a fucking train.

https://time.com/6203815/elon-musk-flaws-billionaire-visions/

“Musk later admitted to his biographer that he had never planned to build a Hyperloop system in California, and primarily promoted it in order to prevent conventional HSR proposals from breaking ground.”

[–] barsoap@lemm.ee 3 points 11 months ago

Herrenknecht employees probably still make jokes about how he was going to "improve" the boring machine he bought from them.

[–] KingThrillgore@lemmy.ml 1 points 11 months ago

I got a flamethrower out of it, and Ryerson got a TBM contract out of it. Wasn't all that bad.

[–] Thteven@lemmy.world 33 points 11 months ago (3 children)

Off topic but I've never seen a more horrible beard on a man.

[–] Guntrigger@feddit.ch 13 points 11 months ago

It looks like a character creator with really low-quality hair rendering, where the most beard you can have is a few sparse, stubble-like hairs.

[–] KingThrillgore@lemmy.ml 8 points 11 months ago

I swear to god, I see so many narcissists with shitty fashion sense and personal appearance awareness.

[–] Daxtron2@startrek.website 2 points 11 months ago

you haven't seen my beard

[–] ExLisper@linux.community 13 points 11 months ago (1 children)

So they are pretty much trying to figure out how to make sure the driver is paying attention to the road? IDK, maybe make the car respond to the steering wheel so that the driver has to move it or the car will not turn? That would ensure the driver is actually looking at the road.

Alternatively, ask them questions about the surroundings: "Driver, what state is the car in front of you from? You have 3 seconds to answer or FSD will be disabled."
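For illustration, here is a minimal sketch in Python of the kind of torque-based timeout check being discussed (the approach the recall fix reportedly leans on). The callbacks read_steering_torque, warn_driver, and disengage are hypothetical placeholders, not Tesla's actual implementation; a real system runs inside the vehicle's control stack.

```python
import time

TORQUE_THRESHOLD_NM = 0.5     # any nudge above this counts as "hands on wheel"
WARNING_AFTER_S = 15          # warn if no torque is detected for this long
DISENGAGE_AFTER_S = 30        # cut the assistance if the warning is ignored

def monitor_torque(read_steering_torque, warn_driver, disengage):
    """Nag-and-disengage loop driven purely by steering-wheel torque."""
    last_input = time.monotonic()
    while True:
        if abs(read_steering_torque()) > TORQUE_THRESHOLD_NM:
            last_input = time.monotonic()   # a hand resting on the wheel resets the timer
        idle = time.monotonic() - last_input
        if idle > DISENGAGE_AFTER_S:
            disengage()                     # warning ignored; hand control back to the driver
            return
        if idle > WARNING_AFTER_S:
            warn_driver()                   # visual/audible nag
        time.sleep(0.1)                     # poll at ~10 Hz
```

A hand resting on the wheel (or a weight hung from it) satisfies a check like this, which is the researchers' point: wheel torque is a weak proxy for attention.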

[–] abhibeckert@lemmy.world 10 points 11 months ago* (last edited 11 months ago) (2 children)

Just because a driver has their hands on the wheel doesn't mean they're watching the road. They might be watching a movie.

As for asking about number plates - that sounds like a distraction that would cause accidents rather than prevent them.

For me these systems need to be really clear. Either the person is driving, in which case they are fully responsible for every crash, or the car is driving, in which case the car is fully responsible. There's no room for any grey area in the middle.

In my opinion Tesla should be forced to refund anyone who was told their car has "full self driving". I'm OK with "autopilot" though, since the airplane and boat versions of that feature have always pretty much been "just keep going in a straight line until a human disengages autopilot".

[–] ExLisper@linux.community 2 points 11 months ago

Asking questions was obviously a joke.

As for the rest, I don't know what it would take to make sure the driver is paying attention. Distracted driving is the most common cause of accidents, so clearly even in normal cars we can't be sure drivers are paying attention. I think we can agree cruise control is generally good, but I have no idea what happens once the car has lane following. Is it the same? Do you focus on the road more? Or do you stop paying attention completely? It's really a question for scientists; someone has to test it rigorously before it's actually added to cars. My feeling is that once you don't have to drive yourself (as in turn and brake) you eventually stop paying attention, so yeah, either the car drives itself 100% or you drive.

[–] alienangel@sffa.community 1 points 11 months ago

Note: it's not NHTSA's or any other agency's responsibility to figure out a solution for Tesla. They just need to figure out what the bar for safety is and tell Tesla: "make it as safe as full low-light eye tracking, with whatever solution you want. But if you can't make it at least that safe, your cars shouldn't be allowed back on the roads."

I was the biggest cheerleader for self-driving cars because I hate driving, but "our best self-driving car still can't self-drive at all" isn't good enough, and letting them keep doing half-assed shit like this does more harm than good when it comes to bringing people around to the technology.

[–] autotldr@lemmings.world 4 points 11 months ago (1 children)

This is the best summary I could come up with:


Tesla’s recall of more than 2 million of its electric vehicles — an effort to have drivers who use its Autopilot system pay closer attention to the road — relies on technology that research shows may not work as intended.

But research conducted by NHTSA, the National Transportation Safety Board and other investigators shows that merely measuring torque on the steering wheel doesn't ensure that drivers are paying sufficient attention.

“I do have concerns about the solution,” said Jennifer Homendy, the chairwoman of the NTSB, which investigated two fatal Florida crashes involving Teslas on Autopilot in which neither the driver nor the system detected crossing tractor trailers.

Missy Cummings, a professor of engineering and computing at George Mason University who studies automated vehicles, said it’s widely accepted by researchers that monitoring hands on the steering wheel is insufficient to ensure a driver’s attention to the road.

But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.

Kelly Funkhouser, associate director of vehicle technology for Consumer Reports, said she was able to use Autopilot on roads that weren’t controlled access highways while testing a Tesla Model S that received the software update.


The original article contains 1,028 words, the summary contains 212 words. Saved 79%. I'm a bot and I'm open source!

[–] Fiivemacs@lemmy.ca 1 points 11 months ago

Tesla agreed to the recall? Did they really have a choice?

A better solution, experts say, would be to require Tesla to use cameras to monitor drivers’ eyes to make sure they’re watching the road. Some Teslas do have interior-facing cameras. But they don’t see well at night, unlike those in General Motors or Ford driver monitoring systems, said Philip Koopman, a professor at Carnegie Mellon University who studies vehicle automation safety.

In case you were wondering who wrote the article
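For comparison, here is a minimal sketch of the camera-based alternative the experts describe: time how long the driver's gaze has been off the road and escalate. get_gaze_on_road is a hypothetical placeholder for a driver-facing camera pipeline; real gaze estimation (and its low-light behaviour, which is exactly the criticism of Tesla's interior cameras) is far more involved.

```python
import time

EYES_OFF_WARNING_S = 3.0      # warn after a few seconds of looking away
EYES_OFF_DISENGAGE_S = 6.0    # escalate if the driver still isn't looking

def monitor_gaze(get_gaze_on_road, warn_driver, disengage):
    """Escalation loop driven by an eyes-on-road signal from a driver-facing camera."""
    eyes_off_since = None
    while True:
        if get_gaze_on_road():              # True while the camera sees eyes on the road
            eyes_off_since = None
        elif eyes_off_since is None:
            eyes_off_since = time.monotonic()
        if eyes_off_since is not None:
            off_for = time.monotonic() - eyes_off_since
            if off_for > EYES_OFF_DISENGAGE_S:
                disengage()                 # driver kept looking away; hand control back
                return
            if off_for > EYES_OFF_WARNING_S:
                warn_driver()
        time.sleep(0.1)                     # poll at ~10 Hz
```

Unlike the torque check above, this one can't be satisfied by a hand (or a weight) resting on the wheel, which is why researchers keep pointing to it, provided the camera actually works in the dark.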