this post was submitted on 14 Aug 2023
494 points (96.8% liked)

Technology


New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times: Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[–] Technoguyfication@lemmy.ml 8 points 1 year ago (3 children)

This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

[–] Md1501@lemmy.world 7 points 1 year ago (1 children)

Ah, so it's just people defeating the system.

[–] stealin@lemmy.world 5 points 1 year ago (2 children)

The premise with cars is that you don't distract the driver from driving. A system that takes over driving does exactly that, so the idea of the system is flawed to begin with.

[–] Technoguyfication@lemmy.ml 1 points 1 year ago* (last edited 1 year ago) (1 children)

I have to say this is extremely inaccurate imo. Self driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc. and allows an attentive driver to focus on more high level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.

Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.

Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.

[–] NeoNachtwaechter@lemmy.world 0 points 1 year ago

It’s frustrating to see

This is why we can’t have nice things

It is also frustrating to see people whining about the technology when they should instead be thinking about the dead policemen and rescuers.

You should get your priorities straight if you ever hope to be taken seriously.

[–] vinceman@lemmy.blahaj.zone -1 points 1 year ago

Screenshotting this because it's so well put.


[–] NeoNachtwaechter@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

The system will warn you to pay attention

... and if we have learned anything from that incident, it is that the warnings have been worthless.

The system can be tricked even by the worst drunkards! 150 times in a row.

for a few seconds before shutting down.

Few seconds are not enough. The crash was already unavoidable.

[–] Technoguyfication@lemmy.ml 1 points 1 year ago (1 children)

You’re misinterpreting what I said and conflating two separate scenarios in your second statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.

The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.

They aren’t subtle either, after failing to touch the wheel for about 5-10 seconds it starts beeping loudly and flashing an icon on the screen.
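To make the behavior being described concrete, here is a minimal sketch of that kind of monitoring loop: a grace period with hands off the wheel, an escalating alert window, then a strike and disengagement if the driver never responds. All thresholds and the strike limit are illustrative assumptions, not Tesla's actual values or code.

```python
# Hypothetical sketch of a hands-on-wheel monitoring loop, as described
# above. Timings and the strike limit are assumed for illustration only.
HANDS_OFF_GRACE_S = 7   # assumed: ~5-10 s hands-off before alerts begin
ALERT_WINDOW_S = 5      # assumed: "a few seconds" of alerts before shutdown
MAX_STRIKES = 3         # assumed: strikes before the system locks out

def monitor_step(hands_off_s, strikes):
    """Return (action, strikes) for one tick, given seconds hands-off."""
    if hands_off_s < HANDS_OFF_GRACE_S:
        return "ok", strikes
    if hands_off_s < HANDS_OFF_GRACE_S + ALERT_WINDOW_S:
        # Loud beeps and a flashing icon while the driver can still respond.
        return "alert", strikes
    # Driver never responded within the window: record a strike and disengage.
    strikes += 1
    if strikes >= MAX_STRIKES:
        return "lockout", strikes
    return "disengage", strikes
```

Note what this implies about the "150 alerts" case: a driver who nudges the wheel just inside the alert window resets the timer each time, so the loop keeps emitting alerts without ever reaching a strike or shutdown.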

This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.

[–] NeoNachtwaechter@lemmy.world 1 points 1 year ago (1 children)

This is not a case of autopilot causing an accident, this is a case of an impaired driver

It is both, of course. The drunk driver and the Autopilot each added their share to creating a danger that ended so badly.

Driving drunk is already forbidden.

What Tesla has brought onto the road here should be forbidden as well: lane assist combined with adaptive cruise control and such a collection of blind sensors.

[–] Iheardyoubutsowhat@lemmy.world -1 points 1 year ago

The driver was using Autopilot. Autopilot is cruise control plus lane assist; it's not FSD, so Tesla didn't bring that "to the road". The driver was drunk, and as with most Autopilot or FSD accidents, it's user error.

I'm still unaware of any proven FSD-caused accident.