this post was submitted on 07 Dec 2023
619 points (96.1% liked)

Tesla Whistleblower Says 'Autopilot' System Is Not Safe Enough To Be Used On Public Roads::"It affects all of us because we are essentially experiments in public roads."

[–] linearchaos@lemmy.world -3 points 10 months ago (5 children)

Unfortunately this is one of those things that you can't significantly develop/test on closed private streets. They need the scale, the public traffic, the idiots and the drunkards and the kids speeding. The only thing that will stop them from working on autopilot is it no longer being financially reasonable to keep going. Even a couple handfuls of deaths aren't going to stop them.

[–] Ottomateeverything@lemmy.world 34 points 10 months ago (1 children)

Unfortunately this is one of those things that you can't significantly develop/test on closed private streets.

Even if we hold this to be true (and I disagree in large part), the point is that Tesla's systems aren't at that stage yet. Failing to recognize lights correctly during live demos is absolutely something you can test and develop on closed streets or in a lab. Teslas shouldn't be allowed on public roads until they're actually at a point where there are no glaring flaws. And even then, they should only be allowed in smaller numbers at first.

[–] TheGrandNagus@lemmy.world 10 points 10 months ago* (last edited 10 months ago) (1 children)

That's true, but I think the issue people have with "AutoPilot" is about marketing.

Tesla brands their cars' system as a full replacement for human involvement, and Musk, other Tesla employees, media personalities close to Tesla, and fanboys all make out like the car drives itself and the only reason you need a driver behind the wheel is to satisfy the law.

It's bullshit. They know exactly what they're doing when they do the above, when they call their system "AutoPilot", when Musk claims his cars can travel from one side of the US to the other without human interaction (only to never actually do it, of course!), and when they sell car upgrades branded as Full Self Driving.

If they branded it as Assisted Driving, Advanced Cruise Control, Smart Cruise, or something along those lines, like all the other carmakers do with their similar systems, I'd be less inclined to blame Tesla when there's an unfortunate incident. I think most would agree with me, too.

But Tesla markets and encourages, both officially and unofficially, that their cars have the ability to drive themselves, look after themselves, and that you're safe when using the system. It's a lie and I'm absolutely astounded they've had little more than a series of slaps on the wrist for it in most markets.

[–] linearchaos@lemmy.world -2 points 10 months ago

100% accurate.

They want people to use it so they get data from it. Accidents and deaths will happen... honestly, they'll always happen... they happen now without it, it's just more acceptable because it's human error. Road safety is absolutely awful.

The reason they get away with it is lobbying, money, and political favors. They got where they are by greasing a whole shit ton of wheels with dump trucks of money.

Shitty means, but pretty righteous ends.

[–] Imgonnatrythis@sh.itjust.works 0 points 10 months ago (1 children)

Should a couple handfuls of deaths stop them if, as you said, you can't test it any other way? Autopilot systems could already be saving thousands of lives if they were more widely deployed, and the lack of good, reliable autopilot systems carries an opportunity cost in blood on our hands. Human drivers are well established to be dangerous. Testing and release of autopilot systems should be done as safely as possible, but expecting the first decade or so of these systems to be flawless seems unreasonable.

[–] linearchaos@lemmy.world -5 points 10 months ago

The same happened with airplanes.

[–] gregorum@lemm.ee -3 points 10 months ago (1 children)

The fact is that most technologies we take for granted today went through a similar evolutionary phase of public use before they became as safe as they are now, especially cars themselves. For well over a century, the automobile has made countless leaps and bounds in safety thanks to data gathered from public use.

We learn by doing.

[–] KingThrillgore@lemmy.ml 4 points 10 months ago (1 children)

That's fine, but Waymo, Cruise, et al. do trials on closed courses and in cooperation with states to assure a high degree of public safety. Tesla is testing without asking regulators.

[–] gregorum@lemm.ee 2 points 10 months ago

Do they? I honestly have little to no knowledge of how those companies gather their data, which is why I didn't mention them. Can you provide any links to information about them? I'd genuinely like to learn more.