this post was submitted on 30 Oct 2024
461 points (89.0% liked)


OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.

How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?

[–] NutWrench@lemmy.world 21 points 10 hours ago (2 children)

For the 1000th time, Tesla: don't call it "autopilot" when it's nothing more than cruise control that needs constant attention.

[–] GoodEye8@lemm.ee 10 points 10 hours ago

It is an autopilot (a poor one, but still one) that legally calls itself cruise control so Tesla doesn't have to take responsibility when it inevitably breaks the law.

[–] whotookkarl@lemmy.world 9 points 10 hours ago (2 children)

It doesn't have to kill no one to be an improvement; it just has to kill fewer people than people do.

[–] rigatti@lemmy.world 5 points 8 hours ago (1 children)

True in a purely logical sense, but assigning liability is a huge issue for self-driving vehicles.

[–] Kecessa@sh.itjust.works 1 points 5 hours ago* (last edited 4 hours ago) (2 children)

As long as there are manual controls, the driver is responsible, since they're supposed to be ready to take over.

[–] Hubi@feddit.org 185 points 16 hours ago (49 children)

The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.

How are these people always such pathetic suckers?

[–] teft@lemmy.world 117 points 16 hours ago (2 children)

I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.

[–] spankmonkey@lemmy.world 44 points 15 hours ago (3 children)

Same in Kansas. Was in a car that hit one in the 80s, and I see them often enough that I had to avoid one crossing a busy interstate highway last week.

Deer are the opposite of an edge case in the majority of the US.

[–] leftytighty@slrpnk.net 17 points 14 hours ago* (last edited 14 hours ago) (1 children)

Putting these valid points aside, we're also all just taking for granted that the software would have properly identified a human under the same circumstances... This could very easily have been a much more chilling outcome.

[–] leftytighty@slrpnk.net 29 points 15 hours ago

Being a run-of-the-mill fascist (rather than one of those in power) is actually an incredibly submissive position; they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a "snowflake liberal" by comparison.

[–] bluGill@fedia.io 31 points 13 hours ago (5 children)

Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.

The real question isn't whether Tesla is better or worse than a human in any particular situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If Tesla is overall worse, then they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where they are worse.

Tesla claims they are better overall, but they may not be telling the truth. One would think regulators would have data on this, but they are not talking about it.
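To make that comparison concrete, here is a minimal sketch in Python of the kind of rate comparison being described: crashes per million miles, overall and broken out by situation. Every number and category below is an invented placeholder, not real crash data, and the per-situation breakdown is hypothetical.

```python
# Hedged sketch: compares hypothetical crash rates per million miles.
# All numbers below are made-up placeholders, NOT real statistics.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize crash counts by exposure so the comparison is apples-to-apples."""
    return crashes / (miles / 1_000_000)

# Hypothetical per-situation data: (crashes, miles driven) for each condition.
human = {"clear": (300, 900e6), "low_visibility": (200, 100e6)}
fsd   = {"clear": (150, 900e6), "low_visibility": (400, 100e6)}

for situation in human:
    h = crashes_per_million_miles(*human[situation])
    f = crashes_per_million_miles(*fsd[situation])
    print(f"{situation}: human {h:.2f} vs FSD {f:.2f} crashes per million miles")

# The overall rate can hide a condition where the system is much worse,
# which is the point about identifying those situations and handing off.
h_total = crashes_per_million_miles(sum(c for c, _ in human.values()),
                                    sum(m for _, m in human.values()))
f_total = crashes_per_million_miles(sum(c for c, _ in fsd.values()),
                                    sum(m for _, m in fsd.values()))
print(f"overall: human {h_total:.2f} vs FSD {f_total:.2f}")
```

With these placeholder numbers the system looks better in clear conditions and worse overall, which is exactly the kind of split the overall average would obscure.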

[–] Semi_Hemi_Demigod@lemmy.world 6 points 8 hours ago (1 children)

Humans are also bad drivers who get edge cases wrong all the time.

It would be so awesome if humans only got the edge cases wrong.

[–] spankmonkey@lemmy.world 18 points 12 hours ago (2 children)

Tesla claims they are better overall, but they may not be telling the truth. One would think regulators would have data on this, but they are not talking about it.

https://www.reuters.com/business/autos-transportation/nhtsa-opens-probe-into-24-mln-tesla-vehicles-over-full-self-driving-collisions-2024-10-18/

The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.

Between this and being threatened with fines last year for not providing the data, it sure seems like they aren't being very forthcoming. That makes me suspect they still aren't telling the truth.

[–] billiam0202@lemmy.world 10 points 11 hours ago (1 children)

Between this and being threatened with fines last year for not providing the data, it sure seems like they aren't being very forthcoming. That makes me suspect they still aren't telling the truth.

I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn't you be shoving that into every single selling point you have? Why wouldn't that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla's FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?

[–] spankmonkey@lemmy.world 3 points 8 hours ago

If the Cybertruck were so safe in crashes, they would be begging third parties to test it so they could smugly lord their third-party-verified crash test data over everyone else.

But they don't, because they know it would be a repeat of smashing the bulletproof window on stage.

[–] AA5B@lemmy.world 1 points 7 hours ago* (last edited 7 hours ago)

Given that they market it as “supervised”, the question only has to be “are humans safer when using this tool than when not using it?”

One of the cool things I've noticed since recent updates is the car giving a nudge to help me keep centered, even when I'm not using autopilot.

[–] Turbonics@lemmy.sdf.org 28 points 12 hours ago (3 children)

The autopilot knows deer can't sue.

[–] Semi_Hemi_Demigod@lemmy.world 5 points 8 hours ago

What if it kills the deer out of season?

[–] pirat@lemmy.world 1 points 6 hours ago

Right, most animals can only zoo!

[–] pirat@lemmy.world 1 points 6 hours ago

I guess that's the big game ...

[–] brbposting@sh.itjust.works 20 points 12 hours ago (1 children)

Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.
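As an illustration of why the camera-only choice matters for a case like this, here is a hypothetical sketch of single-sensor versus fused decision logic: with cameras alone, one weak low-light detection is the whole signal, while fusion gives a second, independent path to braking. Every name, threshold, and number below is invented for the example; it does not describe Tesla's actual software.

```python
# Hypothetical illustration of camera-only vs. sensor-fusion obstacle handling.
# Thresholds and structures are invented for this sketch; they do not describe
# Tesla's actual FSD implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str          # e.g. "deer"
    confidence: float   # 0.0 .. 1.0

BRAKE_THRESHOLD = 0.6   # invented number

def camera_only_decision(camera: Optional[Detection]) -> str:
    # A dark, low-contrast deer at night may only produce a weak camera detection.
    if camera and camera.confidence >= BRAKE_THRESHOLD:
        return "brake"
    return "continue"   # weak or missing detection -> the car keeps going

def fused_decision(camera: Optional[Detection], radar_range_m: Optional[float]) -> str:
    # Fusion: a radar return at short range can trigger braking even when the
    # camera is unsure, giving a redundant path to the safe action.
    if camera and camera.confidence >= BRAKE_THRESHOLD:
        return "brake"
    if radar_range_m is not None and radar_range_m < 40:
        return "brake"
    return "continue"

night_deer = Detection("deer", confidence=0.35)          # plausible low-light case
print(camera_only_decision(night_deer))                  # continue
print(fused_decision(night_deer, radar_range_m=25.0))    # brake
```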

[–] Demdaru@lemmy.world 12 points 11 hours ago* (last edited 11 hours ago) (1 children)

I mean, to be honest... if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than get juuuuust perfectly booped into the air to crash through the windshield and into your face.

Official advice I heard many times. Prolly doesn't apply if you are going slow.

Edit: Read further down. This advice is effing outdated, disregard. -_- God, I am happy I've never had to put it to the test.

[–] independantiste@sh.itjust.works 70 points 16 hours ago* (last edited 16 hours ago) (2 children)

Only keeping the regular cameras was a genius move to hold back their full autonomy plans

[–] cm0002@lemmy.world 36 points 16 hours ago (2 children)

The day he said that "ReGULAr CAmErAs aRe ALl YoU NeEd" was the day I lost all trust in their implementation. And I'm someone who's completely ready to turn over all my driving to an autopilot lol

[–] Nytixus@kbin.melroy.org 7 points 11 hours ago (2 children)

I roll my eyes at the dishonest, bad-faith takes in the comments about how people do the same thing behind the wheel. Like that's going to make autopiloting self-driving cars an exception. At least a person can react, slow down, or do something that an unthinking, going-by-the-pixels computer can't do on a whim.

[–] daniskarma@lemmy.dbzer0.com 7 points 12 hours ago* (last edited 12 hours ago)

People are well known for never ever running over anything or anyone.
