this post was submitted on 05 Nov 2023
-18 points (17.9% liked)

Fuck Cars


A place to discuss the problems of car-centric infrastructure and how it hurts us all. Let's explore the bad world of Cars!


Can be extended to self-driving cars that need to "decide" whom they would rather run over.

top 7 comments
[–] Damaskox@lemmy.world 2 points 10 months ago* (last edited 10 months ago) (1 children)

Concerning the negative votes -

My intention was not to endorse or condemn such an AI system.
My point was to bring it here for discussion and to think about it, neutrally 😁

[–] SkyNTP@lemmy.ml 1 points 10 months ago (1 children)

It's a straw man argument. The fact remains that human drivers are far worse safety hazards. Unquestionably. In the best case, this philosophical argument just becomes pointless navel-gazing that we bring out for cars but conveniently ignore for things like airplanes, assembly line machines, and virtually any human activity involving engineering decisions.

Worst case, it serves to distract from actual moral hazards, like continuing to let people operate 2t steel boxes around vulnerable people.

[–] Damaskox@lemmy.world 1 points 10 months ago

Care to elaborate on the "2t steel boxes"?
That doesn't ring a bell for me.

[–] teejay@lemmy.world 1 points 10 months ago* (last edited 10 months ago) (1 children)

Radiolab did a great episode on this very topic.

It gets really interesting when you think about the clash with corporate greed in this area. It's not hard to imagine car companies selling you a premium option (or worse, a subscription) where the car makes decisions prioritizing your life and safety over people outside it, even if multiple people would get maimed or killed to keep the driver safe.

[–] Damaskox@lemmy.world 1 points 10 months ago (1 children)

At least I'd hope they'd inform the driver/car owner that there is such an AI system in their car and, if that's the case, that the AI could decide against their life.
Then they "just" need to decide whether a car with such an AI system is worth it or not.

Not telling them that their car has such an AI would be unethical, to say the least.

[–] teejay@lemmy.world 1 points 10 months ago (1 children)

Sure. Buried in cryptic legalese in paragraph 3 on page 400 of 1 of the 8 different EULAs the car owner had to accept when first buying the car.

[–] Damaskox@lemmy.world 1 points 10 months ago

Not made easy 😬