this post was submitted on 24 Oct 2024
79 points (100.0% liked)

Technology

you are viewing a single comment's thread
[–] TherapyGary@lemmy.blahaj.zone 26 points 2 weeks ago (10 children)
[–] davehtaylor@beehaw.org 8 points 2 weeks ago (3 children)

If HumanA pushed and convinced HumanB to kill themselves, then HumanA caused it. IMO that's murder. It doesn't matter that they didn't pull the trigger. I don't care what the legal definitions say.

If a chatbot does the same thing, it's no different. Except in this case, it's the team of developers behind it who did so, who allowed it to happen. Character.ai has blood on their hands; the company should be completely dismantled, and every single person there tried for manslaughter.

[–] Buttons@programming.dev 3 points 2 weeks ago* (last edited 2 weeks ago)

Your comment might cause me to do something. You're responsible. I don't care what the legal definitions say.

If we don't care about legal definitions, then how do we know you didn't cause all this?
