this post was submitted on 23 Oct 2024
38 points (100.0% liked)
TechTakes
Okay, quick prediction time:
Even if character.ai manages to win the lawsuit, this is probably gonna be the company's death knell. Should the fallout of this incident somehow not lead to heavy regulation coming down on them, the death of one of their users is still gonna cause horrific damage to their (already pretty poor AFAIK) reputation.
On a larger scale, chatbot apps like Replika and character.ai (if not chatbots in general) are probably gonna go into a serious decline thanks to this - the idea that "our chatbot can potentially kill you" is now firmly planted in the public's mind, and I suspect their userbases are gonna blow their lids at how heavily the major apps are gonna lock their shit down.
I'm not familiar with Character.ai. Is their business model that they take famous fictional characters, force feed their dialog into LLMs, then offer these LLMs as personalized chatbots?
From the looks of things, that's how their business model works.
I found an Ars piece:
https://arstechnica.com/tech-policy/2024/10/chatbots-posed-as-therapist-and-adult-lover-in-teen-suicide-case-lawsuit-says/
Updated 'do no evil' into 'if you are going to convince children to kill themselves while doing massive copyright infringement, at least don't hurt the brand'