this post was submitted on 29 Jun 2025
452 points (95.9% liked)
Technology
Funny, I was just reading comments in another thread where people with mental health problems were proclaiming how terrific it is. Especially concerning was that they had found value in the recommendations LLMs make and were "trying those out." One of the commenters described themselves as "neurodiverse" and was acting on "advice" from generated LLM responses.
And for something like depression, this is deeply bad advice. I feel somewhat qualified to weigh in, as somebody who has struggled severely with depression and managed to get through it with the support of a very capable therapist. There's a tremendous amount of depth and context to somebody's mental condition, and understanding it takes far more deliberate probing than stringing words together until they form sentences that mimic human interaction.
Let's not forget that an LLM will not be able to raise alarm bells, read medical records, write prescriptions, or coordinate with other medical professionals. Another thing people often forget is that LLMs have finite context windows and cannot, by definition, keep a detailed "memory" of everything that's been discussed.
It's effectively self-treatment with more steps.
This is the "benefit" LLM therapy would provide if it worked. The reality is that it doesn't, but it serves as a proof of concept that there is a need for anonymous therapy. Therapy in the USA is only for people with socially acceptable illnesses. People rightfully live in fear of being labeled untreatable, a danger to self and others, and then at best dropped from therapy and at worst institutionalized.
yep, almost nobody wants to be committed to a psych ward without consent
Also worth noting that:
1. AI is arguably a surveillance technology that's built on decades of our patterns
2. The US government is increasingly authoritarian and has expressed interest in throwing neurodivergent people into labor camps
3. Large AI companies like OpenAI are signing contracts with the Department of Defense
If I were a US citizen, I would be avoiding discussing my personal life with AI like the plague.
And for many people it's better than nothing, and likely the best they can do. Waiting lists for a basic therapist in my area are months long. Shorter if you pay out of pocket, but that isn't affordable for average people because it's like $300-400 for a one-hour session.
I get it, but I'm not sure that "something is better than nothing" in this case. I don't judge any individual for using it, but the risks are huge, as others have documented. And the benefits are questionable.
Something is always better than nothing, especially if you are starving.
I can't find the story for the life of me right now, but I'm pretty sure there was one a few months back where someone was talking with an LLM about their depression and suicide, and the LLM essentially said "yeah, you should probably do it," because to the LLM, that was the best solution to the problem.