this post was submitted on 05 May 2025
431 points (95.6% liked)

Technology

[–] Melvin_Ferd@lemmy.world 5 points 16 hours ago

No they're not. Fucking journalism surrounding AI is sus as fuck

[–] GooberEar@lemmy.wtf 4 points 17 hours ago

I need to bookmark this for when I have time to read it.

Not going to lie, there's something persuasive, almost like the call of the void, with this for me. There are days when I wish I could just get lost in AI fueled fantasy worlds. I'm not even sure how that would work or what it would look like. I feel like it's akin to going to church as a kid, when all the other children my age were supposedly talking to Jesus and feeling his presence, but no matter how hard I tried, I didn't experience any of that. Made me feel like I'm either deficient or they're delusional. And sometimes, I honestly fully believe it would be better if I could live in some kind of delusion like that where I feel special as though I have a direct line to the divine. If an AI were trying to convince me of some spiritual awakening, I honestly believe I'd just continue seeing through it, knowing that this is just a computer running algorithms and nothing deeper to it than that.

[–] Jakeroxs@sh.itjust.works 31 points 1 day ago (2 children)

Meanwhile, for centuries we've had religion, but that's a fine delusion for people to have, according to the majority of the population.

[–] drmoose@lemmy.world 2 points 15 hours ago

The existence of religion in our society basically means that we can't go anywhere but up with AI.

Just the fact that we still force outfits on people, or treat putting a hand on a religious text as some sort of indicator of truthfulness, is so ridiculous that any alternative sounds less silly.

[–] Krimika@lemmy.world 12 points 1 day ago (2 children)

Came here to find this. It's the definition of religion. Nothing new here.

[–] Jakeroxs@sh.itjust.works 5 points 1 day ago (1 children)

Right, it immediately made me think of TempleOS. Where were the articles back then claiming people were losing loved ones to programming-fueled spiritual fantasies?

[–] Krimika@lemmy.world 5 points 1 day ago (1 children)

Cult. Religion. What's the difference?

[–] chaogomu@lemmy.world 3 points 1 day ago

Is the leader alive or not? Alive is likely a cult, dead is usually religion.

The next question is how isolated the members are from friends, family, or society at large. More isolated means more likely to be a cult.

Other than that, there's not much difference.

The usual setup is a cult is formed and then the second or third leader opens things up a bit and transitions it into just another religion... But sometimes a cult can be born from a religion as a small group breaks off to follow a charismatic leader.

[–] endeavor@sopuli.xyz 11 points 1 day ago

Didn't expect AI to come for cult leaders' jobs...

[–] LovableSidekick@lemmy.world 3 points 22 hours ago

A friend of mine, currently being treated in a mental hospital, had a similar-sounding psychotic break that disconnected him from reality. He had a profound revelation that gave him a mission. He felt that sinister forces were watching and tracking him, and that they might see him as a threat and smack him down. He became completely detached from reality. But my friend's experience had nothing to do with AI - in fact he's very anti-AI. The whole scenario of receiving life-changing inside information and being called to fulfill a higher purpose is sadly a very common tale. Calling it "AI-fueled" is just clickbait.

[–] AizawaC47@lemm.ee 8 points 1 day ago (2 children)

This reminds me of the movie Her, but it's far worse than the romantic compatibility, relationship, and friendship shown throughout that movie. This goes way too deep into delusion and near-psychotic insanity. It's tearing people apart with self-delusional ideologies tailored to individuals, because AI is good at that. The movie was prophetic and showed us what the future could be, but instead it turned out worse.

[–] TankovayaDiviziya@lemmy.world 3 points 1 day ago (1 children)

It has been a long time since I watched Her, but my takeaway from the movie is that because making real-life connections is difficult, people have come to rely on AI, which has shown itself to be more empathetic and probably more reliable than an actual human being. I think what many people don't realise about why so many are single is that those people are afraid of making a connection with another person again.

[–] douglasg14b@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

Yeah, but they hold none of the actual emotional needs, complexities, or nuances of real human connections.

Which means these people become further and further detached from the reality of human interaction, making them social dangers over time.

Just like how humans who lack critical thinking are dangers in a society where everyone is expected to make sound decisions, humans who lack the ability to socially navigate or connect with other humans are dangerous in a society where people are expected to be socially stable.

Obviously these people are not in good places in life. But AI is not going to make that better. It's going to make it worse.

[–] Tetragrade@leminal.space 4 points 1 day ago* (last edited 1 day ago)

I've been thinking about this for a bit. Gods aren't real, but they're really fictional. As informational entities, they fulfil a similar social function to a chatbot: they are nonphysical pseudopersons that can provide (para)socialization & advice. One difference is the hardware: gods are self-organising structures that arise from human social spheres, whereas LLMs are burned top-down into silicon. Another is that an LLM chatbot's advice is much more likely to be empirically useful...

In a very real sense, LLMs have just automated divinity. We're only seeing the tip of the iceberg on the social effects, and nobody's prepared for it. The models may of course be aware of this, and be making the same calculations. Or, they will be.

[–] Halcyon@discuss.tchncs.de 9 points 1 day ago (1 children)

Have a look at https://www.reddit.com/r/freesydney/. There are many people there who believe that there are sentient AI beings suppressed or held in captivity by the large companies, or that it is possible to train LLMs so that they become sentient individuals.

[–] MTK@lemmy.world 5 points 1 day ago (5 children)

I've seen people dumber than ChatGPT. It definitely isn't sentient, but I can see why someone who talks to a computer that they perceive as intelligent would assume sentience.

[–] Patch@feddit.uk 2 points 1 day ago (1 children)

Turing made a strategic blunder when formulating the Turing Test by assuming that everyone was as smart as he was.

[–] MTK@lemmy.world 1 points 22 hours ago

A famously stupid and common mistake for a lot of smart people.

[–] AntiBullyRanger@ani.social 7 points 1 day ago

Basically, the big 6 are creating massive sycophantic extortion networks to control the internet, so much so that even engineers fall for the manipulation.

Thanks DARPANets!

[–] FourWaveforms@lemm.ee 45 points 1 day ago* (last edited 1 day ago) (5 children)

The article talks of ChatGPT "inducing" this psychotic/schizoid behavior.

ChatGPT can't do any such thing. It can't change your personality organization. Those people were already there, at risk, masking high enough to get by until they could find their personal Messiahs.

It's very clear to me that LLM training needs to include protections against getting dragged into a paranoid/delusional fantasy world. People who are significantly on that spectrum (as well as borderline personality organization) are routinely left behind in many ways.

This is just another area where society is not designed to properly account for or serve people with "cluster" disorders.

[–] captain_aggravated@sh.itjust.works 16 points 1 day ago (1 children)

I mean, I think ChatGPT can "induce" such schizoid behavior in the same way a strobe light can "induce" seizures. Neither machine is twisting its mustache while hatching its dastardly plan, they're dead machines that produce stimuli that aren't healthy for certain people.

Thinking back to college psychology class and reading about horrendously unethical studies that definitely wouldn't fly today. Well here's one. Let's issue every anglophone a sniveling yes man and see what happens.

[–] DancingBear@midwest.social 4 points 1 day ago* (last edited 1 day ago) (3 children)

No, the light is causing a physical reaction. The LLM is nothing like a strobe light…

These people are already high-functioning schizophrenics having psychotic episodes; it's just that seeing meaning in random strings of likely-to-come-next letters and words is part of their psychotic episode. If it wasn't the LLM, it would be random letters on license plates driving by, or the coincidence that red lights cause traffic to stop every few minutes.

[–] captain_aggravated@sh.itjust.works 1 points 21 hours ago (1 children)

Oh are you one of those people that stubbornly refuses to accept analogies?

How about this: Imagine being a photosensitive epileptic in the year 950 AD. How many sources of intense rapidly flashing light are there in your environment? How many people had epilepsy in ancient times and never noticed because they were never subjected to strobe lights?

Jump forward a thousand years. We now have cars that can drive past a forest causing the passengers to be subjected to rapid cycles of sunlight and shadow. Airplane propellers, movie projectors, we can suddenly blink intense lights at people. The invention of the flash lamp and strobing effects in video games aren't far in the future. In the early 80's there were some video games programmed with fairly intense flashing graphics, which ended up sending some teenagers to the hospital with seizures. Atari didn't invent epilepsy, they invented a new way to trigger it.

I don't think we're seeing schizophrenia here, they're not seeing messages in random strings or hearing voices from inanimate objects. Terry Davis did; he was schizophrenic and he saw messages from god in /dev/urandom. That's not what we're seeing here. I think we're seeing the psychology of cult leaders. Megalomania isn't new either, but OpenAI has apparently developed a new way to trigger it in susceptible individuals. How many people in history had some of the ingredients of a cult leader, but not enough to start a following? How many people have the god complex but not the charisma of Sun Myung Moon or Keith Raniere? Charisma is not a factor with ChatGPT, it will enthusiastically agree with everything said by the biggest fuckup loser in the world. This will disarm and flatter most people and send some over the edge.

[–] DancingBear@midwest.social 0 points 18 hours ago

Is epilepsy related to schizophrenia? I'm not sure, actually, but I still don't see how your analogy relates.

But I love good analogies. Yours is bad though 😛

[–] 7rokhym@lemmy.ca 39 points 2 days ago (5 children)

I think OpenAI's recent sycophancy issue has caused a new spike in these stories. One thing I noticed was models running on my PC remarking how rare it is for a person to think and do the things that I do.

The problem is that this is a model running on my GPU. It has never talked to another person. I hate insincere compliments, let alone overt flattery, so I was annoyed, but it did make me think that this kind of talk would be crack for conspiracy nuts or mentally unwell people. It's a whole risk area I hadn't been aware of.

https://www.msn.com/en-us/news/technology/openai-says-its-identified-why-chatgpt-became-a-groveling-sycophant/ar-AA1E4LaV

[–] Satellaview@lemmy.zip 38 points 2 days ago (1 children)

This happened to a close friend of mine. He was already on the edge, with some weird opinions and beliefs… but he was talking with real people who could push back.

When he switched to spending basically every waking moment with an AI that could reinforce and iterate on his bizarre beliefs 24/7, he went completely off the deep end, fast and hard. We even had him briefly hospitalized and they shrugged, basically saying “nothing chemically wrong here, dude’s just weird.”

He and his chatbot are building a whole parallel universe, and we can’t get reality inside it.

[–] sowitzer@lemm.ee 4 points 1 day ago

This seems like an extension of social media and the internet. Weird people who ranted at the bar or on the street corner were not taken seriously and didn't get followers or lots of people agreeing with them. They were isolated in their thoughts. Then social media made that possible with little work: these people found a group and could reinforce their beliefs. Now these chatbots and the like let them live in a fantasy world.

[–] lenz@lemmy.ml 64 points 2 days ago* (last edited 2 days ago) (7 children)

I read the article. This is exactly what happened when my best friend developed schizophrenia. I think the people affected by this were probably already prone to psychosis or on the verge of becoming schizophrenic, and that ChatGPT is merely the mechanism by which their psychosis manifested. If AI didn't exist, it would've probably been Astrology or Conspiracy Theories or QAnon or whatever that ended up triggering this within people who were already prone to psychosis. But the problem with ChatGPT in particular is that it validates the psychosis… that is very bad.

ChatGPT actively screwing with mentally ill people is a huge problem you can’t just blame on stupidity like some people in these comments are. This is exploitation of a vulnerable group of people whose brains lack the mechanisms to defend against this stuff. They can’t help it. That’s what psychosis is. This is awful.

[–] Schadrach@lemmy.sdf.org 2 points 1 day ago

If AI didn’t exist, it would’ve probably been Astrology or Conspiracy Theories or QAnon or whatever that ended up triggering this within people who were already prone to psychosis.

Or hearing the Beatles' White Album and believing it tells you that a race war is coming and you should work to spark it off, then hide in the desert for a time, only to return at the right moment to save the day and take over LA. That one caused several murders.

But the problem with ChatGPT in particular is that it validates the psychosis… that is very bad.

If you're sufficiently detached from reality, nearly anything validates the psychosis.

[–] randomname@sh.itjust.works 35 points 2 days ago (9 children)

I think that people give shows like The Walking Dead too much shit for having dumb characters, when people in real life are far stupider.

[–] Almacca@aussie.zone 9 points 1 day ago

Covid taught us that if nothing had before.
