this post was submitted on 08 Jul 2024
825 points (96.8% liked)

Science Memes
[–] Rolando@lemmy.world 13 points 4 months ago (2 children)

Contemporary LLMs are static, but LLMs are not static by definition.

[–] wizardbeard@lemmy.dbzer0.com 1 points 4 months ago (1 children)

Could you point me towards one that isn't? Or is this still theoretical?

I'm really trying not to be rude, but there's a massive amount of BS being spread around based on what is potentially, theoretically possible with these things. AI is in a massive bubble right now, with life-changing amounts of money on the line. A lot of people have a very vested interest in everyone believing that the theoretical possibilities are just a few months or years away from reality.

I've read enough Popular Science magazine, and heard enough "This is the year of the Linux desktop," to take claims about where technological advances are absolutely going with a grain of salt.

[–] match@pawb.social 7 points 4 months ago

Remember that Microsoft chatbot that 4chan turned into a Nazi over the course of a week? That was a self-updating language model built on 2010s technology (versus the small-country-sized energy drain of GPT-4).

[–] CeeBee_Eh@lemmy.world 0 points 4 months ago

But they are. There's no feedback loop or continuous training happening. Once an instance or conversation is done, all that context is gone; it's never integrated directly into the model as it happens. That kind of continuous integration is more or less the way our brains work: every stimulus, every thought, every sensation, every idea is added to the brain's model as it happens.
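The distinction being argued here can be sketched as a toy (all names are made up for illustration; this is not how any real LLM is implemented): a "static" model whose parameters are frozen at deployment and forget every conversation, versus an "online" model that folds each interaction back into its parameters.

```python
class StaticModel:
    """Parameters are frozen after training; context lives only per call."""

    def __init__(self, weights):
        self.weights = dict(weights)

    def chat(self, prompt):
        # The prompt influences this reply only; nothing persists afterwards.
        return f"reply({prompt})"


class OnlineModel(StaticModel):
    """Hypothetical variant with a feedback loop: each exchange updates weights."""

    def chat(self, prompt):
        reply = super().chat(prompt)
        # Toy "continuous training" step: every interaction changes the model.
        self.weights["updates"] = self.weights.get("updates", 0) + 1
        return reply


static = StaticModel({"updates": 0})
online = OnlineModel({"updates": 0})
for msg in ["hi", "how are you?"]:
    static.chat(msg)
    online.chat(msg)

print(static.weights["updates"])  # 0: the conversations left no trace
print(online.weights["updates"])  # 2: every exchange altered the model
```

Today's deployed chatbots behave like `StaticModel`: the weights only change when the vendor runs a new training cycle offline, not as a side effect of talking to users.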