this post was submitted on 04 Jun 2025
113 points (98.3% liked)
Showerthoughts
34840 readers
426 users here now
A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted clever little truths, hidden in daily life.
Here are some examples to inspire your own showerthoughts:
- Both “200” and “160” are 2 minutes in microwave math
- When you’re a kid, you don’t realize you’re also watching your mom and dad grow up.
- More dreams have been destroyed by alarm clocks than anything else
Rules
- All posts must be showerthoughts
- The entire showerthought must be in the title
- No politics
- If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
- A good place for politics is c/politicaldiscussion
- Posts must be original/unique
- Adhere to Lemmy's Code of Conduct and the TOS
If you made it this far, showerthoughts is accepting new mods. This community is generally tame so it's not a lot of work, but having a few more mods would help reports get addressed a little sooner.
What's it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed the report, the message goes away and you never worry about it.
founded 2 years ago
you are viewing a single comment's thread
Is it? Is random variance the source of all hallucinations? I think not; it's more that they don't understand what they're generating, they're just predicting the most statistically probable next token.
Yeah, they aren't trained to produce "correct" responses, just plausible-looking ones; they aren't truth systems. However, I'm not sure what a truth system would even look like. At a certain point truth/fact becomes subjective, which means we probably have a fundamental problem with how we think about and evaluate these systems.
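To make the "most probable next token" point concrete, here's a toy sketch; the prompt, vocabulary, and probabilities are invented, and real models work over learned embeddings rather than a hand-written table, but the selection step looks roughly like this:

```python
import random

# Toy next-token distribution for the prompt "The sky is".
# The probabilities reflect training-set frequency, not truth.
next_token_probs = {"blue": 0.70, "falling": 0.15, "green": 0.10, "velvet": 0.05}

def greedy(probs):
    # Always pick the single most likely continuation.
    return max(probs, key=probs.get)

def sample(probs, temperature=1.0):
    # Sampling with temperature reweights the distribution and draws from it,
    # so less likely (and possibly false) tokens can still come out.
    weights = [p ** (1.0 / temperature) for p in probs.values()]
    return random.choices(list(probs), weights=weights, k=1)[0]

print(greedy(next_token_probs))                    # "blue"
print(sample(next_token_probs, temperature=1.5))   # any of the four tokens
```

Nothing in either function checks whether the chosen token is true, only how often it followed similar text in training, which is the gap the comment above is pointing at.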
I mean, it's the whole reason programming languages were created, natural language is ambiguous.
Yeah, the existence of solipsism drives the point about truth home. Thing is, LLMs outright lie without knowing they're lying, because there's no understanding there. It's statistics at the token level.
AI is not my field, so I don't know, either.
If the training data contains 800 sentences (or whatever chunk of information it uses) about what color a ball is, averaging over them can produce "red" when the current question calls for "blue", or it can mix in information about a different type of ball, because the model doesn't understand what kind of ball it's talking about. It might be randomness, it might be averaging, or a combination of both.
Like if asked "what color is a basketball" and the training set includes a lot of custom color combinations for each team, it might return a combination of colors that matches no team, like brown (default leather) and yellow. It could give the same answer if you asked for an example of a basketball that matches team colors, because it might keep the default color from a ball that just has a team logo.
To someone who doesn't know the training set, it would probably look like the model made something up. Even to someone who does know it, it's impossible to tell whether the result is random, comes from not knowing what it's talking about, or stems from some other less obvious connection that combined the two and led to the yellow-and-brown result.
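The frequency-driven mixing described above can be sketched as a toy simulation; the colors and counts here are entirely made up, standing in for how often each pairing might appear in training text:

```python
import random

# Invented counts of (basketball, color) pairings in hypothetical training data.
color_counts = {
    "orange": 50,   # default leather basketball
    "brown": 30,    # older default leather
    "yellow": 10,   # one team's custom ball
    "purple": 10,   # another team's custom ball
}

def sample_color(counts):
    # Draw a color in proportion to training frequency. Note there is no
    # notion of which team (if any) is being described.
    colors, weights = zip(*counts.items())
    return random.choices(colors, weights=weights, k=1)[0]

# Two independent draws can yield "brown" and "yellow": a combination no
# real team uses, i.e. a plausible-looking but made-up answer.
pair = (sample_color(color_counts), sample_color(color_counts))
print(pair)
```

Each draw is individually well-grounded in the (hypothetical) data; it's only the combination that's fabricated, which is exactly why such outputs are hard to classify as random noise versus a genuine mixed-up association.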