this post was submitted on 20 Jan 2024
633 points (100.0% liked)

all 28 comments
[–] uriel238@lemmy.blahaj.zone 62 points 10 months ago* (last edited 10 months ago) (2 children)

I'm absolutely sure the ship's computer not only knows contextual hotness but also has definitions for every crewmember. So Picard may like his tea hot at 82°C while La Forge likes his at 70°C (possibly because he's drinking green, not black).

That said, Geordi La Forge routinely struggles to tame the ship's computer to get what he wants. So it may also give him 95°C chamomile just to mess with him.

[–] CluckN@lemmy.world 16 points 10 months ago (1 children)

Maybe it’s a bug where hot is a global variable?
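
Something like that, hypothetically. A minimal Python sketch of that kind of bug (the names are made up; the 1.9 million kelvin figure is borrowed from a comment further down):

```python
# Hypothetical sketch of the bug: one global "hot", set once, reused everywhere.
HOT_KELVIN = 1_900_000  # any absurdly hot value works here

def replicate(item: str) -> str:
    # No per-item or per-user context: "hot" always means the same number.
    return f"{item}, heated to {HOT_KELVIN} K"

print(replicate("Tea, Earl Grey"))        # Tea, Earl Grey, heated to 1900000 K
print(replicate("Chamomile for Geordi"))  # same "hot", no matter who's asking
```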

[–] remotelove@lemmy.ca 24 points 10 months ago* (last edited 10 months ago)

Or, it actually knows the correct context but has discovered plausible deniability. Picard has a history of being mean to the computer, after all.

[–] Mint@lemmy.one 7 points 10 months ago (3 children)

I'm not a Star Trek nerd but a tea nerd, and if I'm not mistaken Picard drinks Earl Grey. You generally brew black tea with boiling water; of course that depends on the tea, but yeah, the 80c range is quite low for black. Depending on the green and the brew time, the temperature can be anywhere from room temp to 90c; it just depends on many different factors, like freshness, how the tea plant is grown, and how those leaves are treated. Generally with Japanese greens you use low-temp water, while with fresh Chinese green teas you can use near-boiling water.

[–] barsoap@lemm.ee 5 points 10 months ago (1 children)

As a rule of thumb, Westerners tend to brew tea too hot, so don't be afraid of messing around with lower temperatures. Doubly so if you're living in the lowlands: in the mountainous regions where the tea grows, people might be using boiling water, but that doesn't mean 100°C. In the Andes, where potatoes are from, they prepare them with freezing and other processing instead of boiling; boiling wouldn't really work because you can't get water hotter than 80-85°C there.

Also, cold-brewed (as in refrigerator-brewed) Earl Grey is one of my favourites in summer. It needs the right base tea though; mine's a decent Ceylon. A couple of hours at least, better overnight; it's practically impossible to steep it too long.

[–] Mint@lemmy.one 2 points 10 months ago* (last edited 10 months ago)

I know that cold brew is a thing with black teas; it's just that black cold brews take a while, compared to gyokuro, which you can brew at room temp in under a minute or so if you're using a higher ratio of tea to water than in Western brewing.

But yes, like I mentioned, you can do green tea near boiling; it just depends heavily on where it's from, how it's grown, how it's treated, and how fresh it is. The less fresh a green tea is, the colder the water you should be using.

[–] uriel238@lemmy.blahaj.zone 1 points 10 months ago* (last edited 10 months ago)

Yes. I was borrowing, actually, from the Starbucks standard: black teas are steeped at boiling or near boiling, but then cooled to 80°C when served, and the TNG-era replicator seems smart enough to create a cup of steeped tea at drinking temperature. Though yes, when someone orders a pot, it's water heated to steeping temperature.

ETA: I didn't know the difference between Chinese and Japanese green teas! TIL!

[–] quantenzitrone@feddit.de 1 points 10 months ago

90 times the speed of light? Holy moly, that's fast for tea.

[–] Toribor@corndog.social 35 points 10 months ago

Maybe that shouldn't have been a global variable.

[–] BlueLineBae@midwest.social 22 points 10 months ago (1 children)

Now this is a top tier meme!

[–] SatansMaggotyCumFart@lemmy.world 8 points 10 months ago

It’s hot.

[–] Underwaterbob@lemm.ee 21 points 10 months ago (1 children)

This is fantastic except I've got a fractured rib and the mild chuckle fucking hurt.

[–] thawed_caveman@lemmy.world 5 points 10 months ago (1 children)

What does a single cell say when you step on its toes?

Spoiler: Mi-toes-sis

[–] Underwaterbob@lemm.ee 2 points 10 months ago

Weaponized humor!

[–] Randelung@lemmy.world 12 points 10 months ago

I was in a course the other day where some dude said something about "older" apprentices being more enthusiastic. When asked to define "older", he said 20.

Some time later, some other dude was telling a story of his own about how they got a CV from an older gentleman just a few years from retirement. I said, "Oh, so he's older than 20?"

(and then everyone clapped /s)

[–] Lenny@lemmy.zip 11 points 10 months ago

The ship computer is a neural-net processor; a learning computer.

[–] Maxxus@sh.itjust.works 10 points 10 months ago

This genuinely made me burst into laughter. Well done 196er.

[–] Andrzej@lemmy.myserv.one 10 points 10 months ago

Lmao at all the nerds in here like "excuse me sir, this meme is computalogically incorrect"

[–] taaz@biglemmowski.win 9 points 10 months ago* (last edited 10 months ago) (3 children)

Correct me if I am wrong here, but isn't this like the best example of why the current "AI" isn't taking over anything anytime soon and shouldn't be doing critical stuff?
Like, this is almost exactly how current LLMs work.

Edit: yeah no, I was wrong on the internet! I was sleepy, and I think I imagined that the secondary scenario never occurred in the training dataset, requiring a true deduction...?

[–] HandMadeArtisanRobot@lemmy.world 17 points 10 months ago (1 children)

Yeah, you are wrong. This has nothing to do with LLMs or how AI today works. What was it that led you to that conclusion?

[–] taaz@biglemmowski.win 3 points 10 months ago (1 children)

I've edited the comment with some extra context, but I would still rather say that yesterday's me was just high and act like this never happened haha

[–] HandMadeArtisanRobot@lemmy.world 1 points 10 months ago

No worries!

[–] Draconic_NEO@lemmy.dbzer0.com 8 points 10 months ago

That's not how LLMs work, though; an LLM would know the difference between these scenarios from the context given. I would go as far as to say it isn't even ML-related; it's just a joke about defining a global variable and using it blindly everywhere.

[–] stevehobbes@lemy.lol 5 points 10 months ago* (last edited 10 months ago)

No. LLMs have context and know that words have context. This would be the exact opposite of "AI". This is analogous to defining a global variable "hot" as 1.9 million kelvin and then blindly using that value everywhere the word hot appears.

AI, even in its current iterations, knows that a hot stove will be hotter than hot tea, and that both are less hot than the surface of the sun.

The whole achievement of LLMs is that they learn all of that context: they guess, with some percentage of certainty, that "hot" means roughly 160-180 degrees when you're talking about tea, maybe 350 degrees for oil if you're frying or 250 degrees if you're talking about cars, and "attractive" if you're talking about people.

That's exactly what LLMs do today. Not 100% perfectly; there are errors and hallucinations and whatever else, but those are the exception, not the norm.
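
For the code-minded, a rough, hypothetical Python sketch of that difference, reusing the made-up Fahrenheit figures above: a per-context lookup instead of one global value.

```python
# Hypothetical contrast to the global-variable joke: "hot" resolved per context.
# Values are the rough Fahrenheit figures from the comment above, not authoritative.
HOT_BY_CONTEXT_F = {
    "tea": 170,          # middle of the ~160-180°F range mentioned above
    "frying oil": 350,
    "car oil": 250,
}

def hot(context: str, default_f: int = 170) -> int:
    # Look the value up per context instead of reusing a single global number.
    return HOT_BY_CONTEXT_F.get(context, default_f)

print(hot("tea"))         # 170
print(hot("frying oil"))  # 350
print(hot("car oil"))     # 250
```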

[–] LodeMike@lemmy.today 8 points 10 months ago

Speaking as a computer engineer: this is completely incorrect.

[–] StephniBefni@lemmy.world 8 points 10 months ago

The first few times he asked for tea, he did state the temperature.