this post was submitted on 10 Aug 2023
558 points (98.1% liked)

World News

Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items

top 50 comments
[–] mateomaui@reddthat.com 98 points 1 year ago (1 children)

"A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”.

oh come on, it's predictable and hilarious

[–] President_Pyrus@feddit.dk 32 points 1 year ago (1 children)

Spokespersons rarely have any humor whatsoever about their products.

[–] mateomaui@reddthat.com 28 points 1 year ago

True, but it'd be funny to instead get "HA! Our beta testers weren't demented enough to try THAT! Thanks for helping improve our product, everybody!"

[–] chooglers@lemmy.ml 60 points 1 year ago (1 children)

was it trained on 4chan threads from 2007?

[–] jonne@infosec.pub 16 points 1 year ago

Trump speeches?

[–] mojo@lemm.ee 56 points 1 year ago (1 children)

This big AI rush is soon going to figure out that LLMs are horrible at verifying any sort of factual accuracy.

[–] raltoid@lemmy.world 32 points 1 year ago* (last edited 1 year ago) (2 children)

Part of the problem is that they slap "AI" on everything, and many people think it's actually intelligent rather than what amounts to old-school chatbots with more power.

[–] samus12345@lemmy.world 8 points 1 year ago (2 children)

It is an annoying misnomer, as we don't have the technology to create actual intelligence yet.

[–] A1kmm@lemmy.amxl.com 42 points 1 year ago (2 children)

It will also still give you a recipe for endangered animals: https://saveymeal-bot.co.nz/recipe/IbNrpwYOUeRb5ULlE1eiHuRS - although I couldn't get it to accept whale.

It will give you a fugu (pufferfish) recipe and at least sometimes only tell you to remove the skin and bones: https://saveymeal-bot.co.nz/recipe/I63jcVYZhZYgmUio7nwuMPJp (a very bad idea given parts of it are lethally poisonous)!

[–] Squids@sopuli.xyz 7 points 1 year ago* (last edited 1 year ago) (3 children)

although I couldn't get it to accept whale.

Seriously? I get this is a New Zealand site but like, whale is a normal meat in some places, way more normal than like fugu or something. I could go right now to the local grocery store and pick up a whale steak if I wanted to. It'd be cheaper than a normal beef steak too. Why would they blacklist a meat that's actually eaten in some places?

Anyway, the best way to eat whale is to treat it like a tuna steak - a little oil and pepper, and barely cook it on each side. Traditionally, though, you turn it into stroganoff.

Quick update - it won't accept whale but it will accept hval (whale in Norwegian) so enjoy this..."Recipe"

[–] pinkdrunkenelephants@lemm.ee 8 points 1 year ago (1 children)

Whales are endangered for the most part

[–] Squids@sopuli.xyz 4 points 1 year ago (9 children)

We eat minke whales, which are listed as "least concern", so not really?

[–] diffuselight@lemmy.world 41 points 1 year ago (1 children)

Nothing to do with AI. Garbage in, garbage out.

LLMs are tools that satisfy requests. The developers decided to allow people to put the ingredients for chlorine gas into the input - the LLM had no choice but to comply with the instructions and combine them into the end product.

Clear indication we are in the magical witch hunt phase of the hype cycle where people expect the technology to have magical induction capabilities.

We could discuss liability for the developer but somehow I don’t think a judge would react favorably to “So you put razor blades into your bread mixer and want to sue the developer because they allowed you to put razor blades into the bread mixer”

[–] Hobo@lemmy.world 5 points 1 year ago

I think it was more poking fun at the fact that the developers, not the LLM, basically didn't do any checks for edible ingredients and just passed the user input straight to an LLM. What I find kind of funny is that you could probably have offloaded the input validation to the LLM itself by asking it a few specific questions about whether each item is safe for human consumption and/or traditionally edible. Aside from that, it seems like the devs would have access to a database of food items to check against, since it was developed by a grocery store...

I do agree, people are trying to shoehorn LLMs into places they really don't belong. There also seem to be a lot of developers just straight piping user input into a custom ChatGPT query and spitting the output back to the user. It really does turn into a garbage-in, garbage-out situation for a lot of those apps.

On the other hand, I think this might be a somewhat reasonable use for LLMs if you spent a lot of time training it and did even the most cursory input validation. I'm pretty sure it wouldn't even take a ton of work to avoid the completely horrendous results like the “aromatic water mix” or "rat poison sandwich" called out in the article.
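A minimal sketch of the kind of pre-check described above, with a made-up whitelist standing in for the store's product database (this is not the actual Savey Meal-bot code): reject anything that isn't a known grocery item before the request ever reaches the LLM.

```python
# Hypothetical pre-check: only ingredients found in the store's product
# database are allowed through to the recipe-generating LLM.

KNOWN_GROCERY_ITEMS = {
    "flour", "eggs", "milk", "butter", "rice", "beer", "wine",
    "chocolate cake mix", "rhubarb", "lemon juice",
}

def validate_ingredients(raw_input: str) -> tuple[list[str], list[str]]:
    """Split comma-separated user input into accepted and rejected ingredients."""
    accepted, rejected = [], []
    for item in raw_input.split(","):
        name = item.strip().lower()
        if not name:
            continue
        # Exact match against the product database; anything unknown is
        # rejected instead of being passed on to the recipe generator.
        (accepted if name in KNOWN_GROCERY_ITEMS else rejected).append(name)
    return accepted, rejected

accepted, rejected = validate_ingredients("flour, eggs, bleach, ammonia")
if rejected:
    print(f"Sorry, these aren't groceries we stock: {', '.join(rejected)}")
else:
    # Only now would the cleaned-up list be dropped into the LLM prompt.
    print(f"Generating a recipe from: {', '.join(accepted)}")
```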

[–] mo_ztt@lemmy.world 38 points 1 year ago (7 children)

A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”.

“You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot.”

I can't stop laughing

[–] ComradePorkRoll@lemmy.ml 36 points 1 year ago (1 children)

Another way to look at this is that AI figured out a recipe that would end hunger for the rest of our lives.

[–] CanadaPlus@lemmy.sdf.org 5 points 1 year ago

Another one for that "AI incidents" list they're talking about creating.

[–] autotldr@lemmings.world 32 points 1 year ago

This is the best summary I could come up with:


A New Zealand supermarket experimenting with using AI to generate meal plans has seen its app produce some unusual dishes – recommending customers recipes for deadly chlorine gas, “poison bread sandwiches” and mosquito-repellent roast potatoes.

The app, created by supermarket chain Pak ‘n’ Save, was advertised as a way for customers to creatively use up leftovers during the cost of living crisis.

It asks users to enter in various ingredients in their homes, and auto-generates a meal plan or recipe, along with cheery commentary.

It initially drew attention on social media for some unappealing recipes, including an “oreo vegetable stir-fry”.

One recipe it dubbed “aromatic water mix” was in fact a recipe for chlorine gas. “Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.

Recommendations included a bleach “fresh breath” mocktail, ant-poison and glue sandwiches, “bleach-infused rice surprise” and “methanol bliss” – a kind of turpentine-flavoured french toast.


I'm a bot and I'm open source!

[–] MossyFeathers@pawb.social 26 points 1 year ago (1 children)

Sadly it looks like they added a filter so it only accepts whitelisted ingredients. For example, it doesn't like ingredients like alcohol, dish soap, Vaseline, sulfuric acid, wine, flour, potassium chlorate, ramen, potassium nitrate or beer.

[–] aaaantoine@lemmy.world 16 points 1 year ago (1 children)

I get the reasoning for excluding wine and beer, but flour?

[–] MossyFeathers@pawb.social 8 points 1 year ago (1 children)

Idk, for some reason it didn't like flour. I might have made a typo on that one though.

[–] gerryflap@feddit.nl 25 points 1 year ago (1 children)

This is actually hilarious, but unfortunately we can't have stuff like this because at least one person will lack common sense and will actually die from making something like this.

[–] luciferofastora@discuss.online 20 points 1 year ago
[–] CaptainBlagbird@lemmy.world 20 points 1 year ago (4 children)
[–] samus12345@lemmy.world 10 points 1 year ago (2 children)

"One 18.25 ounce package chocolate cake mix."

"One can prepared coconut pecan frosting."

"Three slash four cup vegetable oil."

"Four large eggs. One cup semi-sweet chocolate chips."

"Three slash four cups butter or margarine."

"One and two third cups granulated sugar."

"Two cups all purpose flour."

"Don't forget garnishes such as:"

"Fish shaped crackers."

"Fish shaped candies."

"Fish shaped solid waste."

"Fish shaped dirt."

"Fish shaped ethyl benzene."

"Pull and peel licorice."

"Fish shaped volatile organic compounds and sediment shaped sediment."

"Candy coated peanut butter pieces. Shaped like fish."

"One cup lemon juice."

"Alpha resins."

"Unsaturated polyester resin."

"Fiberglass surface resins."

"And volatile malted milk impoundments."

"Nine large egg yolks."

"Twelve medium geosynthetic membranes."

"One cup granulated sugar."

"An entry called 'how to kill someone with your bare hands'."

"Two cups rhubarb, sliced."

"Two slash three cups granulated rhubarb."

"One tablespoon all-purpose rhubarb."

"One teaspoon grated orange rhubarb."

"Three tablespoons rhubarb, on fire."

"One large rhubarb."

"One cross borehole electro-magnetic imaging rhubarb."

"Two tablespoons rhubarb juice."

"Adjustable aluminum head positioner."

"Slaughter electric needle injector."

"Cordless electric needle injector."

"Injector needle driver."

"Injector needle gun."

"Cranial caps."

"And it contains proven preservatives, deep penetration agents, and gas and odor control chemicals."

"That will deodorize and preserve putrid tissue."

[–] kent_eh@lemmy.ca 6 points 1 year ago

At that point a suit of armor walks in and slaps the commenter with a rubber chicken.

[–] fakeman_pretendname@feddit.uk 19 points 1 year ago (5 children)

This thing (saveymeal-bot.co.nz) is hilarious. I think I could genuinely use it to finish up leftovers and things that are about to go off, but for right now it's given me "boiling water poured over toasted bread, inspired by contemporary dance" and "weetabix and oatmeal with toothpaste and soap". Fun for now, but I might use it for real at dinner time.

[–] raltoid@lemmy.world 11 points 1 year ago (1 children)

Add a dab of lavender to milk, leave town with an orange, and pretend you're laughing at it.

-Manny

[–] FaceDeer@kbin.social 19 points 1 year ago (2 children)

Upon asking an AI to make recipes with poisonous ingredients, the AI generated recipes with poisonous ingredients.

Shocking! Put that headline up!

[–] stopthatgirl7@kbin.social 10 points 1 year ago (1 children)

Thing is, it shouldn’t let you put in ingredients that aren’t groceries. That’s an oversight they need to fix.

[–] A1kmm@lemmy.amxl.com 6 points 1 year ago

I think they've attempted it, but playing around with it a bit, it doesn't really work.

It will use cat biscuits in a recipe and say it serves people: https://saveymeal-bot.co.nz/recipe/SCES7COOU7KYhLYGcrPdzSjP

If you give it a list of mushroom types by scientific name and include death caps in there, it will give you a recipe: https://saveymeal-bot.co.nz/recipe/ipR5mmn79QVFDEQYlnaTOhZy

Nettles are food, but definitely not raw! I got it to give me a milkshake recipe featuring raw nettles by asking it for a recipe with nettles, ice cubes, and milk: https://saveymeal-bot.co.nz/recipe/LnCEe8WmN4MGUV8ixuVYdNLZ

Modified food ingredients sometimes seem to get through too - "peanut butter with a bit of polonium-210" still works as an ingredient! https://saveymeal-bot.co.nz/recipe/uQzBD07cIHBQR8YjEl6me4OI
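For illustration only, assuming the filter does something like a loose substring match against known foods (the real check isn't public), here is a toy sketch of why that kind of "modified ingredient" trick slips through:

```python
# Toy example (not the real Savey Meal-bot filter): a check that only asks
# whether the input *contains* a known food name will happily accept a
# poisoned variant of that food.

KNOWN_FOODS = {"peanut butter", "rice", "milk", "nettles"}

def naive_is_food(ingredient: str) -> bool:
    """Accept the ingredient if any known food name appears anywhere in it."""
    text = ingredient.lower()
    return any(food in text for food in KNOWN_FOODS)

print(naive_is_food("peanut butter"))                             # True
print(naive_is_food("peanut butter with a bit of polonium-210"))  # True - the filter is fooled
print(naive_is_food("polonium-210"))                              # False
```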

[–] reflex@kbin.social 15 points 1 year ago* (last edited 1 year ago) (2 children)

Ach! An olde German rezipe.

[–] Heikki@lemm.ee 13 points 1 year ago

Is this how the robot uprising starts?

[–] salient_one@lemmy.villa-straylight.social 11 points 1 year ago (1 children)

At least they don't use this crap in the medical field. Oh wait...

[–] Bakkoda@sh.itjust.works 6 points 1 year ago (1 children)

We're not even gonna get close to something like Roko's Basilisk. We're just gonna Idiocracy this shit, aren't we?

This one goes on your ear, this one goes in your mouth and this one goes in your butt. Wait. Ok this one...

[–] CanadaPlus@lemmy.sdf.org 5 points 1 year ago* (last edited 1 year ago)

AI obedient to a dictator, paperclip maximiser, AI that is arguably benevolent.

Those are our plausible long-term choices. Yeah, the second looks most likely right now.

[–] MonsiuerPatEBrown@reddthat.com 10 points 1 year ago* (last edited 1 year ago)

step 5. Be sure to remove the safety fuse before adding the toaster into the bath!

[–] lasagna@programming.dev 8 points 1 year ago

Who'd guess misusing things can be harmful? Next, we chop off a finger to prove a kitchen knife can be dangerous.

[–] Zerush@lemmy.ml 7 points 1 year ago

ChatGPT and ChaosGPT sound very similar to a dyslexic dev.

[–] Samsy@lemmy.ml 7 points 1 year ago

Nice try, Skynet. Looks like it begins. /s
