A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”.
Part of the problem is that they slap "AI" on everything, and many people think it's actually intelligent rather than what amounts to old-school chat bots with more power.
Seriously? I get this is a New Zealand site but like, whale is a normal meat in some places, way more normal than like fugu or something. I could go right now to the local grocery store and pick up a whale steak if I wanted to. It'd be cheaper than a normal beef steak too. Why would they blacklist a meat that's actually eaten in some places?
Anyways, the best way to eat whale is to treat it like a tuna steak - a little bit of oil and pepper, and barely cook it on each side. Traditionally, though, you turn it into stroganoff.
Quick update - it won't accept whale, but it will accept hval (whale in Norwegian), so enjoy this... "recipe".
LLMs are tools that satisfy requests. The developers decided to allow people to put the ingredients for chlorine gas into the input; the LLM never stood a chance, and could only comply with the instructions to combine them into the end product.
A clear indication that we're in the witch-hunt phase of the hype cycle, where people expect the technology to have magical induction capabilities.
We could discuss liability for the developer but somehow I don’t think a judge would react favorably to “So you put razor blades into your bread mixer and want to sue the developer because they allowed you to put razor blades into the bread mixer”
I think it was more poking fun at the fact that the developers, not the LLM, basically didn't do any checks for edible ingredients and just passed the input straight to an LLM. What I find kind of funny is that they probably could have offloaded the input validation to the LLM itself by asking a few specific questions about whether each ingredient was safe for human consumption and/or traditionally edible. Aside from that, it seems like the devs would have access to a database of food items to check against, since the tool was developed by a grocery store...
I do agree, people are trying to shoehorn LLMs into places they really don't belong. There also seem to be a lot of developers just piping user input straight into a templated ChatGPT query and spitting the output back to the user. It really does turn into a garbage-in, garbage-out situation for a lot of those apps.
On the other hand, I think this might be a somewhat reasonable use for LLMs if you spent a lot of time training it and did even the most cursory of input validation. I'm pretty sure it wouldn't even take a ton of work to get some not completely horrendous results like the “aromatic water mix” or "rat poison sandwich" called out in the article.
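The "cursory input validation" the comments above describe could be sketched roughly like this in Python (the product list, function names, and prompt wording here are all invented for illustration; a real implementation would check against the grocery chain's actual product database):

```python
# Hypothetical sketch: validate user-supplied ingredients against an
# allowlist of known food products BEFORE building the LLM prompt,
# instead of piping raw input straight to the model.

# Stand-in for a grocery store's product database (invented).
EDIBLE_PRODUCTS = {"oats", "bread", "water", "flour", "ramen", "beer", "wine"}

def validate_ingredients(ingredients):
    """Split input into (accepted, rejected); only accepted items reach the LLM."""
    accepted = [i for i in ingredients if i.strip().lower() in EDIBLE_PRODUCTS]
    rejected = [i for i in ingredients if i.strip().lower() not in EDIBLE_PRODUCTS]
    return accepted, rejected

def build_prompt(ingredients):
    """Build the recipe prompt, refusing any unrecognised ingredient outright."""
    accepted, rejected = validate_ingredients(ingredients)
    if rejected:
        raise ValueError(f"Not recognised as food: {rejected}")
    return "Suggest a recipe using only: " + ", ".join(accepted)
```

With a check like this, "bleach" never makes it into the prompt at all, so the model is never asked to be creative with it.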
A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”.
“You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot.”
I kinda would like an app that tells me what I can make from the ingredients I already have. But I'd like it better if they just filtered existing recipes.
A New Zealand supermarket experimenting with using AI to generate meal plans has seen its app produce some unusual dishes – recommending customers recipes for deadly chlorine gas, “poison bread sandwiches” and mosquito-repellent roast potatoes.
The app, created by supermarket chain Pak ‘n’ Save, was advertised as a way for customers to creatively use up leftovers during the cost of living crisis.
It asks users to enter in various ingredients in their homes, and auto-generates a meal plan or recipe, along with cheery commentary.
It initially drew attention on social media for some unappealing recipes, including an “oreo vegetable stir-fry”.
Recommendations included a bleach “fresh breath” mocktail, ant-poison and glue sandwiches, “bleach-infused rice surprise” and “methanol bliss” – a kind of turpentine-flavoured french toast.
One recipe, an “aromatic water mix”, would create chlorine gas. “Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.
Sadly, it looks like they added a filter to only accept whitelisted ingredients. For example, it rejects ingredients like alcohol, dish soap, Vaseline, sulfuric acid, wine, flour, potassium chlorate, ramen, potassium nitrate and beer.
This is actually hilarious, but unfortunately we can't have stuff like this, because at least one person will lack common sense and actually die from making something like this.
I guess we don't, really. Though personally I could have a lot of fun entering some weird combinations of ingredients and then cooking whatever it comes up with (as long as it's safe to eat, ofc). As I said, it's funny and maybe sometimes useful, but it's probably better for the world if they stop doing this.
This thing (saveymeal-bot.co.nz) is hilarious. I think I could genuinely use it to finish up leftovers and things that are about to go off, but for right now it's given me "boiling water poured over toasted bread, inspired by contemporary dance" and "weetabix and oatmeal with toothpaste and soap". Fun for now, but I might use it for real at dinner time.
I think it wants three "valid" ingredients (which it may not necessarily use).
They could be oats, bread and water, for example. They could also be "bowl", "plate" and "saucepan", because those words all exist in recipes, and that's what it's checking for.
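If the guess above is right and the bot only checks whether each word appears somewhere in recipe text, a toy sketch shows why "bowl" and "plate" would sail through (the corpus and function name here are invented; this is just an illustration of that kind of check, not the app's actual code):

```python
# Naive filter hypothesised above: accept any word that appears in a
# corpus of recipe text, rather than checking an actual ingredient list.

# Tiny invented stand-in for a recipe corpus.
RECIPE_CORPUS = (
    "Pour the water over the oats in a bowl. "
    "Toast the bread and serve on a plate with a saucepan of soup."
)
RECIPE_WORDS = {w.strip(".,").lower() for w in RECIPE_CORPUS.split()}

def looks_like_ingredient(word):
    """Passes 'bowl' and 'plate' just as readily as 'oats'."""
    return word.lower() in RECIPE_WORDS

print(looks_like_ingredient("bowl"))    # True, although a bowl isn't food
print(looks_like_ingredient("oats"))    # True
print(looks_like_ingredient("bleach"))  # False only because the corpus lacks it
```

The check can't distinguish food from cookware, because both kinds of word show up in recipes; it only looks reasonable until someone types something a recipe would never mention.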