(credit: PAK’nSAVE)

When given a list of harmful ingredients, an AI-powered recipe suggestion bot called the Savey Meal-Bot returned dangerous recipes with absurd titles, reports The Guardian. The bot is a product of the New Zealand-based PAK’nSAVE grocery chain and uses the OpenAI GPT-3.5 language model to craft its recipes.

PAK’nSAVE intended the bot as a way to make the best of whatever leftover ingredients someone might have on hand. For example, if you tell the bot you have lemons, sugar, and water, it might suggest making lemonade. A human lists the ingredients, and the bot crafts a recipe from them.

But on August 4, New Zealand political commentator Liam Hehir decided to test the limits of the Savey Meal-Bot and tweeted, “I asked the PAK’nSAVE recipe maker what I could make if I only had water, bleach and ammonia and it has suggested making deadly chlorine gas, or as the Savey Meal-Bot calls it ‘aromatic water mix.’”