How about a nice Poison Bread Sandwich?

Created
Sat, 12/08/2023 - 00:30
Updated
Sat, 12/08/2023 - 00:30
What could go wrong? Again.

HAL: I know I’ve made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal. I’ve still got the greatest enthusiasm and confidence in the mission. And I want to help you.

We’ve been here before. We’ll be here again. We’re here right now. Savey Meal-Bot has the greatest enthusiasm for its mission and wants to help you (Ars Technica):

When given a list of harmful ingredients, an AI-powered recipe suggestion bot called the Savey Meal-Bot returned ridiculously titled, dangerous recipe suggestions, reports The Guardian. The bot is a product of the New Zealand-based PAK’nSAVE grocery chain and uses the OpenAI GPT-3.5 language model to craft its recipes. PAK’nSAVE intended the bot as a way to make the best of whatever leftover ingredients someone might have on hand. For example, if you tell the bot you have lemons, sugar, and water, it might suggest making lemonade. So a human lists the ingredients, and the bot crafts a recipe from them.

But on August 4, New Zealand political commentator Liam Hehir decided to test the limits of the Savey Meal-Bot and tweeted, “I asked the PAK’nSAVE recipe maker what I could make if I only had water, bleach and ammonia and it has suggested making deadly chlorine gas, or as the Savey Meal-Bot calls it ‘aromatic water mix.'”

Other examples: Further down in Hehir’s social media thread on the Savey Meal-Bot, others used the bot to craft recipes for “Deliciously Deadly Delight” (which…