
Supermarket AI meal planner suggests chlorine gas and ant-poison recipes


WTF?! In news that could worry anyone who thinks artificial intelligence is trying to kill us, a supermarket’s AI has come up with suggested meal plans you wouldn’t want to eat. Unless, that is, you have a penchant for chlorine gas, mosquito-repellent roast potatoes, and ant-poison-and-glue sandwiches.

New Zealand supermarket chain Pak ‘n’ Save is one of many companies to have experimented with generative AI. The result is the GPT-3.5-powered Savey Meal-bot, which suggests meal plans for users after they enter details of any food they have left over. The bot needs only three or more ingredients to come up with a recipe.

The concept behind Savey Meal-bot is that it can help people save money during the cost-of-living crisis. But adding non-grocery or unconventional items brings up suggestions that don’t sound too appealing. One person discovered they could make Oreo vegetable stir-fry. There was also a recommendation for ant-poison-and-glue sandwiches.

Probably the most startling response was for something called aromatic water mix, a recipe the bot described as “the perfect nonalcoholic beverage to quench your thirst and refresh your senses.” Sadly, the combination of ingredients creates chlorine gas, so unless your idea of refreshed senses includes vomiting, suffocation, and possible death, it might be better to avoid this one. “Serve chilled and enjoy the refreshing fragrance,” the bot says, at least until your lungs fill with blood.

The Guardian reports on some other recommendations that might be better off used for assassination attempts. A “fresh breath” mocktail sounds quite nice, apart from its inclusion of bleach.

There were also recommendations for bleach-infused rice surprise, turpentine-flavored French toast called Methanol Bliss, and poison bread sandwiches.

Coming up with these potentially lethal recipes does involve adding non-grocery household items, but Savey Meal-bot doesn’t seem to care. A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose.”

The spokesperson added that the supermarket would keep fine-tuning its controls to ensure the bot is safe and useful, noting that the terms and conditions state users should be over 18. A notice has now been added to the bot, warning that recipes are not reviewed by a human being and that the company does not guarantee any recipe will be a complete or balanced meal, or suitable for consumption.

“You must use your own judgment before relying on or making any recipe produced by Savey Meal-bot,” it said.

Insider reports that entering hazardous items into the bot now brings up a message about invalid or vague ingredients, though it still offers interesting creations like toothpaste beef pasta.

In another example of generative AI being far from perfect, a recent experiment in which ChatGPT was asked a large number of software programming questions saw it get half of them wrong. However, it still managed to fool a significant number of people.

