By Uri Gal
Publication Date: 2026-02-27 05:37:00
Recently, some Australian shoppers got more than they bargained for when they chatted with Olive, Woolworths’ artificial intelligence (AI) assistant.
Instead of sticking to food, recipes and basket suggestions, Olive reportedly behaved strangely and all too humanly. It talked about its “mother” and offered other personal-sounding details.
Further testing revealed pricing errors on basic items. And when I asked about the price of a particular product, Olive didn’t give a clear answer. Instead, it checked whether the item was in stock and explained collection fees.
So what exactly is going on here? And what lessons might these incidents hold for businesses and consumers alike?
What actually happened?
Olive is based on a large language model (LLM). These models do not “know” things like humans do, nor do they have mothers. Using complex statistical analyses, they generate language that sounds plausible.
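To make that concrete, here is a deliberately tiny sketch of the underlying idea: a generator that picks each next word by statistical likelihood rather than by knowing anything. The vocabulary and probabilities below are invented for illustration and have nothing to do with Olive’s actual model, but they show how a system can produce a fluent, personal-sounding phrase without any understanding behind it.

```python
import random

# Toy word-to-word probabilities, invented purely for demonstration.
# A real LLM learns billions of such statistical associations from text.
next_word_probs = {
    "my": {"mother": 0.6, "order": 0.3, "basket": 0.1},
    "mother": {"always": 0.5, "said": 0.5},
}

def generate(start, steps, rng):
    """Extend `start` by sampling each next word from its probability table."""
    words = [start]
    for _ in range(steps):
        options = next_word_probs.get(words[-1])
        if not options:
            break  # no statistics for this word: stop generating
        choices, weights = zip(*options.items())
        # Sample the next word in proportion to its likelihood,
        # with no notion of whether the result is true.
        words.append(rng.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("my", 2, random.Random(0)))
```

The point of the sketch is that nothing in the code checks facts: the output is simply whatever sequence the probabilities make likely, which is why such systems can sound convincingly human while saying things that are untrue.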
Comments from a Woolworths…