“Hot enough for you?” Probably not, given the season. Or maybe the mercury has climbed since I wrote this column. Who’s to say? Certainly not Google’s AI Overviews or any other AI tool doing its robotic best to grasp our inner temperature, sifting what humans mean behind the words we say.
English is never easy. Ask Felipe, my Chilean barista, who fills the café with his bossa nova covers before his boss arrives. One day, an Ipanema travesty of Radiohead’s Creep mugging my senses, Felipe asked, “You like this song?” I gave a brave smile: “Yeah right”. Meaning no, though all Felipe heard was a double positive, so he turned it up.
AI struggles to fully grasp our inner temperature. Credit: Getty
With that mix-up in mind, imagine a chatbot’s chances of grappling with idiom, puns, nuance, vernacular. Words on the surface will often disguise their meaning beneath. If a robot heard about a pop-up taco truck charging like a wounded bull, it would likely run for cover. Even Aditya Joshi, when first arriving in Australia from India, was thrown by our way of saying things. In 2018, say, a colleague at the University of NSW asked “How are you going?” Joshi did his best to answer: “I’m taking the bus.”
Since then, doing a doctorate in natural language processing, Joshi has been helping AI tools communicate more reliably with non-American versions of English. Sentiment and sarcasm are the major hurdles, the algorithm struggling to isolate a speaker’s vibe imbued in the local dialects of here, India and England. His research bestie is BESSTIE, the Benchmark for Sentiment and Sarcasm Classification for Varieties of English, a system fed with Reddit posts and Google reviews (drawn from each target variety) to help the software decipher a comment’s drift.
“I love being ignored,” say, can mean the speaker loves to be overlooked, or doesn’t. Humans depend on context and tone – often emoji in written form – to determine which, plus the bias of the expression’s cultural history. “Don’t you just love…” – as a construction in Australian English – will often precede an object the speaker does not love, like bossa nova covers.
A similar confusion arose at Future Science Talks at the 2024 Sydney Fringe, Joshi telling the crowd “I grew up in India, in this beautiful city called Mumbai.” I took the line to be a gentle dig at Mumbai’s non-beauty, though Joshi was being sincere, he later told me. His birthplace is truly beautiful, making my misreading one more quandary for Grok or Perplexity or any other AI platform to tackle.
“Just as a vacuum cleaner sucks air,” says the sarcasm professor, “AI sucks data. But to be the most effective, a vacuum needs the right attachment.” The challenge, then, is to customise the “dialect attachments” for each English beyond the American default, especially when it comes to our homegrown knack for masked sentiment or inbuilt irony.
“Good chat” can mean just that, or often its opposite. “Not bad” is another weasel to wrangle. So too our fetish for qualified positives, such as “You’re not wrong”. Or derisive accolades: “Good effort, champ!” Phrase by phrase, Joshi and his team are equipping the robots to spot the difference, though always with a caveat, since one person’s welfare check – “How are you going?” – is another person’s transport inquiry. Question being, will a non-human conversationalist ever master Australian English? My hunch? Yeah-nah.