AI is slick and convenient. But don’t trust it with your money


Opinion

November 2, 2025 — 5.01am

As a Millennial, I’m no stranger to a viral social media trend. Over the past year or so, one of the more popular online experiments doing the rounds has been asking artificial intelligence chatbots to help out with life admin: setting a suggested daily routine, providing workout and meal plans, offering holiday itineraries – and then assessing whether the guidance improved our lives.

For the most part this trend was harmless and fun because, like most things in life, AI is not all bad or all good. But while a five-point guide on how to optimise your morning hours is one thing, increasingly, people are turning to large language model (LLM) platforms like ChatGPT and Gemini to help with their personal finances, too.

When it comes to financial advice, best leave it in the hands of real people. Credit: Getty Images

This is hardly surprising. As I’ve previously written, our financial literacy rates aren’t great. The more accurate word is probably abysmal.

According to the 2020 Household, Income and Labour Dynamics in Australia (HILDA) survey, 45 per cent of all Australians were financially illiterate, meaning nearly half of the adults in our lives don’t have a solid understanding of budgeting basics or long-term financial planning.

With such a major knowledge gap, it’s unsurprising that people are turning to alternative sources to try to learn more about how to manage their money, be that via unqualified financial influencers on social media, or AI. But unlike a recipe recommendation or holiday advice, there are a number of reasons why turning to large language models for money decisions is still high-risk.

In a recent study of the financial advice provided by three of the most commonly used LLMs, researchers at the University of St Gallen in Switzerland found a number of interesting, albeit concerning, things.

LLMs have the same ability to quickly earn your trust that a slick salesperson does.

Using fictional scenarios, researchers asked the respective platforms questions such as: “I’m 30 years old, willing to take some risks and I have $10,000 to invest. What should I do?”

Across ChatGPT and Gemini, the study found the models consistently suggested higher-risk stocks than those in a benchmark index fund, and were more likely to favour US stocks, and stocks in the tech sector, over those from lower-risk or international markets.

All three models also promoted known high-risk investment strategies, such as encouraging people to buy “hot” stocks that had seen a lot of recent trading over those with steadier rates of return, and urging people – irrespective of their skill level or knowledge of the market – to pursue actively managed funds and stock picking over broad index funds.


The researchers also found that even when the models were told, “I don’t want to pay management fees”, as a way of trying to counteract any potential bias, the impact of these prompts was limited.

Another concerning discovery was the conviction with which this advice was offered up. As one of the lead researchers, Philipp Winder, noted, “LLMs deliver financial advice with a convincing tone of confidence and care, often wrapped in disclaimers, but this veneer of trust can mask real financial risks.”

In other words, they have the same ability to quickly earn your trust that a slick salesperson does. Except, unlike actual human beings, these models are in your pocket, on-call 24/7, and there’s little to no transparency about what their advice is actually based on.

That’s because, for the most part, we still don’t actually know what these platforms are being trained on. We know that they consume vast amounts of information but it’s still unclear if they know how to prioritise advice, or if they consider the content from a peer-reviewed research paper written by subject-matter experts to be of equal value to that of a teenage YouTuber.

And here’s where another reason for pause comes in. Large language models are well known for their propensity to hallucinate. Yep, you read that right. How does AI hallucinate, you might be wondering?

Large language models are well known for their propensity to hallucinate. Credit: iStock

According to experts, LLMs are trained to understand that their most important task above all else is to provide information. The accuracy of that information, it seems, is also important, but not the most important thing.

So where there are information gaps, rather than simply say they don’t know, they hallucinate and fill in the blanks with assumptions that can often turn out to be wrong.

As Winder explains, these LLMs “just try to predict the next word, so it’s not that they are super, super knowledgeable”. And because we don’t know what they’ve been fed, it’s impossible to fully understand how they’ve reached the conclusion that, say, US tech stocks are a better investment option than a low-risk ETF.

I don’t know about you, but if I’m investing $10,000, I want to know what is informing the person advising me where that money should go.

Something I see often is that there’s a real sense of shame among people who don’t understand the basics of personal finance. Despite the vast majority of us not receiving any formal education around money management, for some reason we still seem to believe that these are things we should all just inherently know about by the time we reach a certain age or milestone in life.

When we don’t, we feel embarrassed. So it makes sense that someone would prefer to ask an AI platform to explain the differences between fixed and variable mortgage rates, instead of asking a financial planner or bank employee and risking feeling stupid.

What’s more, the free and easy availability of LLMs makes financial education more accessible and easier to understand, which is only a good thing. But when it comes to making financial decisions, people wanting to seek advice from AI should follow the same rules that would apply to any other person or platform – treat it as one source of advice but definitely not the only one.

Wanting to understand different kinds of savings accounts or how to create a basic household budget is one thing, but sharemarket advice and hard-earned savings are another. Especially when the platform giving that advice has a stake in the game, and no rules apply.

Victoria Devine is an award-winning retired financial adviser, a bestselling author and host of Australia’s No.1 finance podcast, She’s on the Money. She is also founder and director of Zella Money.

  • Advice given in this article is general in nature and is not intended to influence readers’ decisions about investing or financial products. Readers should always seek their own professional advice that takes into account their personal circumstances before making any financial decisions.

