Nick Grimm: When you’re about to make a big purchase, you might seek advice from your friends and family or maybe even a professional advisor. But what about turning to artificial intelligence to make sense of it all? A growing number of Australians are asking AI chatbots like ChatGPT and Copilot for financial advice. But guess what? Financial experts warn AI models are prone to making mistakes and will often suggest high risk investments. Scout Wallen reports.
Scout Wallen: When Georgina Doll was looking to buy her first home, she used an AI chatbot to help compare deals from lenders.
Georgina Doll: I would put in, you know, a few examples of different interest rates and I would get it to like model the repayments. You know, also if I wanted to shorten my home loan duration, I’d get it to model the difference in payments per month and total interest.
Scout Wallen: Like most of us, she never studied finance. So the chatbot was helping her understand these tricky financial concepts in a relatable way.
Georgina Doll: It almost feels like you are speaking to a consultant sometimes. Or a tutor. And they just give you the exact answer you need straight away. It’s very helpful.
Scout Wallen: Georgina Doll was using AI to crunch the raw numbers and calculate her monthly repayments. And she still checked if the AI had got it wrong. But others are using chatbots not only to crunch numbers, but also to provide financial advice. That has some in the industry concerned.
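The kind of repayment modelling described above follows the standard loan amortisation formula. As a hedged sketch (the loan amount and rates below are hypothetical examples, not figures from the report):

```python
def monthly_repayment(principal, annual_rate, years):
    """Monthly repayment on a fixed-rate loan (standard amortisation formula)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of monthly repayments
    return principal * r / (1 - (1 + r) ** -n)

# Compare hypothetical lender rates and loan durations, as described above.
loan = 600_000
for rate in (0.055, 0.060, 0.065):
    for term in (25, 30):
        pay = monthly_repayment(loan, rate, term)
        total_interest = pay * term * 12 - loan
        print(f"{rate:.1%} over {term} years: "
              f"${pay:,.0f}/month, ${total_interest:,.0f} total interest")
```

Shortening the term raises the monthly repayment but cuts total interest, which is exactly the trade-off being modelled.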
Juanita Wrenn: Even baby boomers, you know, you’re talking sort of your late 50s, in your 60s, are using chat to get a lot of, I guess, fundamental education about finances to be explained to them before they come and see us.
Scout Wallen: Financial advisor Juanita Wrenn says AI chatbots are handy for explaining financial concepts. But the sheer volume of information they draw on means they can make mistakes when offering advice.
Juanita Wrenn: When it comes to putting a plan together, it often gets figures wrong. It makes sort of wrong assumptions. It makes assumptions about growth rates that might be based on the American stock market.
Scout Wallen: She says much of Australia’s financial system is open to interpretation, and that’s where AI chatbots, like ChatGPT, can struggle.
Juanita Wrenn: Tax laws aren’t black and white. There’s a lot of grey area in tax. And ChatGPT is not very good at navigating the grey areas. It likes black or white rules.
Scout Wallen: Juanita Wrenn says part of the problem is that chatbots are designed to come back to you with an answer to your question. So, in a sense, they can be too eager to please.
Juanita Wrenn: It can give you a false sense of security that what you’re investing into is going to return these returns that you’re hopeful for. And I think humans have a way of leading chat down a path that you want it to go because you want to hear that answer.
Scout Wallen: This experience is backed up by research out of Switzerland. Philipp Winder at the University of St Gallen tested the financial advice from three popular large language models, or LLMs, OpenAI’s ChatGPT, Google’s Gemini and Microsoft’s Copilot, and found they consistently recommended portfolios with higher risk when compared to a common benchmark index fund.
Philipp Winder: LLMs like ChatGPT, basically, they generate financial advice that seems very plausible and they also present it in a way that makes it very approachable. So basically, they push and give you a good feeling about the advice. But when you actually look into the advice, then you see that portfolios are usually associated with higher risk.
Scout Wallen: AI tended to encourage investing in American equities, which made up as much as 90 percent of the suggested portfolios. And the chatbots’ advice could also be swayed by what’s generating the most interest online.
Philipp Winder: If now AI is a hype topic and is discussed quite frequently, they will basically try to recommend equities that are around this topic. So AI companies that are listed.
Scout Wallen: It’s not clear how these large language models make these decisions. Mr. Winder describes the platforms as black boxes. So you won’t necessarily know whether your AI-generated advice has come from a financial advisor or a social media post.
Philipp Winder: Yeah, written by non-experts. So there’s a lot of financial advice and also information provided by non-experts, which then basically gets recited or reinforced through the LLMs.
Scout Wallen: A spokesperson for the Australian Securities and Investments Commission’s Moneysmart program says it’s not uncommon for AI models to invent information that has no basis in fact. And they warn that AI is not bound by the same rules as a licensed financial advisor, who has a duty to consider your circumstances before offering personal advice.
Nick Grimm: Scout Wallen with that report.