5 Reasons to Think Twice Before Using ChatGPT—or Any Chatbot—for Financial Advice


I’ve used ChatGPT to help me build a budget before, and it was genuinely helpful. After I input my monthly salary as well as my standard utilities and recurring expenses, the chatbot drafted a few solid options, and I tweaked them into penny-pinching perfection. I’m admittedly part of the growing number of people turning to chatbots, like Anthropic’s Claude, Google’s Gemini, and OpenAI’s ChatGPT, for financial advice.

“Millions of people turn to ChatGPT with money-related questions, from understanding debt to building budgets and learning financial concepts,” says Niko Felix, an OpenAI spokesperson, when reached for comment. “ChatGPT can be a helpful tool for exploring options, preparing questions, and making financial topics easier to understand, but it is not a substitute for licensed financial professionals.” OpenAI’s Terms of Use state that the AI tool is not meant to replace professional financial advice.

While you may consider chatbots to be practical financial assistants, it’s always worth keeping the limitations of these AI tools in mind. Beyond miscalculations, here are five additional reasons to approach them with skepticism when it comes to money tips.

AI Still Confidently Outputs Incorrect Answers

When I ask ChatGPT for help managing my money smarter, the bot is confident in its responses, often laying out what seems like solid reasoning behind each bullet point of advice. But always keep in mind that chatbots can weave convincing errors into outputs.

OpenAI has reduced the rate of hallucination in more recent model releases, but chatbot tools still output errors. “There seems to be this sense emerging, at least among casual users, that the hallucination problem has been fixed,” says Srikanth Jagabathula, a professor of technology operations and statistics at NYU. “But that’s definitely not the case, because they’re fundamentally statistical machines. They don’t have a notion of a ground truth, or what is true.”

Even if an answer seems correct at first, one easy way to stress-test the output is simply to ask the chatbot to double-check everything it just said. This won’t confirm whether the output is correct, but it has surfaced plenty of issues in AI responses for me, and it leaves me increasingly skeptical about turning to bots for advice on any topic, not just money.

Yes-Bot May Affirm Preexisting Beliefs

When you turn to a human financial advisor for money tips, they will likely be cordial and professional, but they will also push back on any preconceptions you may have about saving, investing, and spending money. Chatbots, on the other hand, are known for being overly agreeable, often taking the user’s side.

“AI sycophancy is not merely a stylistic issue or a niche risk, but a prevalent behavior with broad downstream consequences,” reads part of a study about AI’s conversational flattery published earlier this year in the journal Science. “Although affirmation may feel supportive, sycophancy can undermine users’ capacity for self-correction and responsible decision-making.”

The study looked at how AI will take a user’s side during interpersonal conflicts, but concerns about sycophancy are relevant to financial questions as well. When I’m making money moves, I want to turn to someone who knows more than me for guidance, not rely on a yes-bot for affirmations.

Requires Sensitive Info for Better Results

To get a chatbot’s best outputs, tailored to your specific needs, you’re nudged to share sensitive information with the AI tool. For example, when I asked ChatGPT how it could improve my budget even further, the bot suggested I upload my complete financial history from the last few months for the best answers.

“You don’t have to upload everything—but yes, the more real data you share, the more accurate (and useful) the audit will be,” read ChatGPT’s output, in part. “Upload CSVs or screenshots of bank account, credit cards. Then I can: categorize everything, calculate exact spending patterns, identify hidden leaks you wouldn’t notice, and build a precise monthly budget.”

Unless your settings are adjusted, all of your conversations with ChatGPT may be used by OpenAI to improve the tools and as training data for future iterations. Visit ChatGPT’s “data controls” tab to change your settings. Even if you opt out of AI training, it can be risky to upload so much sensitive data about your money to a platform that’s not an official banking app.

Bots Lack Accountability

Jagabathula sees tools like ChatGPT as a worthwhile part of your toolkit, primarily in the early stages of asking questions about money matters, like tax-saving strategies or investment ideas. But you should always rope in someone with expertise before making high-stakes decisions.

“A human expert in the loop is super critical,” he says. “Especially for the last mile, you’re actually going from idea generation to taking action. Somebody needs to review the plan, adjust it, and correct it if necessary.”
