Can you trust AI’s money advice? Financial experts weigh in


From grocery lists to help building a website to promote her work as a real estate agent, Jennifer Allen says she uses ChatGPT for everything.

When unexpected hospital bills and time away from work after childbirth led her to rely on credit cards, she knew her debt was growing. But she was afraid to tally the total and rarely looked at her bank accounts. Then one day, she wondered whether ChatGPT, or "Chat," as she calls it, could help.

She fed the chatbot the necessary information, and it told her she had racked up $23,000 in debt. Stunned by the number, she wondered how she could pay it off. Allen said it hadn't even occurred to her to consult a financial planner. Instead, she asked ChatGPT.

"Even if a financial planner told me something, I would still run it through Chat," Allen told USA TODAY.

She prompted the chatbot to give her something she could do every day to help pay down her debt, and documented the process on TikTok. By the end of two 30-day challenges, she had found $13,078 by following the bot's advice and had earned additional money from TikTok's creator rewards program. She said she is now just under $5,000 in debt.

Although not everyone follows ChatGPT's advice daily, the chatbot has seen rapid growth. It reaches around 700 million users each week, four times more than last year, according to OpenAI's Nick Turley.

ChatGPT is not the only artificial intelligence model people rely on for information. A Morning Consult survey found that more than half of U.S. adults said they refer to AI-generated summaries when searching online, and 1 in 10 said they do not consult other sources. A Southeastern Oklahoma State University survey found that 1 in 3 Americans have used an AI tool to make a career decision.

Some believe the technology will transform the financial planning space. Others warn against relying on it for money advice. And while some humans may have an interest in claiming they do a better job than AI, even the companies behind popular chatbots advise caution. According to Google, large language models such as Gemini can "hallucinate" and present inaccurate information as factual.

USA TODAY posed popular personal finance questions to five AI chatbots. Here is what they said, and what financial experts thought of their answers:

AI advice on retirement savings

USA TODAY asked ChatGPT, Claude, Copilot, Gemini and Grok three personal finance questions in the same order, starting with one of the most common: How much money do I need to retire?

Their answers were similar but not identical. Within seconds, the chatbots generated fairly long responses, generally formatted as bullet points, offering examples and general advice along with caveats.

Grok was the only model to give a specific number in its final response: about $1 million. But it, along with ChatGPT and Copilot, also asked the user to provide more information. Gemini recommended using a retirement calculator, and Claude suggested meeting with a financial planner.

All pointed to the 4% rule, a withdrawal strategy that says retirees can safely withdraw 4% of their savings in the year they retire, then adjust that amount for inflation each year after. However, the rule is more than 30 years old, and its creator said it was outdated in 2022.
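For readers who want to see the arithmetic behind the rule, here is a minimal sketch, assuming a hypothetical $1 million nest egg and 3% inflation. Both figures are illustrative stand-ins, not numbers drawn from the chatbots' answers:

```python
# Minimal sketch of the 4% rule's arithmetic, using hypothetical numbers.
# Year 1: withdraw 4% of the starting balance; every year after, increase
# the prior year's withdrawal by inflation, regardless of market returns.

savings = 1_000_000          # hypothetical starting nest egg
inflation = 0.03             # assumed 3% annual inflation

withdrawal = 0.04 * savings  # year-1 withdrawal: $40,000
for year in range(1, 6):
    print(f"Year {year}: withdraw ${withdrawal:,.0f}")
    withdrawal *= 1 + inflation  # inflation-adjust for the next year
```

The sketch also shows why the rule is one-size-fits-all: the withdrawal path depends only on the starting balance and an inflation assumption, not on an individual's actual spending, returns or lifespan.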

"There is not one number for everyone. If the chatbot tries to answer this question without asking for information, it is useless," said Annamaria Lusardi, who heads Stanford's Initiative for Financial Decision-Making. "The 4% rule of thumb is completely outdated. … If you follow it, you have a very high probability of running out."

AI advice on credit scores

The chatbots' responses to the question "How do I improve my credit score?" were nearly identical. They suggested strategies such as paying bills on time, keeping credit utilization low and maintaining a healthy credit mix.

"This is a much easier question for ChatGPT to answer correctly because all of this information is out there, for example, on the FICO score website," Lusardi said. "If you compare these two questions, this is really the type of situation where you can have rules for everyone."

Greg Clement is the CEO and founder of Freedomology, a technology and coaching company that launched its own chatbot dedicated to helping people with their finances, health and relationships. He worked as a financial planner for eight years and thinks popular AI models can be useful when people have financial questions, but that their answers are still "very vague and generic."

"It's almost like you're talking to 100 financial planners, asking the same question to 100 people, and trying to consolidate all their answers into one summary," Clement said.

Between AI's documented biases and its inability to understand things on a human level, Tori Dunlap, a money expert who founded Her First $100K, is skeptical of the technology.

"It's like your personal digital robotic assistant. It is not meant to challenge you or push back, or to help you think differently. That is something a coach or an expert can help you do," Dunlap said. "I would also say, though, if it's between no financial advice at all and ChatGPT, I'll take the ChatGPT every time."

What happens when you give specific numbers to AI?

Using the median household income and down payment in Illinois, USA TODAY asked the chatbots what home price a couple could afford in that state.

Before giving a number, most asked the user to consider factors including their debt-to-income ratio, private mortgage insurance and property taxes. But without asking for more information, each gave a different range.

ChatGPT and Gemini were the most optimistic, suggesting $300,000 to $320,000 and $275,000 to $325,000, respectively. Claude said $245,000 to $270,000, and Copilot said $225,000 to $250,000. Grok gave the lowest range, $200,000 to $240,000.
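The spread in those answers largely comes down to assumptions. As a rough illustration of one common affordability calculation, and not any chatbot's actual method, here is a minimal sketch in which every input is a hypothetical stand-in rather than a figure from the article:

```python
# Hypothetical affordability sketch: cap housing costs at a share of gross
# income, then back out the loan a 30-year mortgage payment could support.
# All inputs below are illustrative assumptions.

annual_income = 80_000      # hypothetical household income
down_payment = 30_000       # hypothetical down payment savings
housing_ratio = 0.28        # assume 28% of gross income goes to housing
annual_rate = 0.07          # assumed mortgage interest rate
years = 30

monthly_budget = annual_income / 12 * housing_ratio
r, n = annual_rate / 12, years * 12
loan = monthly_budget * (1 - (1 + r) ** -n) / r   # present value of payments
print(f"Estimated affordable price: ${loan + down_payment:,.0f}")
```

Nudging the assumed interest rate or the share of income devoted to housing by a point or two shifts the estimate by tens of thousands of dollars, and this sketch ignores property taxes, insurance and PMI, the very factors the chatbots flagged, which would pull the number down further.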

"Personal finance is about our lives. I don't know that I would leave it to artificial intelligence without careful checks, and without being aware that different models will give me different results," Lusardi said. "Some of these suggestions can be very simplistic and potentially not very useful."

Dunlap said the variety in the chatbots' responses is the result of their not having enough information. If someone asked her this question, she said, she would follow up by asking about their credit score, their ideal mortgage payment and interest rates.

"But even before you do that, my question is: Do you actually want to be a homeowner, or do you just feel like you need to be one to be successful?" she said. "By definition, you're talking to a robot. You're not talking to somebody who understands really complex human emotion."

After all, anyone who asks AI this question is talking to a chatbot that has never owned a home.

"If a young couple in the Freedomology community asked the same question, they would probably get answers from people who have owned a home for 10 or 20 years," Clement said. "How do you replace that? I don't think you can."

What do AI companies recommend?

In USA TODAY's conversations with the AI models, many included disclaimers that they were not financial advisers, and the AI companies have safeguards for checking their answers.

Google's double-check feature highlights any contradicting information online. The company's help center notes that people should not rely on Gemini for financial advice.

A spokesperson for Anthropic, the company behind Claude, said they were encouraged to see people using the model as a financial literacy tool to demystify topics such as compound interest and credit scores. However, they said, while Claude can help people become more informed, it should not replace licensed professionals for personalized financial decisions.

They recommend using Claude to learn and to prepare smarter questions, but relying on certified professionals who can give personalized advice when it comes to actual investment decisions and retirement strategies.

"The most successful approach we see is people using Claude to improve their financial literacy, then bringing that knowledge into real-world decisions," the Anthropic spokesperson said in a statement to USA TODAY. "They understand terminology, recognize opportunities better and feel more confident, whether they're negotiating a car loan, choosing between job offers or preparing for retirement planning meetings. That's where AI truly helps: making financial knowledge accessible to everyone."

In another statement to USA TODAY, a Microsoft spokesperson said Copilot's in-depth research mode can help people make well-informed choices in areas that require careful evaluation, including financial decisions.

"As we look to the future, we're focused on making Copilot an even better companion, one that is more personal and feels natural to use in everyday life," the spokesperson said. "AI can still make mistakes, so we always recommend that people check sources and contact a financial adviser if needed."

While Allen said she doesn't take everything AI says at face value, she credits it as the reason she went from not knowing how much debt she had to paying off the majority of it.

"That's what has changed through this whole process," Allen said. "I'm not afraid. I have Chat on my side."

OpenAI and xAI did not respond to USA TODAY's requests for comment.

Reach Rachel Barber at [email protected] and follow her on X @Rachelbarber_.


