AI financial advice risks spark concern as 66% of American AI users turn to ChatGPT for money decisions

Two-thirds of Americans who have used generative AI tools like ChatGPT and Gemini now turn to them for financial advice, from budgeting to investing. But as the robo-advisor market reaches $1.4 trillion, financial planners warn that AI lacks the emotional capacity and human judgment needed for complex money decisions. Data privacy concerns and algorithm transparency issues add to the risks.

Americans Increasingly Turn to AI for Money Management

Artificial intelligence has rapidly infiltrated personal finance, with 66% of Americans who have used generative AI tools like ChatGPT or Gemini now relying on them for financial advice, according to a September report by Intuit Credit Karma [1]. The trend is even more pronounced among younger demographics, with 82% of Gen Z and millennials using these platforms for everything from budgeting to tax planning and investing [1].

Source: Quartz

The appeal is understandable. AI-powered robo-advisors offer automated investing solutions at a fraction of the cost of traditional human advisors, allowing users to open accounts and start investing within minutes from their smartphones [2]. The global robo-advisor market reached $1.4 trillion in value in 2024 and is projected to grow to $3.2 trillion by 2033, with key players including Betterment, Wealthfront, Charles Schwab, Vanguard, and SoFi [2].

The Personal and Emotional Dimensions Missing from AI

Despite the convenience, financial experts are raising alarms about what gets lost when algorithms replace human judgment. "GenAI is a powerful tool for learning, planning, and managing your money," said Courtney Alev, Intuit Credit Karma's consumer financial advocate. "However, finances are nuanced and deeply personal" [1].

The risks of using AI for money advice extend beyond simple miscalculations. Kevin Estes, founder of Scaled Finance in Seattle, highlighted a fundamental limitation: "AI is very good at answering the questions that you ask; the challenge is that people may not know the right questions to ask" [2]. This gap becomes critical during market downturns or life transitions, when emotional support and nuanced guidance matter most.

Financial planning encompasses far more than investing, Estes noted, including insurance, estate planning, and saving for college expenses, elements that AI lacks the emotional capacity to address comprehensively [2]. The choice between human advisors and AI thus becomes a question of depth versus convenience.

Lack of Transparency in Algorithms Raises Concerns

Ohan Kayikchyan, a certified financial planner in North Carolina, expressed particular worry about the lack of transparency in the algorithms used by robo-advisors. "In many cases, they will not make their algorithms public," he said. "In many cases, it's like a black box; we don't know what is inside" [2].

This opacity makes it difficult for investors to understand why certain stocks underperform or when market conditions shift. Unlike human advisors, who can explain their reasoning and adjust strategies through nuanced conversation, AI-powered robo-advisors rely on standardized questionnaires that may oversimplify complex financial situations [2]. Risk tolerance assessments, in particular, benefit from the human interaction and communication that generative AI tools cannot replicate.

Data Privacy Concerns Loom Large

Data privacy concerns present another significant risk. "Oftentimes, in the fine print, they'll say, 'We share our information with our partners.' But who are they and who all has access to this personal information, this private information?" Estes asked [2]. With access to sensitive financial and personal information, these platforms are attractive targets for hackers and scammers.

Kayikchyan pointed to the cautionary tale of 23andMe, which suffered a data breach compromising nearly 7 million customers' genetic data, resulting in a $30 million lawsuit settlement. The company's subsequent bankruptcy proceedings raised questions about what happens to user data when companies change hands [2]. Many automated platforms operate under larger financial institutions, giving multiple divisions access to sensitive data for cross-selling and marketing purposes.

As retirement investments and other critical financial decisions increasingly flow through AI systems, the short-term implications involve potential mismatches between algorithmic recommendations and individual circumstances. Long-term, the industry faces questions about accountability when AI risks materialize into actual financial losses during volatile periods.
