2 Sources
[1]
AI may replace your financial advisor, MIT professor says -- but there's one big hurdle
That means generative AI models -- examples of which include OpenAI's ChatGPT, Anthropic's Claude and Google's Gemini -- may not always give financial advice in users' best interest, experts said. The financial capability of artificial intelligence platforms is improving to the extent that it will likely be able to replace human financial advisors in the future, according to finance experts. However, AI has a major drawback relative to human advisors: a lack of fiduciary duty, they said. And a resolution to that legal gray area doesn't seem near at hand, they said. A fiduciary duty is a legal obligation that many financial advisors -- and professionals in other fields, such as lawyers and doctors -- owe their clients. It essentially means they will put their clients' best interest ahead of their own. "The problem that we have to solve is not whether AI has enough expertise," said Andrew Lo, a finance professor and director of the Laboratory for Financial Engineering at the MIT Sloan School of Management. "The answer right now is, clearly, AI has the [financial] expertise." "What they don't have is that fiduciary duty," Lo said. "They don't have the ability to suffer consequences if they make a mistake to the same degree that a human advisor does." An advisor who violates their fiduciary responsibility can be subject to fairly serious consequences, including regulatory penalties, civil liabilities and criminal charges, Lo said. The notion of putting a client's interest ahead of yours "has no teeth" without responsibility or legal liability, he said.
[2]
MIT Expert Finds Limits in AI's Ability to Offer Financial Advice | PYMNTS.com
However, the technology has a significant limitation, experts told CNBC Monday (April 6): AI has no sense of fiduciary duty, nor any obligation to act in a client's best interests. "The problem that we have to solve is not whether AI has enough expertise," Andrew Lo, a finance professor and director of the Laboratory for Financial Engineering at the MIT Sloan School of Management, said in an interview with CNBC. "The answer right now is, clearly, AI has the [financial] expertise." However, "What they don't have is that fiduciary duty," Lo said. "They don't have the ability to suffer consequences if they make a mistake to the same degree that a human advisor does." He added that an advisor who violates these duties can face regulatory penalties, civil liabilities and criminal charges. The idea of placing a client's interest above your own "has no teeth" without responsibility or legal liability, said Lo. There are still some good uses for AI in financial planning, Lo acknowledged, saying the technology is "really good" at offering resources online for financial concepts that most people don't understand, like issues with Medicare. The report also quotes Sebastian Benthall, a senior research fellow at the Information Law Institute at New York University's law school, who said there is a major regulatory question around consumer use of AI for financial advice. "Who's really responsible, and can people really be relying on a product to do this if it's not being backed up by a corporation with a fiduciary duty?" Benthall said. "It's really unresolved." These arguments are happening as consumers increasingly use AI for tasks like organizing their personal finances, as PYMNTS Intelligence research has found.
For example, the data shows that 62% of Generation Z consumers surveyed by PYMNTS are open to using AI for "what if" financial planning. The research also shows that 54% of adults in the U.S. now turn to AI for personal tasks, with the average user depending on two to three different tools. Among AI's most devoted adherents, more than 60% access AI primarily through a smartphone app, indicating that artificial intelligence has "moved from occasional browser experimentation to habitual daily behavior," PYMNTS wrote recently. "Every additional touchpoint where consumers engage with AI expands the surface area where AI can trigger or influence a financial outcome."
AI platforms now possess the financial expertise to replace human advisors, according to MIT professor Andrew Lo. But a critical problem remains: AI lacks fiduciary duty and legal accountability. Without legal liability for mistakes, generative AI models like ChatGPT may not act in clients' best interests, creating unresolved regulatory questions as consumer reliance on AI for financial guidance grows.
The financial capability of AI platforms has advanced to the point where they could feasibly replace human financial advisors, according to finance experts. Andrew Lo, a finance professor and director of the Laboratory for Financial Engineering at the MIT Sloan School of Management, said that expertise is no longer the central issue. "The problem that we have to solve is not whether AI has enough expertise," Lo told CNBC. "The answer right now is, clearly, AI has the [financial] expertise."[1]
This assessment comes as generative AI models like ChatGPT, Anthropic's Claude, and Google's Gemini demonstrate sophisticated understanding of complex financial concepts.[1]
Despite AI's ability to offer financial advice, a fundamental problem persists: the technology has no fiduciary duty. This legal obligation requires financial advisors, along with professionals like lawyers and doctors, to place their client's best interest ahead of their own. "What they don't have is that fiduciary duty," the MIT professor explained. "They don't have the ability to suffer consequences if they make a mistake to the same degree that a human advisor does."[1]
An advisor who violates their fiduciary responsibility faces regulatory penalties, civil liabilities, and even criminal charges. Without this legal liability for mistakes, the notion of prioritizing client welfare "has no teeth," Lo emphasized.[2]
This means generative AI models may not always act in users' best interest, creating significant regulatory questions about legal accountability.[1]
Sebastian Benthall, a senior research fellow at the Information Law Institute at New York University's law school, highlighted the uncertainty surrounding consumer use of AI for financial planning. "Who's really responsible, and can people really be relying on a product to do this if it's not being backed up by a corporation with a fiduciary duty?" Benthall asked. "It's really unresolved."[2]
The legal obligation for advisors to act in client interests has no equivalent framework for AI systems, leaving consumer reliance on AI for financial guidance in a precarious position. A resolution to this legal gray area doesn't appear imminent, experts noted.[1]
These debates unfold as consumers increasingly turn to AI for personal finance tasks. PYMNTS Intelligence research reveals that 62% of Generation Z consumers surveyed are open to using AI for "what if" financial planning scenarios.[2]
Additionally, 54% of adults in the U.S. now use AI for personal tasks, with the average user depending on two to three different tools. Among AI's most devoted users, more than 60% access AI primarily through a smartphone app, indicating that artificial intelligence has "moved from occasional browser experimentation to habitual daily behavior," PYMNTS wrote. Every additional touchpoint where consumers engage with AI expands the surface area where AI can trigger or influence financial outcomes, making the fiduciary duty question increasingly urgent.[2]
Lo acknowledged that AI is "really good" at offering online resources for financial concepts most people don't understand, like Medicare issues, suggesting practical applications exist even within current limitations.[2]
Summarized by Navi • 03 Dec 2025 • Business and Economy
