Curated by THEOUTPOST
On Wed, 16 Oct, 12:02 AM UTC
3 Sources
[1]
Do people trust AI with their money? Here's what research shows
When it comes to investing and planning your financial future, are you more willing to trust a person or a computer? This isn't a hypothetical question any more. Big banks and investment firms are using artificial intelligence (AI) to help make financial predictions and give advice to clients.

Morgan Stanley uses AI to mitigate the potential biases of its financial analysts when it comes to stock market predictions. And one of the world's biggest investment banks, Goldman Sachs, recently announced it was trialling the use of AI to help write computer code, though the bank declined to say which division it was being used in. Other companies are using AI to predict which stocks might go up or down.

But do people actually trust these AI advisers with their money? Our new research examines this question. We found the answer depends a great deal on who you are and how much you already know about AI and how it works.

Trust differences

To examine the question of trust when it comes to using AI for investment, we asked 3,600 people in the United States to imagine they were getting advice about the stock market. In these imagined scenarios, some people got advice from human experts, some got advice from AI, and some got advice from humans working together with AI.

In general, people were less likely to follow advice if they knew AI was involved in making it. They seemed to trust the human experts more.

But the distrust of AI wasn't universal. Some groups of people were more open to AI advice than others. For example, women were more likely than men to trust AI advice (by 7.5%). People who knew more about AI were more willing to listen to the advice it provided (by 10.1%). And politics mattered: people who supported the Democratic Party were more open to AI advice than others (by 7.3%).

We also found people were more likely to trust simpler AI methods. When we told our research participants the AI was using something called "ordinary least squares" (a basic mathematical technique in which a straight line is used to estimate the relationship between two variables), they were more likely to trust it than when we said it was using "deep learning" (a more complex AI method). This might be because people tend to trust things they understand, much as a person might trust a simple calculator more than a complex scientific instrument they have never seen before.

Trust in the future of finance

As AI becomes more common in the financial world, companies will need to find ways to improve levels of trust. This might involve teaching people more about how the AI systems work, being clear about when and how AI is being used, and finding the right balance between human experts and AI. Companies will also need to tailor how AI advice is presented to different groups of people, and to show how well AI performs over time compared with human experts.

The future of finance might involve a lot more AI, but only if people learn to trust it. It's a bit like learning to trust self-driving cars: the technology might be great, but if people don't feel comfortable using it, it won't catch on.

Our research shows that building this trust isn't just about making better AI. It's about understanding how people think and feel about AI, and about bridging the gap between what AI can do and what people believe it can do.

As we move forward, we'll need to keep studying how people react to AI in finance, and to find ways to make AI not just a powerful tool but a trusted adviser that people feel comfortable relying on for important financial decisions. The world of finance is changing fast, and AI is a big part of that change. But in the end, it's still people who decide where to put their money. Understanding how to build trust between humans and AI will be key to shaping the future of finance.
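To illustrate why "ordinary least squares" feels so approachable, here is a minimal sketch of the straight-line fit the researchers describe. The numbers are invented for illustration only; this is not the study's data or model.

```python
import numpy as np

# Ordinary least squares: fit a straight line y = slope*x + intercept that
# minimises the squared vertical distance to the data points.
# Hypothetical toy data: x = time period, y = some observed stock index level.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form OLS estimates for slope and intercept.
x_mean, y_mean = x.mean(), y.mean()
slope = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
intercept = y_mean - slope * x_mean

# The fitted line can then be extrapolated one step ahead as a "prediction".
prediction = slope * 6.0 + intercept
print(slope, intercept, prediction)  # slope ≈ 1.96, intercept ≈ 0.14, forecast ≈ 11.9
```

Every quantity here can be checked with pen and paper, which is exactly the transparency that a multi-layer "deep learning" model lacks.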
[2]
Do people trust AI on financial decisions? We found it really depends on who they are
[3]
Do people trust AI on financial decisions? We found it really depends on who they are
University of Auckland, Waipapa Taumata Rau provides funding as a member of The Conversation NZ.
A new study reveals that trust in AI for financial advice depends on factors such as gender, political affiliation, and prior AI knowledge, highlighting the challenges in integrating AI into the financial sector.
As artificial intelligence (AI) continues to permeate the financial sector, a crucial question arises: Do people trust AI with their money? A recent study involving 3,600 participants in the United States has shed light on this complex issue, revealing that trust in AI for financial advice varies significantly across different demographics [1].
Major financial institutions are already leveraging AI in various capacities. Morgan Stanley, for instance, employs AI to mitigate potential biases in stock market predictions made by its financial analysts. Goldman Sachs, one of the world's largest investment banks, is experimenting with AI for writing computer code, although the specific division remains undisclosed [2].
The study uncovered interesting patterns in trust levels across different groups:
- Women were more likely than men to trust AI advice (by 7.5%)
- People with greater knowledge of AI were more willing to follow its advice (by 10.1%)
- Supporters of the Democratic Party were more open to AI advice than others (by 7.3%)
Interestingly, the study revealed a preference for simpler AI methods. Participants were more likely to trust AI using "ordinary least squares" (a basic mathematical technique) compared to more complex methods like "deep learning". This suggests that people tend to trust what they can understand, drawing parallels to the trust placed in simple calculators versus complex scientific instruments [1].
As AI becomes increasingly prevalent in finance, companies face the challenge of improving trust levels. Potential strategies include:
- Teaching people more about how AI systems work
- Being transparent about when and how AI is being used
- Finding the right balance between human experts and AI
- Tailoring how AI advice is presented to different groups
- Demonstrating how well AI performs over time compared with human experts
The integration of AI in finance is inevitable, but its success hinges on public trust. Much like the adoption of self-driving cars, the technology's potential can only be realized if people feel comfortable using it. The research emphasizes that building trust goes beyond improving AI capabilities; it requires understanding human perceptions and bridging the gap between AI's actual capabilities and public beliefs about its abilities [3].
© 2025 TheOutpost.AI All rights reserved