AI and financial guidance: Why digital safety is key and human advisors still lead
AI chatbots provide quick answers but regulated digital platforms and human advisors offer nuance, accuracy, and context guided by fiduciary standards
Key takeaways
- Sharing personal financial details with AI chatbots can increase privacy and data-use risks.
- AI chatbots operate differently from regulated digital platforms with guardrails and human oversight.
- Human financial advisors can deliver context, personalization, and ongoing support.
- Human financial advisors and regulated digital advice platforms operate within a framework of legal, ethical, and fiduciary standards.
Many people turn to AI chatbots for on-demand answers about money matters including investing and retirement. While sometimes a good starting point for questions, chatbots can generate incorrect responses and pose privacy risks when sensitive information is shared. Regulated digital wealth platforms and human advisors can provide more accurate and personalized advice in a safer setting.
Many retirement savers may be tempted to share personal financial details with AI chatbots. While these tools can be helpful for financial education, it’s also important to approach them with care.
About two-thirds of Americans have used AI for financial advice, according to a survey from Intuit Credit Karma, rising to 82% among Gen Z and Millennials. About 75% said AI lets them ask money questions that they might be hesitant or embarrassed to ask others.1
The appeal is clear: AI tools can offer quick, on-demand answers that can help users make sense of complex financial topics — from budgeting to investment advice to retirement planning — in simple terms and without judgment.2
But convenience comes with risk. Chatbots can provide incomplete or inaccurate answers, and sharing financial or other sensitive information with those tools raises concerns about privacy, security, and use of that data beyond its original purpose.3,4
Safety first: Protecting financial data
AI chatbots like OpenAI’s ChatGPT or Google’s Gemini are large language models trained on vast amounts of text and data from the internet and other sources, including earlier chats, and they draw on that information to generate human-like responses.5
When you share information with an AI tool — including financial details such as income, account balances, or debt — that data can be stored indefinitely by the company providing the AI.6
Chatbots may use financial conversations or queries to refine models and improve quality, with human reviewers able to access those exchanges. Portions of shared data and financial discussions can show up in future datasets in various and unpredictable forms.7
Reviewing terms of service and privacy policies — which vary widely — can help clarify how data is used. For example, a Google Gemini help page advises users against sharing confidential details that a reviewer might see or could be used to improve machine-learning technologies.8 ChatGPT has a function that allows users to opt out of sharing data for training.9
Limiting how data might be used or shared doesn’t necessarily address how it might be stored and protected. Like all online platforms, AI chatbots face cybersecurity risks such as hacking attempts, data breaches, and outages. Past lapses have exposed personal information such as email addresses and partial credit card numbers.10
Financial professionals advise sharing only the details necessary with an AI chatbot to get a useful response. Potential personal identifiers such as Social Security numbers, exact income, account numbers, or specific assets should be left out. Questions and shared financial information should be framed in broad, general terms.11
Chatbot developers have added privacy safeguards as models advance; for example, ChatGPT allows users to disable chat history through a home-page toggle.12 Even with evolving safeguards, it’s best to avoid sharing anything with a chatbot that wouldn’t be safe to send in an email.13
Read more: GenAI tools that are reshaping daily life and business
Quality of advice: What to know about chatbot answers
Chatbots can handle complex queries in seconds and automate routine tasks, like entering items into a budget. They also can be a good starting point for understanding financial concepts and best practices, though answers about specific personal situations may be less reliable.14
More than three-quarters of Americans (76%) say technology can provide financial information, but not judgment or trust, according to Empower’s “Return on Advice” survey.*
Some chatbots respond to money questions with a disclaimer that they’re not able to provide financial or investing advice — even if their answers might sound like it. Some also include language about consulting a financial professional.15
Advice varies as much as disclaimers do — across a broad range of topics like retirement, housing, credit, investment, and taxes.16 Studies suggest chatbots tend to struggle with accurate advice when more nuance or detail is added by the user.17
Some financial professionals have voiced concern about the generalized nature of AI-generated advice and the lack of personalized context. Chatbots might take questions at face value or fail to challenge underlying assumptions in those queries.18
There’s also a risk of factual errors, “hallucinations,” broken links, or other misleading information, sometimes stemming from outdated or unverified sources used to train language models.19,20 Answers might sound confident but can still be wrong because AI chatbots can generate responses based on patterns and probabilities.21
Financial professionals say it’s best to verify chatbot answers with external sources and consider consulting with a professional before acting on them.22 The stakes are higher with financial guidance than with a chatbot’s potential bad take on a restaurant or movie.23
AI versus digital wealth platforms: What’s the difference?
Some people may confuse AI-driven financial advice with robo-advisors or digital wealth platforms that blend automated investing with access to human financial advisors. These platforms may use AI, but within a regulated framework that includes guardrails such as strict cybersecurity standards and human oversight.24
Robo-advisors and other digital platforms gather data and use algorithms to automate investment management. After collecting information from users on items like goals, time horizon, and risk tolerance, they build diversified investment portfolios suited to specific needs.25
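To illustrate what “preprogrammed rules” means in practice, here is a minimal sketch of the kind of rule-based logic a robo-advisor might apply to user inputs. The function name, risk categories, and thresholds are illustrative assumptions for this example, not any platform’s actual methodology.

```python
# Hypothetical sketch: map a user's risk tolerance and time horizon to a
# simple stock/bond split using fixed, preprogrammed rules. All categories
# and thresholds are illustrative, not a real platform's methodology.

def suggest_allocation(risk_tolerance: str, years_to_goal: int) -> dict:
    """Return a stock/bond percentage split from deterministic rules."""
    base_stock = {"conservative": 30, "moderate": 60, "aggressive": 80}[risk_tolerance]
    # Horizons shorter than 20 years shift the mix toward bonds,
    # one percentage point per year under 20.
    adjustment = min(0, years_to_goal - 20)
    stock = max(10, min(90, base_stock + adjustment))
    return {"stocks": stock, "bonds": 100 - stock}

print(suggest_allocation("moderate", 10))    # → {'stocks': 50, 'bonds': 50}
print(suggest_allocation("aggressive", 30))  # → {'stocks': 80, 'bonds': 20}
```

Because the rules are fixed in advance, the same inputs always yield the same recommendation — a key difference from a chatbot generating free-form text.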
Robo-advisors are registered with the Securities and Exchange Commission and operate under the same fiduciary responsibilities as human financial advisors. They rely on preprogrammed rules aimed at best investment practices and consistent portfolio management.26
AI chatbots don’t operate under such frameworks; they continually draw from vast amounts of information to improve and generate more responsive answers.27 They also aren’t primarily designed to perform mathematical operations, which can lead to miscalculations or flawed conclusions even when the inputs are correct.28,29
Human oversight is another key distinction. Financial professionals provide ongoing supervision of robo-advice and other digital wealth platforms — even when a customer isn’t interacting with humans directly. It’s much like “AI in a box,” said Dave Gray, Empower Executive Vice President for Enterprise Solutions.
“It's humans that are really setting up the framework for the investment advice,” Gray said. “Versus a technology operating completely independently with no oversight or accountability for the outcomes that it prescribes.”
Read more: In financial planning, AI has entered the room. Where does it fit in?
Human touch: Personal guidance for financial plans
Alongside investing platforms, many financial institutions provide dashboards, budgeting tools, and retirement planners that link to accounts, track assets, and model scenarios in a secure, controlled environment.
For those seeking deeper guidance, human advice offers an additional layer of support. Empower research shows 61% of Americans would use AI alongside human financial advisors, especially for significant financial decisions like investing.
Working with a financial advisor allows for discussion-based planning — online or in person — with fiduciary professionals who are legally required to act in the customer’s best interest and follow other standards.30
While AI can process data in seconds, human advisors bring context and experience from navigating past markets and changing conditions. Financial advisors also can deliver greater personalization by asking the right questions, tailoring strategies to individual goals, and providing ongoing support as circumstances change.
Human support is also a great way to spur action toward financial goals. Empower’s “Return on Advice” survey shows that 69% of Americans believe human advice is more powerful than any algorithm.**
Whether developed independently or with professional support, having a financial plan can help provide structure and confidence for setting short-term and long-term goals and implementing them. Americans with a more detailed financial plan are about three times as likely to report greater happiness in money matters.
Read more: Financial planning for your life
The bottom line on AI and financial advice
As AI and other technologies become more common in financial discussions, understanding how different tools operate and where limitations lie is important to making informed decisions.
Human financial advisors and regulated digital advice platforms operate within a framework of legal, ethical, and fiduciary standards, meaning they are trained to understand not just questions but the entire financial picture. They are legally required to act in the client’s best interest, which is why they ask more targeted questions to deliver context, personalization, and fiduciary-based advice.
While quick answers are easy to find on AI chatbots, combining technology with regulated tools and professional insight can help ensure that information is interpreted and used in the right context.
Here are some best practices to keep in mind:
- Keep personal financial information secure. Never share account details, income specifics, or sensitive data with chatbots. Treat them like any other online platform that’s free to the public.
- Use AI for education, not execution. Let AI chatbots help you learn general investing concepts.
- Consider contacting a trusted financial advisor. A human professional can help build a secure, personalized plan that adapts as life, money, and goals change. They can provide ongoing human support and guidance that no algorithm can match.
*, ** Empower's "Return on Advice," online survey responses from 2,202 Americans, ages 18 and older, October 15-16, 2025.
1 Intuit, “The Rise of Fin-AI: Why Americans Are Trusting Generative AI With Their Wallets,” September 2025.
2 The New York Times, “They Had Money Problems. They Turned to ChatGPT for Solutions,” September 2025.
3 The New York Times, “A.I. Is Getting More Powerful, but Its Hallucinations Are Getting Worse,” May 2025.
4 The Wall Street Journal, “Is It Safe to Share Personal Information With a Chatbot?” January 2024.
5 The Wall Street Journal, “Is It Safe to Share Personal Information With a Chatbot?” January 2024.
6 Forbes, “ChatGPT-4o Is Wildly Capable, But It Could Be A Privacy Nightmare,” May 2024.
7 The Wall Street Journal, “Is It Safe to Share Personal Information With a Chatbot?” January 2024.
8 U.S. News and World Report, “Should You Be Sharing Your Financial Information with AI?” August 2025.
9 “6 ChatGPT Settings You Should Consider Changing,” October 2025.
10 The Wall Street Journal, “Is It Safe to Share Personal Information With a Chatbot?” January 2024.
11 U.S. News and World Report, “Should You Be Sharing Your Financial Information with AI?” August 2025.
12 The Wall Street Journal, “Is It Safe to Share Personal Information With a Chatbot?” January 2024.
13 The New York Times, “A.I. Is Getting More Powerful, but Its Hallucinations Are Getting Worse,” May 2025.
14 CNBC, “Gen Z, millennials are using AI for personal finance advice, report finds,” November 2024.
15 Money, “Can You Trust AI for Financial Advice? We Put ChatGPT and Gemini to the Test,” August 2025.
16 Money, “Can You Trust AI for Financial Advice? We Put ChatGPT and Gemini to the Test,” August 2025.
17 Financial Advisor, “AI Chatbots Aren't Immune To Giving Bad Money Advice, Studies Find,” November 2025.
18 The New York Times, “They Had Money Problems. They Turned to ChatGPT for Solutions,” September 2025.
19 Fortune, “Can AI replace financial advisors? Experts warn against using AI for much besides basic financial advice,” January 2025.
20 The New York Times, “They Had Money Problems. They Turned to ChatGPT for Solutions,” September 2025.
21 University of St. Gallen, “Be careful! Financial advice from AI comes with risks,” January 2025.
22 CNBC, “Gen Z, millennials are using AI for personal finance advice, report finds,” November 2024.
23 Financial Planning Association, “LLMs Can’t Be Trusted for Financial Advice,” May 2024.
24 Financial Planning Association, “Customer Trust and Satisfaction with Robo-Adviser Technology,” August 2024.
25 Financial Planning Association, “Customer Trust and Satisfaction with Robo-Adviser Technology,” August 2024.
26 Financial Planning Association, “Customer Trust and Satisfaction with Robo-Adviser Technology,” August 2024.
27 The Wall Street Journal, “Is It Safe to Share Personal Information With a Chatbot?” January 2024.
28 Forbes, “GenAI Vs. Robo-Advisors: Considerations For The Financial Industry,” August 2025.
29 Financial Planning Association, “LLMs Can’t Be Trusted for Financial Advice,” May 2024.
30 USA Today, “Financial advisor vs. financial planner,” September 2024.
RO5057778-1225
The content contained in this blog post is intended for general informational purposes only and is not meant to constitute legal, tax, accounting or investment advice. You should consult a qualified legal or tax professional regarding your specific situation. No part of this blog, nor the links contained therein is a solicitation or offer to sell securities. Compensation for freelance contributions not to exceed $1,250. Third-party data is obtained from sources believed to be reliable; however, Empower cannot guarantee the accuracy, timeliness, completeness or fitness of this data for any particular purpose. Third-party links are provided solely as a convenience and do not imply an affiliation, endorsement or approval by Empower of the contents on such third-party websites. This article is based on current events, research, and developments at the time of publication, which may change over time.
Certain sections of this blog may contain forward-looking statements that are based on our reasonable expectations, estimates, projections and assumptions. Past performance is not a guarantee of future return, nor is it indicative of future performance. Investing involves risk. The value of your investment will fluctuate and you may lose money.
Certified Financial Planner Board of Standards Inc. (CFP Board) owns the certification marks CFP®, CERTIFIED FINANCIAL PLANNER™, CFP® (with plaque design), and CFP® (with flame design) in the U.S., which it authorizes use of by individuals who successfully complete CFP Board's initial and ongoing certification requirements.