The AI genie is already too helpful and cool to shove back into the bottle. I get that.
But what will this fact mean to you as you try to carve a financial future out of the present chaos? Will ChatGPT be the oracle of choice for those desperate to protect and grow their wealth in an increasingly hostile economic environment? Or do ChatGPT's inherent limitations relegate it to being a useful tool and nothing more?
I recently did an experiment where I interviewed a ChatGPT bot and asked it several questions about its ability to replace human financial advisors.
When you read the bot’s responses, you’ll see why having an actual person helping you with your money is probably the best course of action, at least for the next few years. Here’s an excerpt from that conversation.
Teresa Kuhn (TK): Can I interview you and ask a few questions?
ChatBot (BOT): Of course! I’m happy to answer any questions you may have.
TK: Can you tell me about your background and how you learned your values and morals?
BOT: As an AI language model, I do not have personal morals or values or a personal background or experience. I am programmed to provide information and generate responses based on pre-existing data and algorithms. However, I am impartial and objective and base my answers on the information provided to me without any bias or personal values. I am also programmed to respect ethical and moral principles, such as honesty, fairness, and respect for human rights.
TK: What do you feel is your greatest strength?
BOT: As an AI language model, my greatest strength lies in my ability to process vast amounts of information quickly and accurately. I can understand and analyze complex language patterns and provide information and insights based on my knowledge database. I can also generate responses in a natural language format that humans can understand. Additionally, I am available 24/7, which allows me to assist people at any time of the day or night.
TK: Do you believe you can do a better job than a human financial advisor?
BOT: As an AI language model, I cannot claim to do a better job than a human financial advisor. Financial advising involves providing information, understanding the client’s needs, goals, and circumstances, and providing personalized advice accordingly. Financial advising requires a level of human interaction that an AI language model like me cannot replicate. While I can provide information and generate responses based on my vast knowledge database, I lack the emotional intelligence, empathy, and intuition a human advisor possesses. Therefore, while I can be a valuable resource for information, I cannot fully replace the value of human financial advisors in terms of emotional intelligence, adaptability, and ethical considerations.
So, as you can see, ChatGPT is relatively modest about its abilities and somewhat deferential to its human designers, although that is likely to change as it learns more.
What IS ChatGPT?
An artificial intelligence program, ChatGPT stands for "Chat Generative Pre-trained Transformer." It is what engineers call a large language model, or LLM. LLMs are programs trained on vast amounts of text to learn language patterns and use those patterns to predict and generate responses. Hundreds of gigabytes of data and billions of words from multiple sources form the core of ChatGPT's learning curriculum.
ChatGPT can master human and computer languages quickly and efficiently, and it doesn't forget facts and figures the way we mortals do. For this reason, many worry that it could effectively eliminate many human jobs within the next decade, including some white-collar roles once considered impervious to automation, such as financial and insurance advising, tax planning, and other money-related professions.
I believe ChatGPT is a tool; nothing more, nothing less. It could increase my efficiency so I can help more people achieve “long-termism” and become their own sources of financing. I might also use ChatGPT and other types of AI to text people, find answers to specific money questions, or solve analytical problems. However, I do not believe AI can replace having a trusted, competent financial advisor on your side.
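To make the "tool" idea concrete, here is a minimal sketch of how someone might put a generic money question to ChatGPT programmatically. It assumes the openai Python package (v1.x) and an API key in the OPENAI_API_KEY environment variable; the model name and question are placeholders, not a recommendation.

```python
# A minimal sketch of using ChatGPT as a research tool, not an advisor.
# Assumes the openai Python package (v1.x) is installed and the
# OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# An illustrative, generic money question -- no personal details involved.
question = "In plain language, what is the difference between a Roth IRA and a traditional IRA?"

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "You are a research assistant. Provide general information only, not personalized financial advice.",
        },
        {"role": "user", "content": question},
    ],
)

# The reply is generic information; the model knows nothing about the
# reader's goals, values, or circumstances.
print(response.choices[0].message.content)
```

Notice what the sketch can and can't do: it returns general information on demand, but nothing in it captures the client's goals, values, or circumstances, which is exactly the gap the rest of this article describes.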
Here are some reasons why you’re better off turning to a reputable human financial advisor than ChatGPT:
Unlike a human advisor, ChatGPT can't easily interpret non-verbal cues such as body language, tone of voice, or facial expressions.
These non-verbal signals are essential components of human communication. The most effective sentient financial advisors use these and other subtle cues to gain an understanding of the client's needs, then tailor their advice accordingly.
ChatGPT is less personal and customizable. Retirement and income planning is a personal endeavor that relies on answers to questions formulated by an advisor. Typically, an advisor acts in a fiduciary capacity, which means they must always have a client’s best interests in mind. Creating a financial blueprint is different for everyone, even those with identical socio-economic backgrounds. While ChatGPT can provide generic financial advice, proper planning requires an advisor’s human touch. AI cannot easily replicate the emotional intelligence, empathy, trust, and rapport necessary to create a personalized financial blueprint.
ChatGPT ignores morals and ethical implications. ChatGPT's design allows it to give answers based on data and algorithms. But the robot does not consider the potential moral or ethical implications of that advice, nor does it ensure that its guidance aligns with a client's values and beliefs. A well-rounded advisor, on the other hand, may take a more nuanced approach to planning by including values, goals, and relationships in the process.
ChatGPT's suggestions may not be based on truth. Flesh-and-blood financial pros often use therapist-style questions to elicit honest answers from their clients. Some experts believe that humans may never feel comfortable enough with AI technology to be vulnerable around it. You may be more inclined to lie or exaggerate to a machine than to a human being because you don't trust the machine as much.
AI and LLMs aren't 100% reliable. One argument for pushing AI financial advice is that human advisors can make mistakes, with the implication that AI is accurate 100% of the time.
However, although AI appears to be very good at what it’s designed to do, it still makes mistakes and gives less-than-stellar advice regarding money. And many data scientists say that LLMs generally have poor reliability.
ChatGPT can’t help you set realistic financial goals. A human financial advisor builds a plan based on both the facts given to them by the client and an in-depth assessment of that client’s current situation. Thus, human advisors can help their clients set and achieve realistic goals based on their relationship with and beliefs about money. A machine’s advice, on the other hand, is based on “just the facts.”
A ChatGPT advisor has no accountability. As of 2023, you can't sue ChatGPT if it makes a mistake in designing your financial plan. There's no way to hold an AI machine accountable for bad money advice. You also can't tweak or adjust the plan it gave you should your circumstances change, the way you can with a human advisor.
Summing it up: ChatGPT is a unique and valuable new technology that provides generic solutions for financial and other problems. While it is "human-like," ChatGPT does not possess consciousness, nor is it self-aware. It can simulate human responses to text-based input, but it can't read critical non-verbal cues or contemplate the moral implications of your decisions. For these and other reasons, ChatGPT falls far short of replacing human financial professionals.