ChatGPT and other generative AI chatbots are being used for almost everything these days, from writing emails to planning a vacation to doing your taxes. That doesn’t mean it’s always a good idea.
Imagine this hypothetical scenario: Lorin, a 42-year-old market researcher who also runs his own business on the side, hates doing his taxes and doesn’t want to spend money on a tax professional. So he decides to ask ChatGPT for help with his tax returns.
He wouldn’t be the first to do so. One study by Invoice Home found that more than two in five (43%) Americans would trust AI to file their taxes over hiring a tax professional [1]. While Gen Z was more open to the idea (at 49%), older generations were also willing to trust AI with their taxes (25% of baby boomers and 18% of the Silent Generation).
But should they? In Lorin’s case, using ChatGPT would save him a significant amount of money. Here’s what Lorin may want to consider before he submits his return.
Let’s face it, the vast majority of people don’t enjoy doing their taxes. Some downright dread it. So it’s no wonder more people are turning to ChatGPT or other AI chatbots for help.
AI could be useful for certain tax-prep tasks, such as generating a list of which documents and forms you need to fill out. If you have a side hustle or second job, AI could help you locate the necessary forms to cover multiple income streams from multiple employers.
It could also help you identify common deductions and tax credits (from there, you can determine whether it makes more sense to itemize deductions or take the current standard deduction). And it may be able to flag inconsistencies or missing information for further investigation on your part.
Keep in mind that the information you see may not be up to date or accurate. For example, it might miss newly announced credits. Relying fully on AI without verifying the information yourself against a reliable source (or with a tax professional) could lead to inaccuracies or even false claims.
“ChatGPT can explain what an ETF is, but it doesn’t know your debt-to-income ratio, state tax bracket, filing status, deductions, retirement goals or risk appetite. Because its training data may stop short of the current tax year, and of the latest rate hikes, its guidance may well be stale when you hit enter,” wrote tech journalist Nelson Aguilar in an article for CNET [2].
Plus, it might deliver answers “that are biased, outdated, or just plain incorrect, all while sounding like a PhD.”
OpenAI, the company behind ChatGPT, describes AI hallucinations as “plausible but false statements generated by language models.”
This happens because AI doesn’t “understand” information; rather, it generates responses based on patterns in training data, and language models are usually designed to make guesses rather than admit uncertainty. Also, some of that training data may be biased, inaccurate or incorrect. But when the AI provides a response, it sounds authoritative, even if that information is fabricated.
That’s where another phenomenon comes in: “sycophancy,” which is the “tendency of AI models to adjust their responses to align with users’ views” and “can make ChatGPT and its ilk prioritize flattery over accuracy,” according to Axios [3].
CPA Practice Advisor cites research that reveals AI often suffers from an issue of “simplexity,” which means “it misinterprets complex concepts in an effort to make them simple.” Researchers discovered “this issue is present even in the IRS’s Interactive Tax Assistant, finding it can characterize tax laws in ways that are overly favorable to the taxpayer” [4].
In a study published in the Journal of Emerging Technologies in Accounting, common tax questions from the 2023 and 2024 tax seasons were entered into ChatGPT models 3.5 and 4 [5]. The researchers found that ChatGPT’s overall percentage of correct responses ranged from 39% to 47%. It “also provides less accurate responses to tax questions that are more common, have more complex answers, require an evaluation of taxpayers’ fact patterns, and relate to tax information determined after ChatGPT’s knowledge cutoff date.”
Another major risk is to your privacy. An AI could use your personal data to train models; it could accidentally share sensitive information in outputs or expose it through data breaches. It’s advisable to never share personally identifiable financial, health or other sensitive information with a chatbot.
“The chatbot simply can’t replace a CPA who can catch a hidden deduction worth a few hundred dollars or flag a mistake that could cost you thousands,” wrote Aguilar. “When real money, filing deadlines, and IRS penalties are on the line, call a professional, not AI. Also, be aware that anything you share with an AI chatbot will probably become part of its training data, and that includes your income, your Social Security number and your bank routing information.”
If you have a simple return, an AI can help you automate data entry and identify legitimate deductions. But it can also hallucinate data.
An AI bot isn’t necessarily great at understanding your personal tax situation and accounting for context, especially with complex returns. And it may misinterpret complex tax laws.
Say, for example, you sometimes work from home. The AI might tell you that you qualify for a home office deduction when you don’t. Or, it might tell you that your side hustle losses are fully deductible when they aren’t. If AI misclassifies deductions, you’re responsible if the IRS flags it.
The prompts you use could make a difference, too. For example, if Lorin prompts the AI to reduce his taxes, “sycophancy” could come into play, with the AI finding deductions even if they aren’t applicable to his situation.
Ultimately, you’re responsible for the accuracy of your tax return. And inaccuracies can be costly, particularly if you end up underreporting your income or overstating your deductions.
If the AI provided you with fabricated information, then you could end up unintentionally committing tax fraud — a serious offense that can come with financial penalties or criminal charges. Inaccurate or incomplete information could also increase your risk of being audited by the IRS.
That’s not to say you can never use AI in the tax-prep process. Even tax software companies are building AI chatbot assistance into their products (though they tend to be limited in functionality).
But you should always double-check your numbers and verify the truthfulness of the information. Basically, don’t take AI at its word.
We rely only on vetted sources and credible third-party reporting. For details, see our editorial ethics and guidelines.
Invoice Home [1]; CNET [2]; Axios [3]; CPA Practice Advisor [4]; Journal of Emerging Technologies in Accounting [5]
This article provides information only and should not be construed as advice. It is provided without warranty of any kind.