Tax Day is Wednesday, and millions of Americans are looking for ways to make filing less stressful. This year’s temptation is artificial intelligence.

Chatbots promise quick answers, plain-English explanations and even accounting assistance — though many of them also caution that they’re neither tax accountants nor tax lawyers, and that their recommendations should thus not be taken at face value. 

Still, who really pays attention to such warnings? If inexpensive AI advice sounds good, many busy taxpayers will see it as a gift — particularly if it promises a sizable refund. In reality, AI is a gamble.

AI is a useful learning tool: it can explain deductions, summarize the difference between a credit and a refund, and help users make sense of confusing Internal Revenue Service language. That doesn't mean you can trust it to prepare your return.

Filing taxes is a matter of precision. Calculations, eligibility and personal information must all be accurate. AI may get there someday, but it isn't there yet. Rely on it at your own peril.

Researchers evaluating how well AI prepares federal tax returns found that even the best systems got only a minority of cases right. The tools made routine mistakes, including misusing tax tables, making faulty calculations, and incorrectly deciding whether someone qualified for common credits and deductions. Errors even crept into standard credits on which many families rely, including the Child Tax Credit and the Earned Income Tax Credit.

Other inquiries have reached similarly troubling conclusions. In some scenarios, chatbot-generated tax results were off by more than $2,000. 

And the privacy issue that pervades conversations about AI safety may be even more concerning than the math. To get personalized tax help from a chatbot, users are likely to provide sensitive information such as Social Security numbers, income statements, bank details, home addresses, dependent information and often documents such as W-2s or 1099s. That requires placing extraordinary trust in a consumer AI platform. Are you ordinarily that trusting?

Few people give sufficient thought to questions like how long the information is stored or whether it could be used to train future systems. Even fewer know who within a company can access their sensitive personal information or what vulnerabilities may exist. Government experts have already warned that AI systems can leak, generate or infer sensitive information. That alone should be enough to make taxpayers think twice.

Couple those concerns with the standard IRS warning about tax-season scams, phishing attempts, identity theft, and AI-assisted fraud, and the risks become clear. Tax season is Christmas for criminals actively trying to separate people from their money and their data. Is that really the moment you want to upload your sensitive data to a chatbot?

Scratch the surface, and you'll notice that most of the people promoting AI for tax use offer disclaimers to protect themselves from any harm that may come to those who heed their advice. Such warnings tell the true story: If you need a human to check your answers, what AI is providing is a rough guide at best.

Here's better advice: Learn to use AI correctly before it gets you into trouble. Ask it questions. Use it to explain technical language. Have it compile a list of issues to raise with a CPA, an enrolled agent, or a trusted tax software provider. Do not hand over your return and assume a machine will get it right. And don't upload your sensitive financial data until you're certain you can trust everyone who can access it. The IRS provides official filing help and tools on IRS.gov.

Americans should treat AI as an educator, not an authority. AI can help you understand taxes. It cannot safely prepare them. That distinction is worth remembering.

Bruce Abramson is the executive director of recruiting and partnerships of Applied Data Science at New College of Florida.