What is the IRS using AI for in 2026?

The IRS uses AI across taxpayer service, operations, compliance and fraud detection, with audit selection now part of the governance conversation.

GAO's March 2026 report gives tax practitioners the cleanest public snapshot. The IRS had 126 active artificial intelligence use cases in its June 2025 inventory. That was up from 10 use cases reported in August 2022.

GAO said most of those use cases fell into operational efficiency or tax compliance and fraud detection. It also said 77 of the 126 use cases were still in development. That means the IRS is not just running old scoring models. It is still building and changing how AI fits agency work.

For practitioners, the point is not that every return is being judged by a robot; that framing is too broad. The point is that AI is now part of the tax administration system clients may encounter.

How does IRS AI audit selection work?

Public sources show the IRS uses machine learning to identify compliance risk, especially in complex returns such as large partnerships.

The IRS has publicly discussed machine learning in large partnership compliance. IRS materials for fiscal year 2024 said the Large Partnership Compliance program would use machine learning to identify potential compliance risk in partnership tax, general income tax, accounting and international tax.

That does not mean the IRS publishes a checklist of every audit trigger, and it should not. But the broad pattern is visible: the agency is using data tools to sort large volumes of returns and direct human examiners to where they should look.

Tax practitioners should explain that distinction to clients. AI audit selection is a risk-screening process. It is not a legal finding, a fraud conclusion or proof the return is wrong.
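To make that distinction concrete, here is a deliberately toy sketch of what a risk screen looks like in principle. Every signal, weight and threshold below is hypothetical; this is not the IRS's model or anything derived from it. The point it illustrates is that a screen produces a queueing decision for human review, not a finding of liability.

```python
# Illustrative only: a toy risk screen, NOT the IRS's actual model.
# All signals, weights and thresholds are invented for illustration.

def risk_score(ret: dict) -> float:
    """Sum a few hypothetical risk signals into one score."""
    score = 0.0
    gross = ret.get("gross_income", 0) or 1  # avoid divide-by-zero
    # Unusually high deduction-to-income ratio adds risk weight.
    if ret.get("deductions", 0) / gross > 0.5:
        score += 1.0
    # Large year-over-year income swings add risk weight.
    prior = ret.get("prior_year_income", 0) or 1
    if abs(gross - prior) / prior > 0.6:
        score += 1.0
    # Complex entity types add a flat weight.
    if ret.get("entity_type") in {"large_partnership", "complex_passthrough"}:
        score += 0.5
    return score

def screen(returns: list[dict], threshold: float = 1.5) -> list[dict]:
    """Route high-scoring returns to a human examiner's queue."""
    return [r for r in returns if risk_score(r) >= threshold]

returns = [
    {"id": "A", "gross_income": 100_000, "deductions": 20_000,
     "prior_year_income": 95_000, "entity_type": "individual"},
    {"id": "B", "gross_income": 400_000, "deductions": 260_000,
     "prior_year_income": 150_000, "entity_type": "large_partnership"},
]
flagged = screen(returns)
print([r["id"] for r in flagged])  # a flag means "review", not "wrong"
```

Notice what the sketch does not do: it never concludes anything about the law or the facts. A flagged return simply enters a queue, which is why "selection is not a finding" is the accurate client message.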

Client question: Did AI say we did something wrong?
Better practitioner answer: No. Selection means the return was chosen for review, not that liability exists.
File action: Preserve the notice, return support and original workpapers.

Client question: Can we know the trigger?
Better practitioner answer: Usually not with certainty. Focus on the issues identified in the notice.
File action: Map each notice item to source documents.

Client question: Should we change next year's return?
Better practitioner answer: Only if facts, documentation or reporting positions need correction.
File action: Update organizer questions and evidence requests.

Client question: Does this change engagement risk?
Better practitioner answer: It can. More complex clients need clearer audit-response planning.
File action: Review engagement letter scope and response procedures.

Does an AI-selected audit mean the client did something wrong?

No. An AI-selected audit means a model or process helped identify a return for review. It does not prove noncompliance.

This is the client-communication problem practitioners need to solve first. Clients hear "AI audit" and may assume the IRS found something hidden. That is not how selection should be explained.

IRS policy itself recognizes the stakes. IRM 10.24.1 says AI that informs or influences whether a taxpayer will be subject to audit or what aspects of a return will be audited is a presumed high-impact use case. The policy requires governance around high-impact AI because the consequences can matter to taxpayers.

Practitioners should avoid fear-based language. A calmer message is more accurate: selection means the return deserves a documented response. The taxpayer still has rights. The practitioner still needs to match every IRS question to evidence.

What did GAO find about IRS AI risk?

GAO found rapid AI growth, incomplete inventory information, staffing pressure and no IRS workforce plan for AI skills gaps.

The staffing detail is not small. GAO reported that the IRS Research, Applied Analytics and Statistics group lost 63 employees who had been working full-time or part-time on AI. GAO also said IRS officials had not identified the skills needed to support AI or developed a plan to address the gaps.

GAO found inventory quality problems too. More than 25% of AI use cases lacked information on how the use case would benefit the agency. GAO also identified omissions, including contracted AI-enabled tools officials said helped build criminal cases.

That combination matters for tax practice. If the IRS expands AI while governance and staffing catch up, practitioners should expect uneven implementation. Some notices may be well targeted. Others may require extra patience, extra documentation and firm follow-up.

The real practitioner risk

The risk is not that AI makes every client more likely to be audited. The risk is that practitioners face a less transparent selection process while clients expect simple answers. Strong files beat guesses about the model.

Which returns deserve stronger documentation?

Returns with complexity, unusual ratios, large deductions, partnership issues or digital-asset activity deserve cleaner support before filing.

The IRS focus on high-income taxpayers, large partnerships, corporations and complex pass-through entities is not new. What is changing is the agency's ability to apply machine-learning audit selection across large data sets.

Practitioners should strengthen evidence where return facts are harder to explain quickly. That includes large partnership allocations, S corporation officer compensation, large Schedule C deductions, real estate losses, crypto transactions, international reporting, related-party activity and major year-over-year changes.

This does not mean clients should avoid legitimate deductions. It means the file should be ready before the notice arrives. If a deduction depends on business purpose, logs, receipts, contemporaneous notes or third-party statements, collect them while the facts are fresh.

What should tax practitioners change before next filing season?

Tax practitioners should update organizers, document riskier positions and prepare clients for audit selection that may involve AI screening.

Start with intake. Add questions that surface documentation gaps early: digital asset activity, large charitable gifts, home office use, vehicle expenses, owner compensation, related-party transactions and unusual income swings. Then decide which answers require supporting documents before filing.
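Firms that automate their organizers can treat that intake step as a simple mapping from answers to document requests. The sketch below is hypothetical: the topic names and document lists are examples of what a firm might define, not a standard or an IRS requirement.

```python
# Hypothetical intake-triage sketch: map organizer answers to the
# supporting documents to collect BEFORE filing. Topic names and
# document lists are illustrative, not a standard.

DOC_REQUESTS = {
    "digital_assets": ["exchange transaction history", "wallet records"],
    "home_office": ["square-footage worksheet", "utility bills"],
    "vehicle_expenses": ["mileage log"],
    "related_party": ["agreements", "payment records"],
}

def triage(answers: dict) -> list[str]:
    """Return the document requests triggered by 'yes' answers."""
    requests = []
    for topic, docs in DOC_REQUESTS.items():
        if answers.get(topic):
            requests.extend(docs)
    return requests

answers = {"digital_assets": True, "home_office": False,
           "vehicle_expenses": True, "related_party": False}
print(triage(answers))
# ['exchange transaction history', 'wallet records', 'mileage log']
```

The design choice is the point: documentation gaps surface at intake, while facts are fresh, rather than after a notice arrives.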

Next, update engagement letters and client messaging. The letter does not need to promise protection from AI selection. It should make audit-response scope clear. If the client wants representation after a notice, define how that work begins, who responds and what it costs.

Finally, train staff to talk about AI without drama. "The IRS may use analytics in selection" is enough. "The IRS AI flagged you" is usually too certain. The best client service is boring in the right way: clear records, plain explanations and a file that can answer the notice.

Fact-checked by Sydney Smart