What did the PCAOB actually find — and what does it mean in plain language?

Audit committee chairs are turning to external auditors as their primary source for understanding how AI affects financial reporting and controls—a new advisory responsibility most auditors weren't trained for.

Here's the finding in plain language: the committee charged with overseeing financial reporting is asking the external auditor, not the CIO, to explain what AI means for the company's reporting and controls. It's a role auditors didn't expect.

Audit committees, often composed of non-technical executives, are "treading carefully" as their companies implement AI. That means relying on experienced advisors to help navigate the risks — and they're turning to their external auditors.

What does that mean in practice? Audit committees used to leave technology decisions to the CIO or CFO. Now they're asking auditors directly: "How does this AI tool affect our revenue recognition process? What new fraud risks has AI introduced? What's our audit trail?" Auditors are expected to have informed answers, not "that's an IT question."

What specific AI questions are audit committee chairs asking their auditors?

Audit committees ask three kinds of questions: how AI affects revenue recognition and fraud risk, what controls are needed, and whether AI use requires financial statement disclosure.

The questions fall into three categories: risk, controls, and disclosure. Here are real examples: "How is AI affecting our revenue recognition process?" "What new fraud risks has AI introduced to our internal controls?" "Should we disclose AI use in our financial statement footnotes?"

Audit committees aren't asking about tool features. They're asking about business implications. "Does this AI model introduce bias into customer creditworthiness decisions?" "Can we explain how the AI arrived at this number if a regulator asks?" "What happens if the vendor's AI stops working or changes mid-year?" These are governance questions, not tech support questions. They're the kind of questions that expose when an auditor hasn't thought deeply about AI in their client's operations.

The pattern is always the same: audit committees want assurance. They want to know their auditor understands AI well enough to spot problems and recommend controls. If the auditor says "I don't know much about AI," the audit committee hears: "I can't spot AI-related audit risks."

Why are audit committees going to auditors instead of their own IT or finance teams?

Audit committees need an independent risk assessment. IT teams have a conflict of interest: they chose the tools. External auditors provide the neutral governance advice that internal teams can't.

Because audit committees represent the board. They're accountable for governance and risk. IT teams own the tools they selected; that's a conflict. External auditors — paid specifically to assess risk independently — don't have that conflict.

Internal IT teams often downplay risks because they chose the AI tools. CFOs focus on cost-benefit analysis and implementation speed. But audit committees answer to shareholders. They need someone independent who can say: "this tool creates a new fraud risk you need to address." That's supposed to be the auditor.

There's also a competency gap. Many internal finance teams understand accounting. They don't understand LLM hallucinations (when AI generates false information) or prompt injection attacks (when someone manipulates an AI tool by injecting hidden commands). Audit committees assume external auditors — who audit multiple clients across industries — have seen more AI implementations and can spot patterns the internal team would miss.

What does this mean for auditors and CPA firms serving audit clients?

Auditors must develop AI governance expertise or risk losing credibility with audit committees. Firms that build this capability unlock new advisory revenue streams and stronger client relationships.

This creates both an expectation and an opportunity. Expectation: auditors are now expected to have informed opinions on how AI affects revenue recognition, inventory valuation, fraud risk, and disclosure. Opportunity: firms that build this capability can charge for it as advisory work, not just audit work.

Auditors who can't answer these questions will lose credibility. Audit committees will hire external consultants (taking revenue out of the audit firm's hands) or switch to a firm that has AI expertise. For Big Four firms, this is a capability race. For mid-size and small firms, it's an existential threat — unless they upskill fast.

On the flip side, auditors who develop AI governance expertise unlock advisory revenue. When a CFO deploys an AI revenue recognition tool, that company needs both audit oversight and advisory guidance. A firm that can provide both wins the engagement and strengthens the client relationship. AI governance becomes a standard part of the audit conversation.

How should a CPA firm prepare to be the AI resource an audit committee expects?

Identify an AI champion per office, add AI governance questions to audit planning, and build a simple client checklist to start advisory conversations naturally.

Three steps.

First, identify your AI governance champion. Pick 1–2 partners or senior managers per office who will develop deep expertise. Send them through formal AI governance training (like AICPA's new AI Skills Accelerator, covered in a related article). Make them the public face of your firm's AI capability.

Second, add AI governance questions to your audit planning and risk assessment. When scoping an engagement, ask: "Are you using AI in revenue recognition, inventory management, fraud detection, or financial forecasting?" Document the answer and use it to scope audit procedures. That shows audit committees you're thinking systematically about AI risk.

Third, build a simple AI governance checklist for clients. "Does your AI model have documented validation testing? Can you explain how it makes decisions? Is there a backup plan if the model fails?" These questions let you start advisory conversations naturally.

For firms: this isn't about becoming AI technologists. It's about auditing AI. Which parts of the AI workflow need controls? Where could AI failures create audit risk? What should an audit committee ask before approving an AI deployment? Those are audit questions, not engineering questions. Your existing audit expertise transfers directly.

Why Audit Committees Are the New AI Gatekeepers

The shift from IT to audit committee is a symptom of corporate AI governance maturing. Two years ago, IT departments owned AI decisions. Today, boards and audit committees realize AI affects financial reporting, fraud risk, and regulatory compliance — audit territory. The firms that recognize this shift and position their auditors as AI governance advisors will capture the advisory revenue that follows. The ones that don't will find audit committees turning to external consultants or switching firms.

Sources

PCAOB commentary on audit committee AI governance, financial reporting, and internal controls