What is going-concern and why does it suddenly matter for AI companies?

Going-concern asks whether a company will stay in business for 12+ months. AI introduces new doubt factors: unbudgeted infrastructure costs with unclear ROI now trigger auditor questions about long-term sustainability.

Going-concern is the audit standard question asked every year: "Will this company stay in business for the next 12 months?" If auditors detect substantial doubt, they flag it in the annual audit report. That flag matters — it moves investors, lenders, and board members.

AI is introducing a new category of going-concern risk, and auditors are starting to ask about it directly. This isn't the old playbook. Auditors have always scrutinized cash burn, debt covenants, customer concentration, market downturns. But AI is different. A company might look fine on paper — solid cash, growing revenue, signed contracts. Yet if that company has $50 million in unbudgeted AI infrastructure costs with unclear ROI, auditors now have a specific reason to question its long-term viability.

Why are CFOs under such cost pressure, and how does AI spending fit in?

52% of CFOs cite cost management as their top internal risk. They need to automate, but unbudgeted AI capex creates a tension: cost pressure drives AI adoption, while unclear ROI creates audit risk around sustainability.

In the Deloitte Q1 2026 CFO Signals survey, 52% of CFOs from billion-dollar companies cited cost management as their top internal risk. That's significant. After years of cheap capital and loose spending discipline, companies now feel margin compression. 53% said automation and technology upgrades are their most effective cost control lever. But here's the contradiction: 49% also report that pressure to invest in new technology — cloud, AI, data analytics — is itself driving cost management urgency. They need to automate. But they're terrified of the capex.

CFOs are caught between competing imperatives. They know they need AI to compete. They also know Wall Street and their boards want cost discipline. So they approve AI projects hoping for savings, but without the governance infrastructure to prove those savings will materialize. That's when auditors get nervous. And rightfully so.

What specific AI governance gaps are auditors flagging?

78% of companies lack confidence they'd pass an independent governance audit on AI within 90 days. Auditors see undocumented models, no explainability, vendor lock-in, and missing business cases connecting spend to ROI.

According to Grant Thornton's 2026 AI Impact Survey, 46% of business leaders cite AI governance and compliance barriers as the leading cause of AI projects failing to meet objectives. But the real problem is this statistic: 78% of companies lack confidence they could pass an independent governance audit on their AI program within 90 days. That's not "we need to improve." That's "we're not ready for external scrutiny."

The specific gaps auditors are seeing: AI models without documented validation testing (meaning nobody verified the AI actually works), lack of explainability for high-stakes decisions (credit approvals, hiring, pricing), vendor lock-in with no backup plan, and no clear business case connecting AI spend to projected revenue or cost savings. No single gap triggers going-concern doubt. Together, they signal something worse: a company's decision-making infrastructure isn't mature enough to sustain a major technology investment.

How does an AI governance gap become a going-concern issue?

Auditors see major AI spend, ask for a business case, find none, then escalate from financial risk to sustainability doubt. If management can't show a credible plan to course-correct if AI underperforms, auditors document substantial doubt.

The path is straightforward. Auditors see a company has $30+ million in AI infrastructure spend projected over 18 months. They ask for the business case — documented assumptions about ROI, payback period, sensitivity analysis. The CFO can't produce one, not because it was misplaced, but because the company never built the governance to do rigorous scenario planning around AI.
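The scenario planning auditors expect is not exotic. A minimal sketch of what a business case's payback and sensitivity analysis might contain follows; every figure here is an illustrative assumption, not data from any survey cited in this article.

```python
# Hypothetical AI business case sketch: payback period plus a simple
# sensitivity analysis. All figures are illustrative assumptions.

def payback_years(upfront_cost, annual_net_benefit):
    """Years needed to recover an upfront investment from annual net benefits."""
    if annual_net_benefit <= 0:
        return float("inf")  # the project never pays back
    return upfront_cost / annual_net_benefit

upfront = 30_000_000       # assumed 18-month AI infrastructure spend
base_benefit = 10_000_000  # assumed annual savings plus revenue uplift

# Sensitivity: how does payback stretch if benefits come in below plan?
for realized in (1.0, 0.75, 0.5, 0.25):
    years = payback_years(upfront, base_benefit * realized)
    print(f"{realized:>4.0%} of plan -> payback in {years:.1f} years")
```

Even a table this simple answers the auditor's core question: if the AI projects underperform, how long can the company fund the gap?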

Next, auditors escalate. From "this is a financial reporting risk" to "this looks like material cash burn with unclear return. Does management have a plan to course-correct if AI projects underperform?" If the auditor doesn't see a plausible answer, they document substantial doubt about the company's ability to fund operations if the AI strategy doesn't deliver.

In practice, it usually doesn't reach a public going-concern qualification. Instead, auditors tell the CFO and board: "We're seeing gaps in your AI governance. Until you address them, we're raising this as a material weakness in internal controls and revisiting it at every interim audit." That pressure forces the conversation about AI sustainability to the board level — which is exactly where it should be.

What does a company with mature AI governance look like to an auditor?

74% of companies with mature AI governance have strong audit confidence (vs. 22% without). Maturity means: governance committee, clear project criteria, vendor due diligence, quarterly ROI reviews, and documented escalation for underperformance.

The Grant Thornton survey shows a confidence gap of more than 3:1: 74% of companies with mature AI governance were confident they could pass an independent audit, versus only 22% of those without a governance structure. What maturity looks like in audit terms: a documented AI governance committee with C-suite sponsorship, clear decision criteria for which AI projects get greenlit, rigorous vendor due diligence, quarterly business case reviews comparing actual AI spending to projected ROI, a remediation plan if gaps are found, and a clear escalation path if an AI project is underperforming its business case.

Does this require a separate AI governance team? No. It can live within existing risk management, finance, and technology infrastructure. But it has to be documented, repeatable, and visible to auditors. That's the difference between "we're thinking about AI governance" and "we have the infrastructure to sustain an AI investment."

If auditors are questioning sustainability, what should a board do right now?

Require business cases for AI projects over $5M, assign executive accountability to each project, commission a governance audit, and update the audit committee charter to include AI oversight.

Four steps.

First, stop approving AI projects without a documented business case. Every AI investment over $5 million should have clear assumptions: customer acquisition cost reduction by X%, operational efficiency gain of Y%, or revenue uplift of Z%. If the CFO can't articulate that, the project isn't ready.

Second, assign a single business owner to each AI project — someone whose compensation is tied to hitting ROI targets. That creates accountability and gives auditors someone to talk to about course correction if the project underperforms.

Third, conduct a governance audit on your existing AI portfolio. Deloitte, Grant Thornton, and the Big Four all offer this. You want to know your gaps before auditors find them in the annual audit.

Fourth, update your audit committee charter. Be explicit about AI governance oversight. That signals to auditors and the board that you're taking this seriously as a business continuity issue, not just a technology initiative.
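The first two steps can be expressed as a pre-approval gate. Here is a hypothetical check a finance team might apply before greenlighting a project; the $5 million threshold comes from this article, but every field name and figure is an illustrative assumption.

```python
# Hypothetical pre-approval gate for an AI business case.
# The $5M threshold is from the article; everything else is illustrative.

APPROVAL_THRESHOLD = 5_000_000  # full business case required above this spend

def business_case_ready(spend, cac_reduction_pct, efficiency_gain_pct,
                        revenue_uplift_pct, owner):
    """A project over the threshold needs at least one documented ROI
    assumption and a single accountable business owner."""
    if spend <= APPROVAL_THRESHOLD:
        return True  # below threshold: lighter-weight review applies
    has_assumptions = any(pct > 0 for pct in
                          (cac_reduction_pct, efficiency_gain_pct,
                           revenue_uplift_pct))
    return has_assumptions and owner is not None

# A $12M project with a 6% efficiency assumption and a named owner passes;
# the same project with no documented assumptions does not.
print(business_case_ready(12_000_000, 0, 6, 0, "VP Operations"))  # True
print(business_case_ready(12_000_000, 0, 0, 0, "VP Operations"))  # False
```

The point is not the code itself but the discipline it encodes: no documented assumptions and no named owner means no approval.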

Why Going-Concern Questions About AI Aren't Alarmism — They're Prudent Business Practice

The shift from "AI is a cost" to "AI sustainability is a going-concern factor" represents a healthy maturation in how boards and auditors think about technology risk. It's not that AI will disappear or fail as a category. It's that a company spending 15% of profit on an AI infrastructure build needs the same level of financial discipline and governance that a company building a new manufacturing plant would need. If you can't defend the capex decision, you can't defend the company's long-term viability to shareholders and lenders. Auditors asking that question now—when AI spending is still somewhat discretionary—protects companies from the much harder conversations that will happen if AI projects underperform and cash dries up. The companies that will thrive in the AI era are the ones that treat it as a business decision, not just a technology bet.
