The CFO's AI Challenge
You're being asked to approve AI investments you can't fully evaluate, track spending you can't see, and demonstrate ROI you can't measure. The business case is "AI will transform everything." But your job is to govern capital—not fund experiments indefinitely.
The Questions You Can't Answer
- "How much are we actually spending on AI across the organization?"
- "What's the ROI on our AI investments?"
- "Which AI projects should we fund vs. kill?"
- "How do we evaluate AI budget requests we don't understand?"
- "What should I tell the board about our AI strategy?"
The 5 Questions CFOs Must Ask
- What specific business problem does this solve? — If they can't articulate the problem clearly, stop there.
- What's the measurable outcome? — "Improve efficiency" isn't acceptable. Require quantified targets.
- What's the total cost of ownership? — Include: development, infrastructure, data, talent, maintenance, change management.
- What's the exit strategy? — If it doesn't work, how do we stop? What's already committed?
- Who's accountable? — Technical success ≠ business success. Who owns the business outcome?
Hidden AI Costs Most CFOs Miss
AI spending is notoriously scattered. Look for costs hiding in:
| Hidden Cost Category | Where It Hides | Typical % of True Cost |
|---|---|---|
| Cloud/Infrastructure | IT budget, buried in AWS/Azure/GCP | 20-40% |
| Data Engineering | Data team budget, not tagged to AI | 30-50% |
| Shadow AI | Departmental budgets, credit cards | 10-30% |
| Talent | Hiring, contractors, training | 30-50% |
| Change Management | Often unfunded or underestimated | 10-20% |
| Maintenance | Ongoing ops, model retraining | 20-40% of build cost, annually |
CFO Quick Win: The AI Cost Audit
Pull together finance, IT, and procurement for a 2-hour session. Ask:
- What vendors have "AI" or "ML" in their contracts?
- What cloud resources are tagged as AI/ML workloads?
- What headcount is dedicated to AI projects?
- What AI tools are being expensed on corporate cards?
You'll likely discover you're spending 2-3x what anyone thought.
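The audit roll-up above can be sketched in a few lines: sum the visible spend surfaced by the four questions, then bracket the likely true cost with the 2-3x rule of thumb. This is an illustrative sketch; the function name, category keys, and dollar figures are placeholders, not audit standards.

```python
def estimate_true_ai_spend(visible_spend: dict[str, float],
                           low_multiple: float = 2.0,
                           high_multiple: float = 3.0) -> dict[str, float]:
    """Sum visible AI spend and bracket true spend at 2-3x (rule of thumb)."""
    visible_total = sum(visible_spend.values())
    return {
        "visible_total": visible_total,
        "true_low": visible_total * low_multiple,
        "true_high": visible_total * high_multiple,
    }

# Hypothetical findings from the four audit questions, in dollars
audit = {
    "ai_vendor_contracts": 400_000,
    "tagged_cloud_ml_workloads": 250_000,
    "dedicated_ai_headcount": 600_000,
    "expensed_ai_tools": 50_000,
}
print(estimate_true_ai_spend(audit))
```

Even this crude bracket is useful in the audit session: it turns "we think we spend about X" into a defensible range to pressure-test against the hidden-cost categories above.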
AI ROI: What Actually Works
Stage-Appropriate Expectations
AI ROI depends heavily on maturity stage:
- Exploration (Year 1): Expect learning, not returns. Budget as R&D.
- Pilot (Year 1-2): Prove feasibility. Measure technical success, not business ROI.
- Production (Year 2+): NOW measure business outcomes. Expect 3:1 ROI minimum.
- Scale (Year 3+): Economies of scale kick in. ROI should accelerate.
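The stage rules above reduce to a simple gate: before production, judge on learning and feasibility; from production onward, require at least a 3:1 return. A minimal sketch, assuming these stage names and the 3:1 threshold from the list (the function itself is illustrative, not a standard):

```python
def meets_roi_bar(stage: str, annual_benefit: float, annual_cost: float) -> bool:
    """Apply the stage-appropriate ROI expectation to one initiative."""
    pre_roi_stages = {"exploration", "pilot"}  # judged on learning/feasibility
    if stage.lower() in pre_roi_stages:
        return True  # no business-ROI requirement yet; budget as R&D
    if annual_cost <= 0:
        raise ValueError("annual_cost must be positive")
    return annual_benefit / annual_cost >= 3.0  # 3:1 minimum at production+
```

For example, a production initiative returning $900K on $300K of annual cost just clears the bar, while the same benefit at scale-stage costs of $400K would not.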
ROI Red Flags
- Promises of ROI before any pilot is complete
- Inability to define success metrics before starting
- "Strategic value" without any quantification
- Perpetual pilot status (pilot purgatory)
- No comparison to non-AI alternatives
Board Communication Framework
When presenting AI to the board, structure around:
- Investment Summary: Total spend, allocation by initiative, trend
- Value Delivered: Quantified outcomes from production AI
- Risk Profile: Key risks and mitigation status
- Portfolio View: Pipeline of explore/pilot/production initiatives
- Forward View: Investment asks and expected outcomes
The One Slide That Works
Show a simple 2x2 matrix:
- X-axis: Investment level (low to high)
- Y-axis: Value delivered (low to high)
- Plot each AI initiative as a bubble (size = spend)
Boards immediately see: which bets are paying off, which are still developing, and which should be questioned.
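The one-slide view can be assembled mechanically: classify each initiative by comparing its spend and delivered value to the portfolio medians (bubble size would be spend on the actual chart). A sketch under assumed field names; the quadrant labels are illustrative, not board-reporting conventions.

```python
from statistics import median

def quadrant_view(initiatives: list[dict]) -> dict[str, str]:
    """Map initiative name -> 2x2 quadrant relative to portfolio medians.

    Each initiative: {"name": str, "spend": float, "value": float}.
    """
    spend_cut = median(i["spend"] for i in initiatives)
    value_cut = median(i["value"] for i in initiatives)
    labels = {
        (False, True): "quiet winner",       # low spend, high value
        (True, True): "flagship",            # high spend, high value
        (False, False): "early/developing",  # low spend, low value
        (True, False): "question this",      # high spend, low value
    }
    return {
        i["name"]: labels[(i["spend"] > spend_cut, i["value"] > value_cut)]
        for i in initiatives
    }
```

Using medians as the cut lines keeps the matrix honest: half the portfolio always lands on each side of each axis, so a crowded "paying off" quadrant can't be manufactured by scaling the axes.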
Governing AI Spend
Create Visibility
- Mandate AI cost tagging in cloud/procurement systems
- Require AI project registration with finance
- Quarterly AI spend reviews with IT/business leaders
Create Accountability
- Every AI project needs a business owner (not just tech)
- Define success metrics before funding
- Stage-gate funding: seed → pilot → production
- Kill criteria: when do we stop if it's not working?
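The accountability rules above amount to a funding-gate checklist, which can be encoded so no project advances without an owner, predefined metrics, and kill criteria. A minimal sketch; the field names are assumptions, not a standard schema.

```python
def can_advance(project: dict) -> tuple[bool, list[str]]:
    """Return (ok, blocking issues) for a stage-gate funding review."""
    issues = []
    if not project.get("business_owner"):       # a business owner, not just tech
        issues.append("no business owner named")
    if not project.get("success_metrics"):      # defined before funding
        issues.append("success metrics not defined before funding")
    if not project.get("kill_criteria"):        # when do we stop?
        issues.append("no kill criteria (risk of pilot purgatory)")
    return (len(issues) == 0, issues)
```

Returning the list of blockers, rather than a bare yes/no, gives the investment committee a concrete remediation agenda for each rejected request.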
Create Governance
- AI investment committee (Finance + Tech + Business)
- Standard business case template for AI
- Post-implementation reviews for AI projects