Why business cases fail
Nearly half of enterprise leaders say proving business value is the #1 hurdle for AI adoption. The technology works. The tools are available. But the internal conversation stalls because no one has made a clear, credible case for what AI will deliver and what it will cost.
Most AI business cases fail for one of three reasons:
- Too vague: "AI will make us more productive" is not a business case.
- Too ambitious: "AI will transform everything" scares people and sounds like hype.
- No baseline: Without knowing how long things take today, you cannot demonstrate improvement.
This module gives you the framework to build a business case that gets funded.
The most effective AI business cases work on three levels simultaneously. Start with what is easy to measure, then build toward what is most powerful.
Tier 1 — Action counts (easy: start here)
Measure adoption itself:
- How many people are using AI tools?
- How frequently? Daily, weekly, ad hoc?
- What tasks are they using it for?
Why this matters: It demonstrates demand. If 80% of your analysts are already using AI informally, there is a clear argument for providing sanctioned tools with proper controls.
Tier 2 — Workflow efficiency (most convincing)
Measure time savings on specific workflows:
| Workflow | Before AI | After AI | Time saved |
|---|---|---|---|
| DD memo (first draft) | 5 days | 2 days | 60% |
| Earnings call summary | 3 hours | 30 minutes | 83% |
| Comparable company analysis | 4 hours | 1 hour | 75% |
| Quarterly LP report (first draft) | 2 days | 4 hours | 75% |
| Regulatory filing review | 1 day | 2 hours | 75% |
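The savings percentages in the table above are simply the reduction from the before-time to the after-time. A minimal sketch of the arithmetic (the helper function and the 8-hour working day are illustrative assumptions; durations are taken from the table):

```python
def time_saved_pct(before_hours: float, after_hours: float) -> int:
    """Percentage reduction in time for a workflow after adopting AI."""
    return round((before_hours - after_hours) / before_hours * 100)

# Durations from the table above, converted to hours.
# Assumption: 1 working day = 8 hours (the percentages are ratio-based,
# so the day length does not change the result for day-to-day comparisons).
workflows = {
    "DD memo (first draft)": (5 * 8, 2 * 8),
    "Earnings call summary": (3, 0.5),
    "Comparable company analysis": (4, 1),
    "Quarterly LP report (first draft)": (2 * 8, 4),
    "Regulatory filing review": (8, 2),
}

for name, (before, after) in workflows.items():
    print(f"{name}: {time_saved_pct(before, after)}% time saved")
```

Running this reproduces the table's figures (60%, 83%, 75%, 75%, 75%), which is a useful sanity check before the numbers go into a deck.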
Why this matters: Time savings translate directly to capacity. If your deal team can produce DD memos in 2 days instead of 5, they can work on more deals simultaneously — or submit faster and win more mandates.
Tier 3 — Business outcomes (most powerful)
Connect efficiency to revenue and competitive advantage:
- Faster deal execution → more competitive bids → higher win rate
- Higher throughput → more deals with the same headcount → better revenue per professional
- Better quality → fewer errors, more thorough analysis → better investment decisions
- Client satisfaction → faster turnaround → stronger relationships and retention
Why this matters: This is the language your leadership speaks. But these metrics are harder to isolate, so you need Tier 2 data as the foundation.
Calculating the cost
A credible business case includes costs as well as benefits.
Direct costs:
- Enterprise AI licences: Typically $20-60 per user per month for enterprise tiers
- Implementation support: Internal or external resources to set up workflows, train teams
- MCP integration: If connecting to enterprise systems, factor in development time
- Training: Time invested in getting the team proficient
Total cost example:
| Cost Item | Year 1 | Ongoing |
|---|---|---|
| Enterprise AI licences (50 users × $40/month) | $24,000 | $24,000 |
| Implementation and workflow design | $30,000 | — |
| Training (internal time) | $10,000 | $5,000 |
| MCP integration (2 systems) | $20,000 | $5,000 |
| Total | $84,000 | $34,000 |
For context, one senior associate at a major financial services firm costs the firm $200,000-$400,000 per year fully loaded. If AI frees 20% of that person's time, that is $40,000-$80,000 of recovered capacity per year, which alone covers the programme's ongoing cost. And you are deploying it across 50 people.
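The payback arithmetic can be sketched in a few lines. All figures below are the illustrative ones from this section, not benchmarks; `loaded_cost` uses the low end of the $200,000-$400,000 range and the 20% time-saving is an assumption:

```python
# Illustrative figures from the cost table above (assumptions, not benchmarks).
users = 50
year1_cost = 84_000    # licences + implementation + training + MCP integration
ongoing_cost = 34_000  # annual run-rate from year 2 onwards

loaded_cost = 200_000  # low end of the fully loaded senior associate cost
time_saved = 0.20      # assume AI frees 20% of one person's time

value_per_person = loaded_cost * time_saved  # $40,000 of capacity per year

print(f"Value of 20% of one associate's time: ${value_per_person:,.0f}")
print(f"Year 1 break-even: {year1_cost / value_per_person:.1f} people's savings")
print(f"Ongoing break-even: {ongoing_cost / value_per_person:.2f} people's savings")
```

Under these assumptions, roughly two people's recovered time covers the entire year-1 cost, and less than one person's covers the ongoing run-rate, before counting the other 48 users.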
The executive presentation
When you present the business case, structure it as follows:
Slide 1: The Opportunity — "Our competitors are using AI to execute deals faster. Firms with embedded AI workflows report 3x higher ROI on their AI investments. We are currently at Stage 2 (individual experimentation) — organised adoption would move us to Stage 3."
Slide 2: What We Tested — "We ran a pilot with [team name] on [specific workflow]. Here are the results." Before: [time/cost]. After: [time/cost]. Sample output: [show an actual example of AI-assisted work product].
Slide 3: The Proposal — Scope: Deploy enterprise AI to [N teams / N users]. Cost: [$X Year 1, $Y ongoing]. Expected return: [Z% time savings on named workflows]. Timeline: [pilot — expand — full deployment].
Slide 4: Risk and Governance — "We have designed a governance framework (see Module 4) that addresses data privacy, compliance, and model risk." Enterprise tier = no training on our data. Use case classification = appropriate oversight by risk level. Audit trails = full compliance visibility.
Slide 5: Ask — Specific budget, timeline, and next step.
Designing a pilot
Do not propose organisation-wide deployment from day one. A focused pilot is more credible and lower risk.
The ideal pilot:
- One team (5-10 people) that is motivated and has a clear, measurable workflow
- One workflow that is document-heavy, repetitive, and high-value (DD is often the best candidate)
- 4-6 weeks — long enough to see real results, short enough to maintain momentum
- Clear metrics defined before the pilot starts
Pilot selection criteria — choose a workflow that is:
- Document-heavy (AI's biggest strength)
- Time-consuming (large potential savings)
- Repeatable (can measure before and after)
- Not client-facing initially (lower risk for errors)
Avoid workflows that are:
- One-off projects (cannot measure improvement)
- Highly regulated initially (keep governance simple for the pilot)
After the pilot, present results to leadership with:
- Specific time savings and before/after data
- Examples of actual work product (redacted as needed)
- Team feedback on quality and experience
- A proposed expansion plan
Change management
The best AI tools fail without adoption. Here is what drives adoption and what kills it.
What drives adoption:
- Executive sponsorship: When leadership visibly uses and advocates for AI, teams follow
- Early wins: Small, tangible successes that people can see and reference
- Peer champions: Train 2-3 enthusiastic users intensively, then let them mentor others (the "flight instructor" model)
- Remove friction: Pre-configured tools, shared prompt libraries, clear permissions
What kills adoption:
- Mandates without support: Telling people to "use AI" without training or workflows
- Fear: People worry AI will replace them. Address this directly — AI replaces tasks, not roles
- Bad first experiences: If someone's first try with AI produces garbage, they will not try again. Start them with a use case that works well.
- Governance paralysis: Overly restrictive policies that make AI harder to use than not using it
Key takeaways
- Start with Tier 2 metrics — time saved on specific workflows is the most convincing evidence.
- Calculate costs honestly — licences, implementation, training. Then show ROI that makes the cost trivial.
- Run a focused pilot — one team, one workflow, clear metrics, 4-6 weeks.
- Present with data — before/after comparisons, actual examples, specific dollar figures.
- Manage the change — executive sponsorship, peer champions, early wins.
You have completed the Practitioner tier. The Advanced tier begins with AI agents for deal lifecycle automation.
Module 8 — Knowledge Check
When presenting an AI business case to senior leadership, which tier of evidence should you lead with?
What is the strongest argument for AI investment ROI?
What is the ideal scope for an AI pilot?