The 10x multiplier
Individual AI use delivers 2-3x productivity gains. Team-level AI adoption delivers 10x. The difference is not the tool — it is the operating model.
When one analyst uses AI brilliantly, they are faster. When an entire deal team operates with AI-integrated workflows, the whole process accelerates — from sourcing through close. Research happens in parallel. Documents are reviewed faster. Memos are drafted and iterated in hours, not days.
This module covers how to move from "some people on the team use AI" to "AI is embedded in how the team works."
The adoption curve
Every team goes through predictable stages of AI adoption. Understanding where you are helps you plan the next step.
Stage 1: Individual Experimentation — A few people on the team try AI for ad hoc tasks. Results vary. No consistency. The team is not faster overall because knowledge stays with individuals.
Stage 2: Champion-Led Adoption — One or two "power users" emerge. They share prompts and show colleagues what works. Adoption spreads informally. This is where most financial services teams are today.
Stage 3: Workflow Integration — The team deliberately redesigns key workflows to include AI. Shared prompts, shared projects, defined handoff points. This is where the 10x multiplier kicks in.
Stage 4: Institutional Capability — AI is embedded in the firm's standard operating procedures. New hires are trained on AI workflows. The firm's prompt libraries, projects, and processes are a competitive advantage.
Where do most financial services teams get stuck?
Designing team AI workflows
The key principle: do not bolt AI onto existing workflows — redesign the workflow around AI's capabilities.
Before: Traditional Deal Due Diligence
- Associate receives CIM and data room access (Day 1)
- Associate reads CIM, takes notes (Days 1-3)
- Associate reviews data room documents (Days 3-7)
- Associate drafts DD memo (Days 7-10)
- VP reviews and provides comments (Days 10-12)
- Associate revises (Days 12-14)
- Final memo to IC (Day 14)
After: AI-Integrated Due Diligence
- Deal lead uploads CIM and key data room documents to shared AI project (Day 1)
- AI produces structured analysis of CIM: key risks, financial summary, questions for management (Day 1 — 30 minutes)
- Associate reviews AI analysis, adds judgment, identifies areas needing deeper investigation (Day 1-2)
- Associate runs targeted analysis on specific topics using AI (financials, legal, commercial) (Days 2-4)
- AI drafts DD memo from accumulated analysis (Day 4 — 1 hour)
- Associate and VP review and refine together (Days 4-5)
- Final memo to IC (Day 5)
Same quality. Roughly a third of the time. The AI did not do the thinking — it eliminated the mechanical work of reading, structuring, and drafting.
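The timeline arithmetic is worth making explicit, since it is the number you will eventually present to leadership. A quick sanity check using the day counts from the two workflows above:

```python
# Cycle-time comparison for the due diligence example above.
# Day counts are taken directly from the two workflow timelines.

traditional_days = 14   # CIM receipt (Day 1) through final memo to IC (Day 14)
ai_integrated_days = 5  # shared project upload (Day 1) through final memo (Day 5)

days_saved = traditional_days - ai_integrated_days
reduction = days_saved / traditional_days

print(f"Days saved per deal: {days_saved}")      # 9
print(f"Cycle-time reduction: {reduction:.0%}")  # 64%
```

Nine calendar days saved per deal compounds quickly across an active pipeline.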
Shared knowledge bases
When a team shares an AI project, the benefits compound. Everyone has the same context, and the AI's understanding of the deal deepens with each interaction.
How to set up a team project:
- Create a shared workspace with the key documents for your deal/mandate/project
- Set team instructions that reflect your firm's standards:
  - "We are [firm name], a [firm type]"
  - "Our investment criteria are [X, Y, Z]"
  - "Memos should follow our standard format: Executive Summary, Investment Thesis, Key Risks, Financial Analysis, Recommendation"
- Upload reference materials — your firm's memo templates, past examples of good work, relevant industry reports
- Assign roles — who updates the project, who adds new documents as they come in
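The setup steps above can be templated so every new deal project starts from the same baseline rather than ad hoc instructions. A minimal sketch in Python — the firm name, firm type, and criteria below are illustrative placeholders you would replace with your own:

```python
# Hypothetical helper for assembling shared-project instructions.
# All firm-specific values are illustrative placeholders.

MEMO_SECTIONS = [
    "Executive Summary", "Investment Thesis", "Key Risks",
    "Financial Analysis", "Recommendation",
]

def team_instructions(firm_name: str, firm_type: str, criteria: list[str]) -> str:
    """Assemble the project-level instructions described in the setup steps."""
    return "\n".join([
        f"We are {firm_name}, a {firm_type}.",
        f"Our investment criteria are: {', '.join(criteria)}.",
        "Memos should follow our standard format: " + ", ".join(MEMO_SECTIONS) + ".",
    ])

print(team_instructions(
    "Example Capital",             # placeholder firm name
    "mid-market buyout fund",      # placeholder firm type
    ["EBITDA > $10m", "recurring revenue", "fragmented market"],
))
```

Keeping this in one place means a change to the memo format propagates to every future project automatically.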
Over time, a well-maintained team project becomes a repository of institutional knowledge. New team members can ask the AI about the deal and get answers grounded in the team's actual work — not generic information.
This is particularly valuable for onboarding (new associates get up to speed in hours, not days), continuity (when team members rotate off, the project retains the context), and consistency (everyone's work references the same source materials).
Quality assurance and review
With AI producing more first drafts, the review process becomes even more important. The skill shifts from "writing the analysis" to "reviewing and improving the analysis."
The review checklist:
- Factual accuracy — Are the numbers right? Do quotes match the source document?
- Logical coherence — Does the analysis follow logically, or did the AI make unjustified leaps?
- Completeness — Did the AI cover all the topics you asked for? Did it miss anything you would expect?
- Judgment calls — Where the AI made assessments (e.g., "this risk is high"), do you agree? Would you nuance it differently?
- Tone and audience — Is the output appropriate for its intended audience (IC memo, client presentation, board update)?
AI typically gets the first 80% right. Your value is in the remaining 20% — the judgment, nuance, and institutional context that the AI cannot have. Train your team to approach AI output as a capable first draft from a talented but inexperienced analyst. It will be well-structured and thorough, but it needs your experience and judgment to be truly excellent.
When reviewing AI-generated analysis, what should you focus on most?
Measuring team AI ROI
If you cannot measure it, you cannot justify scaling it. Here is a practical framework for demonstrating AI's value to your leadership.
Tier 1: Activity metrics (easiest to measure)
- Number of team members actively using AI
- Frequency of use (daily, weekly, ad hoc)
- Types of tasks being automated
Tier 2: Efficiency metrics (most convincing)
- Time to produce DD memos (before vs. after)
- Number of deals the team can work on simultaneously
- Time from mandate to IC memo
- Hours spent on document review per deal
Tier 3: Outcome metrics (hardest to measure but most powerful)
- Deal conversion rate (if faster DD leads to more competitive bids)
- Client satisfaction (faster turnaround, higher quality deliverables)
- Revenue per professional (if the team is handling more with the same headcount)
How to run the measurement:
- Baseline: Record current metrics for 2-3 key workflows before AI integration
- Pilot: Run the AI-integrated workflow on 3-5 deals/projects
- Compare: Measure the same metrics on the pilot projects
- Present: Show the before/after comparison to leadership with specific examples
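The baseline/pilot comparison can be as simple as a table of metric deltas. A sketch, assuming you have recorded the Tier 2 metrics for both periods — the numbers here are illustrative, not benchmarks:

```python
# Before/after comparison for the pilot. All metric values are illustrative.
# Note: for dd_memo_days and doc_review_hours a decrease is the improvement;
# for concurrent_deals an increase is the improvement.

baseline = {"dd_memo_days": 14, "doc_review_hours": 40, "concurrent_deals": 3}
pilot    = {"dd_memo_days": 5,  "doc_review_hours": 12, "concurrent_deals": 5}

for metric, before in baseline.items():
    after = pilot[metric]
    change = (after - before) / before
    print(f"{metric:>18}: {before:>4} -> {after:<4} ({change:+.0%})")
```

Presenting deltas as percentages, alongside one or two concrete deal examples, is usually more persuasive to leadership than raw activity counts.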
Key takeaways
- Team adoption is 10x, not 2x — the real gains come from redesigning workflows, not individual productivity hacks.
- Redesign, don't bolt on — map your workflow, identify where AI eliminates mechanical work, and restructure around that.
- Shared projects create compounding value — team knowledge bases improve over time and accelerate onboarding.
- Review skills matter more than ever — when AI produces the first draft, your job is judgment, nuance, and quality assurance.
- Measure before and after — you need data to justify scaling AI adoption at your firm.
Your team is producing AI-assisted work. But the AI still operates in a silo — a chat window separate from your other tools. The next module covers MCP: connecting AI to your enterprise tools.
Module 6 — Knowledge Check
What is the primary reason most teams get stuck at Stage 2 (champion-led) adoption?
In the AI-integrated DD workflow, what did the AI replace?
Which ROI measurement tier is most convincing to senior leadership?