AI Consulting
Engineers solve problems; consultants sell their services. What bridges the gap?
78% of organizations are adopting AI, but most are stuck in pilot purgatory. The gap isn't technology — it's tribal imbalance and missing systems.
What organizations say: "We need AI strategy." What they mean: "We have pilots everywhere and nothing is shipping."
Consulting Psychology
Every major consultancy follows the same meta-pattern: picture the gap, sell the bridge. The sequence is always confusion → clarity → confidence → contract.
| # | Step | What the Consultant Does | Your Version |
|---|---|---|---|
| 1 | Name the pain | Frame the gap. Use data to make it urgent. | "78% adopting AI, most stuck in pilot purgatory" |
| 2 | Diagnose root cause | Show the obvious explanation is wrong | "The gap isn't technology — it's tribal imbalance" |
| 3 | Provide a framework | Make complexity legible. This is where consultants earn. | Three Tribes + Four-Layer Playbook |
| 4 | Prioritize ruthlessly | Decide what to do first. 2x2 matrices, quick wins. | Use Case Prioritization (Layer 3) |
| 5 | Package the engagement | Phases, deliverables, pricing. Low-risk entry, expand on results. | Diagnostic → Pilot → Managed Service |
The consultant's job is to be the person who makes the world legible, then charges to keep it that way. A good story does half the selling. A good presentation does the rest.
The Consulting Checklist
What the best firms actually run through:
- Presenting vs. real problem? Clients describe symptoms, not causes
- Who has power, pain, budget? Sponsor mapping
- Undeniable success in 90 days? Anchor to measurable outcomes
- Simplest explanation for why this hasn't been solved? Structural diagnosis
- 2-3 moves with disproportionate value? Prioritized use cases
- Minimum viable engagement to prove you're right? Paid diagnostic
- How does this expand into recurring revenue? Managed service / retainer
Mapping to Tight Five
The consulting playbook is an external audit of someone else's Tight Five. The sale happens when you show them which P is broken:
| Consulting Step | Tight Five Framework | What the Consultant Does |
|---|---|---|
| Name the Pain | Performance — "How do you know it's working?" | Expose the gap between claimed performance and actual outcomes |
| Diagnose Root Cause | Principles — "What truths guide you?" | Show decisions feel random because there are no shared first truths |
| Provide a Framework | Platform — "What do you control?" | Assess assets, data, tools, architecture. Show what's missing structurally. |
| Prioritize & Coordinate | Protocols — "How do you coordinate?" | Show why success doesn't compound, then install the missing operating system. |
| Package the Engagement | Players — "Who creates harmony?" | Map the Three Tribes. Show the tribal mix is unbalanced. Position yourself as the missing player. |
The deep insight: consultants sell the binding, not the elements. A client usually has all five Ps in some form. What they're missing is the tightness — the coordination that makes the five compound instead of compete.
Recursive Application
The Tight Five works at every level of the engagement:
| Level | The Five | Maps To |
|---|---|---|
| Discovery | Capture, Inquire, Design, Platform, People | Diagnostic phase — help them find their five |
| Strategic | Principles, Performance, Platform, Protocols, Players | Roadmap deliverable — help them define their five |
| Tactical | 5 Facts → 5 Questions → 5 Answers → 5 Ideas → 1 Decision | Each use case one-pager — help them act on their five |
The diagnostic demonstrates the framework's value. You use the Tight Five to assess the client, and the assessment is the sale — once they see through the lens, they can't unsee which P is broken.
Four-Layer Playbook
Layer 1: Business Alignment
Anchor every discussion in business value, not technology.
| Area | Key Questions |
|---|---|
| Objectives | What 2-3 business metrics should AI move first? |
| Scope | Which units/processes are in-scope in the next 90 days? |
| Sponsor | Who signs off, who blocks, who operates day-to-day? |
| Constraints | What data/regs/brand rules can't be violated? |
| Success | What proves this was worth it within a quarter? |
Layer 2: Readiness Diagnostic
Run in a 2-4 week diagnostic phase.
| Dimension | What to Assess |
|---|---|
| Strategy & Leadership | Named AI owner? Documented vision linked to budget? Clear principles on where AI will/won't be used? |
| Data Foundation | Accessible, permissioned data? Governance policies? Sensitivity classification? |
| Technology & Architecture | Approved platforms? API/integration patterns? MLOps capability? |
| Governance & Risk | AI risk policy? High-risk review process? Output monitoring? |
| People & Skills | Internal AI training? Champions in teams? Capacity without derailing ops? |
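The diagnostic above reduces to a per-dimension checklist score. A minimal sketch, assuming yes/no answers per question; the dimension keys mirror the table, but the question names and the sample answers are hypothetical:

```python
# Layer 2 readiness sketch: each dimension holds yes/no checklist answers.
# The question keys and this sample client's answers are made up for illustration.
diagnostic = {
    "Strategy & Leadership": {
        "named_ai_owner": True,
        "vision_linked_to_budget": False,
        "usage_principles_documented": False,
    },
    "Data Foundation": {
        "accessible_permissioned_data": True,
        "governance_policies": True,
        "sensitivity_classification": False,
    },
}

def readiness(answers: dict) -> dict:
    """Share of checklist items answered 'yes', per dimension."""
    return {dim: sum(qs.values()) / len(qs) for dim, qs in answers.items()}

for dim, pct in readiness(diagnostic).items():
    print(f"{dim}: {pct:.0%}")
```

The output is the readiness baseline the Phase 0 deliverable reports back to the client, one percentage per dimension.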
Layer 3: Use Case Prioritization
| Criterion | Question |
|---|---|
| Business Impact | Quantifiable upside in time, cost, revenue, or risk reduction? |
| Feasibility | Data availability, technical complexity, dependencies? |
| Time-to-Value | Result inside 4-12 weeks with a scoped pilot? |
| Adoption Likelihood | Clear owner, motivated team, workflow people want to improve? |
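The four criteria above can be turned into a simple weighted ranking. A sketch under assumptions: the weights, the 1-5 rating scale, and the two example use cases are all hypothetical, to be replaced with the client's own numbers:

```python
# Layer 3 prioritization sketch: weighted sum of 1-5 ratings per criterion.
# Weights are an illustrative split, not a recommendation.
CRITERIA = {
    "impact": 0.35,         # Business Impact: quantifiable upside
    "feasibility": 0.25,    # Data availability, complexity, dependencies
    "time_to_value": 0.20,  # Result inside 4-12 weeks with a scoped pilot?
    "adoption": 0.20,       # Clear owner, motivated team
}

def score(use_case: dict) -> float:
    """Weighted sum of the use case's 1-5 ratings across the four criteria."""
    return round(sum(use_case[c] * w for c, w in CRITERIA.items()), 2)

# Hypothetical candidates from a discovery workshop
use_cases = [
    {"name": "Invoice triage", "impact": 4, "feasibility": 5,
     "time_to_value": 5, "adoption": 4},
    {"name": "Churn prediction", "impact": 5, "feasibility": 2,
     "time_to_value": 2, "adoption": 3},
]

ranked = sorted(use_cases, key=score, reverse=True)
for uc in ranked:
    print(f"{uc['name']}: {score(uc)}")
```

The point of the sketch: a high-impact but low-feasibility idea (churn prediction here) ranks below a modest quick win, which is exactly the "prioritize ruthlessly" move from the meta-pattern.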
Layer 4: Delivery & Governance
| Phase | Duration | Purpose |
|---|---|---|
| Phase 0 - Diagnostic | 2-4 weeks | Readiness checklist, discover/prioritize use cases, strategy + roadmap |
| Phase 1 - Pilots | 6-12 weeks | Implement 1-3 high-ROI use cases with clear metrics and governance |
| Phase 2 - Scale | Ongoing | Expand successful pilots, build reusable agents, train teams, formalize operating model |
The Three Tribes
AI transformations fail when tribes are unbalanced.
| Tribe | Question They Ask | What They Provide |
|---|---|---|
| Explorers | "What if we tried...?" | Discover options, frontier awareness |
| Automators | "How do we operationalize this?" | Scale validated ideas, integrate systems |
| Validators | "How do we ensure quality and safety?" | Standards, compliance, trust |
The failure pattern: Explorers generate pilots → Automators can't operationalize → Validators block everything → Pilot purgatory.
The solution: Tight fives that pass through all three tribes before shipping.
The Three Tribes model is the Players P applied recursively: Explorers, Automators, and Validators each need their own tight five to ship anything, and when any tribe is missing its five you get the "pilot purgatory" failure pattern.
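The "pass through all three tribes before shipping" rule is a gate, not a vote. A minimal sketch, where the sign-off fields are hypothetical:

```python
# Shipping gate sketch: a pilot leaves purgatory only when every tribe signs off.
# The tribe names come from the table above; the sign-off dict is illustrative.
TRIBES = ("explorers", "automators", "validators")

def ready_to_ship(pilot: dict) -> bool:
    """True only when all three tribes have signed off on the pilot."""
    return all(pilot.get(tribe, False) for tribe in TRIBES)

pilot = {"explorers": True, "automators": True, "validators": False}
print(ready_to_ship(pilot))  # Validators block -> still in pilot purgatory

pilot["validators"] = True
print(ready_to_ship(pilot))  # All three tribes aligned -> ships
```

Any tribe's missing sign-off stalls the pilot, which is the failure pattern in miniature: Explorers and Automators alone produce demos, not deployments.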
Pricing Model
| Tier | Deliverable | Price Range |
|---|---|---|
| Diagnostic | Readiness baseline + prioritized use case roadmap | $5K-15K |
| Pilot | 1-3 implemented use cases with metrics and governance | $15K-50K |
| Managed Service | Ongoing AI worker operations + optimization | $3K-10K/month |
Context
- Tight Five Framework — The diagnostic lens consultants sell
- Trades Consulting — Same playbook, different industry
- Pitch Decks — Packaging the engagement for sale
- Presenting — Delivering the pitch
- Storytelling — The narrative that sells
- Pictures — Picture the gap, sell the bridge
- Cost Centres — What you're helping them fix
- Solopreneur Model — Alternative delivery model