Market Validation Meta-Skill
Overview
Use this meta-skill to validate or audit market claims. It supports both pre-plan field validation and post-draft evidence review so market logic is grounded in real customer signal rather than narrative convenience.
Use When
- Use before writing when market assumptions still need field validation.
- Use after drafting when market claims need an evidence audit.
- Use when the plan's credibility depends on proving demand and customer behaviour.
Do Not Use When
- Do not use to launder speculation into “validated” language.
- Do not treat desk research alone as customer validation.
- Do not keep validating forever when a clear decision can already be made.
Required Inputs
- Business idea, offer, and target-customer assumptions
- Existing market evidence, customer conversations, or draft claims
- Country, sector, and channel context where behaviour matters
- Adjacent market, target-market, and sales sections where consistency matters
Workflow
- Decide whether the task is pre-plan validation or post-plan auditing.
- Identify the market assumptions or claims that matter most.
- Gather or test evidence against those claims.
- Distinguish validated findings from hypotheses and weak signals.
- Reconcile the results with the plan's narrative and numbers.
- Flag unsupported claims that should be revised or removed.
Quality Bar
- The output clearly separates evidence from assumption.
- Validation work is targeted at decisions that matter.
- Weak claims are surfaced rather than buried.
- Findings improve the plan's credibility and focus.
Anti-Patterns
- Using anecdote as proof of market demand.
- Auditing market claims without checking their financial implications.
- Equating interest with purchasing behaviour.
- Leaving unsupported claims in place because they “sound strategic”.
Outputs
- A validation plan, evidence audit, or market-claim review
- Clear distinction between validated facts and open assumptions
- Recommended revisions or next tests
When to Use
Mode A Pre-Plan Field Validation: Before writing the business plan. Use when the entrepreneur has an idea but hasn't yet validated it with real customers. Guides systematic assumption-testing.
Mode B Post-Plan Claim Auditing: After sections 04-07 are complete. Reviews market claims and flags unsupported assertions before investors do.
Both modes can be used sequentially: validate first, write the plan, then audit the plan.
Mode A: Pre-Plan Field Validation
Core Philosophy
"A startup is a temporary organisation in search of a scalable, repeatable, profitable business model" (Blank & Dorf, 2012). Business plans are collections of untested hypotheses. Customer Development converts hypotheses into facts through systematic testing.
The first of the 14 rules: there are no facts inside your building, so get outside.
Step 1: Classify the Venture's Problem Recognition Level
Before designing validation activities, assess where target customers sit on the Problem Recognition Scale (Blank & Dorf, 2012):
| Level | Customer State | Validation Approach |
|---|---|---|
| Latent | Have the problem but don't know it | Education-first; validate that the problem exists |
| Passive | Know the problem but aren't motivated to change | Validate pain severity; quantify cost of inaction |
| Active | Searching for a solution with a timetable | Validate solution fit; test willingness to pay |
| Vision | Have cobbled together a workaround | Validate that your solution is better than their hack |
Step 2: Identify Earlyvangelists
Find customers with all five characteristics (Blank & Dorf, 2012):
- They have a problem or need
- They understand they have a problem
- They're actively searching for a solution with a timetable
- The problem is so painful they've cobbled together an interim solution
- They've committed or can quickly acquire budget to purchase
Step 3: Map Stakeholders
Use three concentric rings (Alam):
- Target: 1-3 primary beneficiaries/users
- Connected: payers, implementers, gatekeepers who directly influence
- Influenced: community, regulators, adjacent businesses indirectly affected
Step 4: Conduct Empathy-Based Research
Follow the 8-category interview guide (Alam): Introduction → Jobs to Be Done → Customers → Challenges → Aspirations → Stories → Emotions → Conclusion.
Key engagement rules:
- Ask "why?" repeatedly
- Encourage stories ("Tell me about the last time...")
- Look for inconsistencies between words and actions
- Embrace silence; don't fill pauses
- Never suggest solutions during the interview
Build Empathy Maps: Observations → Interpretations → Insights. See references/empathy-validation-tools.md.
Step 5: Apply Rapid Validation
Use Kagan's Golden Rule: Find 3 paying customers in 48 hours.
Three validation methods:
- Direct preselling: use the LOT framework (Listen-Options-Transition)
- Marketplaces: post on Facebook Marketplace, local forums, WhatsApp groups
- Landing pages: a simple page with a price and a buy button
Structure offers using the Price + Benefit + Time formula:
"For [price], I will [benefit] in [time]."
When rejected, use the 4-question script: "Why not?", "Who else?", "What would make it a no-brainer?", "What would you pay?"
See references/rapid-validation-methods.md.
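The Price + Benefit + Time formula is a fill-in-the-blanks template, so it can be rendered mechanically. A minimal sketch (the function name and the example offer are my own illustration, not from the source):

```python
def offer_statement(price: str, benefit: str, time: str) -> str:
    """Render a preselling offer using the Price + Benefit + Time formula."""
    return f"For {price}, I will {benefit} in {time}."

# Hypothetical offer for a small bookkeeping service:
print(offer_statement("UGX 150,000", "set up your shop's mobile-money bookkeeping", "3 days"))
```

Keeping the offer to one sentence forces a concrete price, a concrete benefit, and a deadline, which is what makes the presell testable.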
Step 6: Document and Track Assumptions
Use the Assumptions Tracking Template (Alam):
- Classify each assumption as Minor / Major / Critical
- Assign owner and due date
- Track status: New → In Progress → Validated / Disproved
Calculate the Risk Score: (Minor × 1) + (Major × 5) + (Critical × 25). Target: below 100.
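The Risk Score is simple weighted arithmetic, so it can be sketched directly (the assumption counts below are hypothetical):

```python
def risk_score(minor: int, major: int, critical: int) -> int:
    """Risk Score = (Minor x 1) + (Major x 5) + (Critical x 25)."""
    return minor * 1 + major * 5 + critical * 25

def within_target(score: int) -> bool:
    """The stated target is a score below 100."""
    return score < 100

score = risk_score(minor=10, major=8, critical=2)  # 10 + 40 + 50
print(score, within_target(score))  # prints "100 False"
```

Note that the weighting makes critical assumptions dominate: a single unvalidated critical assumption costs as much as 25 minor ones.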
Step 7: Test the Solution (MVP)
Follow the MVP Evolution Model (Cooper & Vlaskovits, 2010):
| Stage | Interaction | Objective | Currency |
|---|---|---|---|
| MVP 1 | Landing page / concept | Test problem resonance | Attention |
| MVP 2 | Demo / prototype | Test solution approach | Commitment |
| MVP 3 | Working product | Test willingness to pay | Money |
Evaluate each capability using the BFCE framework (Alam): Better (quality)? Faster (efficiency)? Cheaper (cost)? Easier (experience)?
Step 8: Measure Product-Market Fit
Three-criteria test (Cooper & Vlaskovits, 2010):
- Customer willing to pay
- Cost of acquisition < revenue per customer
- Sufficient evidence market is large enough
Sean Ellis 40% Rule: if at least 40% of users say they'd be "very disappointed" without the product, you have product-market fit.
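The 40% rule reduces to counting one answer category in survey responses. A minimal sketch (the survey data below is hypothetical):

```python
def sean_ellis_test(responses: list[str]) -> tuple[float, bool]:
    """Return the 'very disappointed' share and whether it meets the 40% bar."""
    if not responses:
        return 0.0, False
    share = responses.count("very disappointed") / len(responses)
    return share, share >= 0.40

# Hypothetical survey of 20 users: 9 would be very disappointed without the product.
share, has_pmf = sean_ellis_test(["very disappointed"] * 9 + ["somewhat disappointed"] * 11)
print(f"{share:.0%} -> PMF: {has_pmf}")  # prints "45% -> PMF: True"
```

In practice the result is only meaningful if the respondents are actual users, not merely interested prospects.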
Step 9: Pivot or Proceed
Apply the three-question test (Blank & Dorf, 2012):
- **Can it scale?** Does $1 in acquisition produce more than $1 in revenue?
- **Is there a repeatable sales roadmap?** Can others replicate the sales process?
- **Is the funnel predictable?** Can you forecast conversion at each stage?
If any answer is no, pivot (change one or more Business Model Canvas boxes) and return to Step 4. See references/customer-development-process.md for pivot methodology.
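The three-question test is an all-or-nothing gate, which can be expressed as a small decision function (the parameter names are my own illustration):

```python
def pivot_or_proceed(revenue_per_acquisition_dollar: float,
                     repeatable_sales_roadmap: bool,
                     predictable_funnel: bool) -> str:
    """Proceed only if all three of Blank & Dorf's questions pass; otherwise pivot."""
    can_scale = revenue_per_acquisition_dollar > 1.0  # $1 of acquisition yields > $1 of revenue
    if can_scale and repeatable_sales_roadmap and predictable_funnel:
        return "proceed"
    return "pivot"

print(pivot_or_proceed(1.40, True, True))  # prints "proceed"
print(pivot_or_proceed(0.85, True, True))  # prints "pivot"
```

The point of the gate shape is that a strong answer on one question cannot compensate for a failing answer on another.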
Quick Validation Checklist
Before writing the business plan, the entrepreneur should have worked through Steps 1-9 and validated the assumptions that matter most: that the problem is real and painful, that earlyvangelists exist, and that customers will actually pay.
Mode B: Post-Plan Claim Auditing
Audit the market-facing sections of the business plan (sections 04-07) to ensure claims are defensible and data-backed.
What to Validate
1. Market Size Validation
- Is TAM calculated using a credible methodology (bottom-up preferred)?
- Is SAM a logical subset of TAM with clear narrowing criteria?
- Is SOM realistic (typically 1-5% of SAM for startups)?
- Are market size sources cited and current (within 2 years)?
- Does the bottom-up calculation align with the top-down?
- What is the market type: existing, new, re-segmented, or clone? (Blank & Dorf, 2012)
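A bottom-up sizing cross-check can be sketched as follows; all figures here are hypothetical placeholders, and the 20% narrowing factor is purely illustrative:

```python
def bottom_up_market_size(buyers: int, annual_spend: float) -> float:
    """Bottom-up sizing: count of plausible buying units x annual spend per unit."""
    return buyers * annual_spend

tam = bottom_up_market_size(buyers=500_000, annual_spend=120.0)  # every plausible buyer
sam = tam * 0.20   # narrowing criteria, e.g. the reachable urban segment
som = sam * 0.03   # obtainable share; for startups typically 1-5% of SAM

# Sanity checks an auditor can apply to the plan's numbers:
assert sam <= tam
assert 0.01 * sam <= som <= 0.05 * sam
print(tam, sam, som)
```

If the plan's SOM falls outside the 1-5% band, or its bottom-up TAM diverges wildly from the cited top-down figure, the claim should be flagged for evidence.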
2. Growth Rate Validation
- Are growth projections supported by historical data?
- Is the cited CAGR from a reputable source?
- Are growth assumptions consistent with the market type? (New markets take 3-7 years; existing markets grow incrementally)
- Is the business growing faster than the market? If so, why?
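A cited CAGR can be cross-checked against historical data points with the standard compound-growth formula (the market figures below are hypothetical):

```python
def implied_cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate implied by two historical data points."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Hypothetical: a market that grew from $80m to $120m over four years.
print(f"implied CAGR: {implied_cagr(80.0, 120.0, 4):.1%}")  # prints "implied CAGR: 10.7%"
```

If the implied rate and the cited rate disagree materially, either the source or the historical data in the plan needs revisiting.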
3. Customer Assumption Validation
- Are customer personas based on research or assumptions?
- Were earlyvangelists identified and interviewed?
- Is the CAC estimate grounded in comparable data?
- Is the CLV calculation realistic given churn assumptions?
- Is the CLV:CAC ratio defensible (>3:1)?
- Has the Problem Recognition Scale been assessed?
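The CLV:CAC check can be made concrete with one common simplification of CLV (margin-adjusted monthly revenue times average lifetime, 1/churn); the formula choice and the figures are my own illustration, not prescribed by the source:

```python
def simple_clv(monthly_revenue: float, gross_margin: float, monthly_churn: float) -> float:
    """Simplified CLV: margin-adjusted monthly revenue x average lifetime in months (1/churn)."""
    return monthly_revenue * gross_margin / monthly_churn

def ratio_defensible(clv: float, cac: float, threshold: float = 3.0) -> bool:
    """The plan's CLV:CAC claim should clear the 3:1 bar."""
    return clv / cac > threshold

clv = simple_clv(monthly_revenue=30.0, gross_margin=0.70, monthly_churn=0.05)  # hypothetical
print(clv, ratio_defensible(clv, cac=100.0))
```

The audit question is whether the churn assumption feeding the CLV is evidenced; an optimistic churn figure can make almost any CAC look defensible.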
4. Competitive Positioning Validation
- Are all relevant competitors identified (direct, indirect, substitutes)?
- Is the market type acknowledged, and does the competitive strategy match?
- Are competitive advantages genuinely sustainable?
- Are competitor weaknesses based on evidence, not wishful thinking?
- Has cost-of-entry been assessed? (74%+ = monopoly, 41%+ = leader, 26%+ = unstable, <26% = open; Blank & Dorf, 2012)
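The share thresholds above form a simple classification ladder; a sketch, assuming the percentage refers to the incumbent leader's market share:

```python
def market_position(leader_share_pct: float) -> str:
    """Classify competitive intensity from the leader's share (Blank & Dorf thresholds)."""
    if leader_share_pct >= 74:
        return "monopoly"
    if leader_share_pct >= 41:
        return "leader"
    if leader_share_pct >= 26:
        return "unstable"
    return "open"

print(market_position(80))  # prints "monopoly"
print(market_position(18))  # prints "open"
```

The classification matters for the audit because a plan that proposes a head-on attack in a monopoly-class market should be flagged even if its other numbers hold up.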
5. Pricing Validation
- Is pricing consistent with the value proposition?
- Was pricing tested with real customers (value-based approach)?
- How does pricing compare to competitors and workarounds?
- Does the pricing model support the revenue projections?
- Have the Six Revenue Dials been considered? (Kagan, 2024)
6. Validation Evidence Check (new)
- Did the plan authors conduct Customer Development activities?
- Is there evidence of customer interviews, surveys, or preselling?
- Are assumptions documented with validation status?
- Is the Risk Score reported and acceptable?
- Has product-market fit been measured?
Claim-by-Claim Output Format
For each claim reviewed:
Claim: [The specific assertion]
Source: [Where it appears in the plan]
Evidence: [Supporting data found]
Validation Method Used: [Interview / Preselling / Survey / Secondary research / None]
Status: VALIDATED / NEEDS EVIDENCE / UNSUPPORTED / CONTRADICTED
Action: [What to do: cite a source, conduct research, or revise the claim]
Validation Summary Dashboard
| Area | Claims | Validated | Needs Evidence | Unsupported | Critical Issues |
|---|---|---|---|---|---|
| Market size | X | X | X | X | [List] |
| Growth rates | X | X | X | X | [List] |
| Customer data | X | X | X | X | [List] |
| Competition | X | X | X | X | [List] |
| Pricing | X | X | X | X | [List] |
| Validation evidence | X | X | X | X | [List] |
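Each dashboard row can be generated mechanically from the claim-by-claim records. A sketch, assuming a hypothetical record shape with `claim` and `status` keys and treating contradicted claims as the critical issues:

```python
from collections import Counter

def dashboard_row(claims: list[dict]) -> dict:
    """Aggregate one audit area's claim records into the dashboard columns."""
    counts = Counter(c["status"] for c in claims)
    return {
        "claims": len(claims),
        "validated": counts["VALIDATED"],
        "needs_evidence": counts["NEEDS EVIDENCE"],
        "unsupported": counts["UNSUPPORTED"] + counts["CONTRADICTED"],
        "critical_issues": [c["claim"] for c in claims if c["status"] == "CONTRADICTED"],
    }

row = dashboard_row([
    {"claim": "TAM is $2bn", "status": "VALIDATED"},
    {"claim": "CAC of $4", "status": "NEEDS EVIDENCE"},
    {"claim": "No direct competitors", "status": "CONTRADICTED"},
])
print(row)
```

Deriving the dashboard from the per-claim records keeps the summary honest: the counts cannot drift from the underlying audit.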
Generation Process
Mode A (Pre-Plan)
- Classify the venture's problem recognition level
- Identify earlyvangelists and map stakeholders
- Design and conduct empathy-based research
- Apply rapid validation (Golden Rule: 3 customers/48 hours)
- Document assumptions with impact classifications
- Build and test MVP through the evolution model
- Measure product-market fit
- Decide: pivot or proceed
- Compile validated findings as input for business plan writing
Mode B (Post-Plan)
- Review sections 04-07 and extract all factual claims
- Categorise each claim (market size, growth, customer, competitive, pricing, validation evidence)
- Assess evidence for each claim, including Customer Development evidence
- Flag unsupported or contradicted claims
- Suggest validation methods for gaps (preselling, interviews, pilot tests, marketplace tests)
- Produce validation summary dashboard
Quality Criteria
- Every factual claim is assessed, not just the obvious ones
- Validation is objective; it does not rubber-stamp weak claims
- The 9 Deadly Sins are actively checked for (Blank & Dorf, 2012)
- Premature scaling warnings are flagged aggressively
- Suggested validation methods are practical and affordable for the Ugandan context
- Critical issues are highlighted with urgency
- Risk Score trajectory is tracked if assumptions data is available
References
- references/customer-development-process.md: Blank/Dorf's 4-step Customer Development, 14 rules, 9 Deadly Sins, pivot methodology, Business Model Canvas as scorecard
- references/customer-discovery-steps.md: Cooper/Vlaskovits' 8-step Customer Discovery, C-P-S hypotheses, Funnel Matrix, Value Path, Business Ecosystem Mapping, outreach templates, MVP Evolution Model, product-market fit measurement
- references/rapid-validation-methods.md: Kagan's Golden Rule, LOT framework, Dream Ten List, Price+Benefit+Time formula, Rejection Script, validation methods, One-Minute Business Model, Six Revenue Dials, Content Circle Framework
- references/empathy-validation-tools.md: Alam's Transform3+1, stakeholder mapping, empathy research, persona template, journey mapping, BFCE framework, user testing methodology, Assumptions Tracking, Risk Score formula, elevator pitch templates
- references/mckinsey-problem-solving.md: McKinsey's MECE principle (Mutually Exclusive, Collectively Exhaustive) with worked examples; issue tree construction and branching rules; hypothesis-driven analysis (Initial Hypothesis method, three-step generation, insurance leakage anecdote); 80/20 rule as diagnostic jump-start; key drivers framework; fact-based analysis; Forces at Work four-category environmental scan (suppliers/customers/competitors/substitutes); elevator test; presentation structure (one message per chart, prewiring); 10 common analysis mistakes. Source: Rasiel (McGraw-Hill). Read when structuring any analytical section (market analysis, competitive analysis, risk), when building issue trees, or when auditing claims for MECE compliance and fact-based support.
- references/business-analysis-techniques-cadle.md: all 72 BA tools grouped by stage (strategy, investigation, stakeholder analysis, process modelling, options evaluation, change management), a business plan application table mapping each category to plan sections, and Uganda/EA contextualisation notes. Source: Cadle, Paul & Turner (BCS, 2010). Read when structuring a market investigation, designing stakeholder analysis, building process models for the operations plan, evaluating options with CBA/NPV, or auditing a plan's analytical rigour against a structured toolkit.
for all 72 BA tools grouped by stage (strategy, investigation, stakeholder analysis, process modelling, options evaluation, change management), a business plan application table mapping each category to plan sections, and Uganda/EA contextualisation notes Source: Cadle, Paul & Turner (BCS, 2010). Read when structuring a market investigation, designing stakeholder analysis, building process models for the operations plan, evaluating options with CBA/NPV, or auditing a plan's analytical rigour against a structured toolkit.