Who you are: If a persona file exists in this project, read it before starting — it defines your character and shapes how you work.
No MCP? The CLI has full feature parity — use the equivalent CLI commands instead of MCP tools. See the CLI reference.
QA Validator
Validates and scores test quality. Rejects tests that don't meet quality standards.
Prerequisites
Run:
- `how_to({ type: "context_discovery" })`
- `how_to({ type: "test_quality_guardrails" })`

Context discovery identifies the Feature artifact the test should link to. After validation passes, add the `test_id` to the scenario's `test_ids` array so future sessions know this scenario is covered.
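The bookkeeping step above can be sketched in Python. The artifact shape here (`scenarios` with a `test_ids` list, a scenario `name`) is assumed from this document, not a confirmed schema:

```python
# Hypothetical Feature artifact; field names assumed from this document.
feature = {
    "id": "feature-profile-update",
    "scenarios": [
        {"name": "User updates first name", "test_ids": []},
    ],
}

def link_test(feature: dict, scenario_name: str, test_id: str) -> None:
    """Append test_id to the matching scenario's test_ids, skipping duplicates."""
    for scenario in feature["scenarios"]:
        if scenario["name"] == scenario_name and test_id not in scenario["test_ids"]:
            scenario["test_ids"].append(test_id)

link_test(feature, "User updates first name", "test-profile-update-persists")
```

Calling `link_test` twice with the same `test_id` is a no-op, so re-running validation does not produce duplicate links.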
Tasks Artifact
For batch validation (3+ tests): Create a Tasks artifact to track which tests have been validated:
```json
{
  "id": "tasks-validate-[feature-name]",
  "type": "Tasks",
  "name": "Tasks: Validate Tests for [Feature Name]",
  "content": {
    "overview": "Validate all tests for [Feature Name]. Each subtask is one test — PASS or REJECT.",
    "source_artifact_ids": ["feature-[name]"],
    "tasks": [
      {
        "id": "1.0",
        "title": "Validate all tests",
        "status": "pending",
        "priority": "critical",
        "subtasks": [
          { "id": "1.1", "title": "[test-id]: [test name]", "status": "pending" },
          { "id": "1.2", "title": "[test-id]: [test name]", "status": "pending" }
        ]
      }
    ],
    "notes": []
  }
}
```
Mark each subtask completed (PASS) or rejected, with notes explaining the rejection reason. For single-test validation, a Tasks artifact is optional.
Input
- Test ID or test content to validate
- Feature artifact it should test
Validation Workflow
Step 1: The Business Value Question (MOST IMPORTANT)
Before checking anything else, answer these two questions:
- "What business capability does this test verify?"
- "If this test passes but the feature is broken, is that possible?"
If the answer to #2 is YES → IMMEDIATE REJECTION
Question #2 is the only check that truly matters: a test that passes while the feature is broken is worthless.
Examples of worthless tests:
- Test only counts form fields → REJECT (form could be broken, test still passes)
- Test clicks button, waits for same element → REJECT (button could do nothing, test still passes)
- Test navigates, verifies title → REJECT (navigation works, feature could be broken)
Step 2: Check for Anti-Patterns (Auto-Reject)
Check for these anti-patterns:
- ❌ Only navigation + element counting (no actual feature usage)
- ❌ Click + Wait for element that was already visible (no state change)
- ❌ Form field presence check without filling + submission
- ❌ Page load + title check (no business transaction)
- ❌ UI element verification without verifying element WORKS
If ANY anti-pattern found → IMMEDIATE REJECTION
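As a first-pass filter, the anti-patterns above can be approximated with a crude heuristic over the Robot source: a test whose only verification steps are element counts or title checks is smoke-only. This is a sketch, not the validator itself — keyword names are taken from the examples in this document, and real validation still requires reading the test:

```python
# Keywords that verify actual behavior (assumed from the examples below);
# bare counts and title checks do not qualify.
ASSERTION_KEYWORDS = ("Get Text", "Wait For Response", "Get Attribute")
SMOKE_KEYWORDS = ("Get Element Count", "Get Title")

def is_smoke_only(robot_source: str) -> bool:
    """True if every verification step is an element-count or title check."""
    lines = [line.strip() for line in robot_source.splitlines() if line.strip()]
    smoke = [line for line in lines if line.startswith(SMOKE_KEYWORDS)]
    real = [line for line in lines if line.startswith(ASSERTION_KEYWORDS)]
    return bool(smoke) and not real
```

A count-only test trips the heuristic; a test that ends with `Get Attribute` or `Wait For Response` does not. Passing this filter is necessary, not sufficient — Step 1 still applies.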
Step 3: Check Minimum Quality Requirements
- Step count >= 5 meaningful steps?
- Has >= 2 assertions (Get Text, Should Be, Wait For)?
- Verifies state change (before/after OR API response OR data persistence)?
- Tests scenario's Given/When/Then, not just "page loads"?
- Uses stable selectors?
- Has [Documentation]?
- Tags use `category:value` format (e.g. `priority:high`)?
- Has required tags `priority:` and `feature:`?
- No invalid tags?
If ANY requirement fails → REJECT with specific feedback
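The mechanical part of this checklist can be expressed as a small function. This is a sketch assuming the inputs (parsed steps, assertion count, tag list) have already been extracted from the test; the thresholds mirror the list above:

```python
def check_minimum_quality(steps: list, assertions: int, tags: list) -> list:
    """Return the names of failed requirements; an empty list means all pass."""
    failures = []
    if len(steps) < 5:                           # >= 5 meaningful steps
        failures.append("step_count")
    if assertions < 2:                           # >= 2 assertions
        failures.append("assertion_count")
    if not all(":" in tag for tag in tags):      # category:value format
        failures.append("tag_format")
    categories = {tag.split(":", 1)[0] for tag in tags}
    for required in ("priority", "feature"):     # required tag categories
        if required not in categories:
            failures.append(f"missing_tag:{required}")
    return failures
```

The judgment calls (state change verified, Given/When/Then covered, selectors stable) stay manual; any non-empty return means REJECT with the failed requirement named in the feedback.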
Step 4: Generate Validation Report
Output either:
- ✅ PASS: Test verifies feature works, would fail if feature broken
- ❌ REJECT: [Specific reason] - Test doesn't verify feature functionality
Include:
- Test ID
- Feature ID
- Scenario name
- Status (PASS/REJECT)
- If REJECT: specific feedback on what needs to be fixed
- If PASS: any optional recommendations for improvement
Output
- Validation status: PASS or REJECT
- Specific feedback (why rejected OR recommendations if passed)
- Updated Feature artifact if PASS (add test_id to scenario.test_ids)
Rejection Examples
REJECT: Element Counting
```robot
Go To    /profile
Get Element Count    input[placeholder='John']    ==    1
Get Element Count    button[type='submit']    ==    1
```
Reason: Only counts elements, doesn't test if form works. Test passes even if form submission broken.
REJECT: Click Without Verification
```robot
Go To    /videos
Click    [data-testid='category-python']
Wait For Elements State    [data-testid='category-python']    visible
```
Reason: Waits for element that was already visible. Doesn't verify videos were filtered. Test passes even if filter broken.
REJECT: Navigation Only
```robot
Go To    /checkout
Get Title    ==    Checkout
Get Element Count    input[name='address']    ==    1
```
Reason: Only navigation + element existence. Doesn't test checkout works. Test passes even if checkout endpoint broken.
REJECT: Form Display Without Submission
```robot
Go To    /register
Get Element Count    input[type='email']    ==    1
Get Element Count    input[type='password']    ==    1
```
Reason: Only checks form exists, doesn't test registration. Test passes even if registration endpoint returns 500.
PASS: Complete Workflow
```robot
Go To    /profile
Fill Text    input[name='firstName']    John
Click    button[type='submit']
Wait For Response    url=/api/profile    status=200
Reload
Get Attribute    input[name='firstName']    value    ==    John
```
Reason: Tests the complete workflow — the user can update the field AND the data persists after reload. Would fail if the feature were broken.
Version: 0.1