Performance and load testing patterns — k6 load tests, Locust stress tests, pytest execution optimization (xdist parallel, plugins), test type classification, and performance benchmarking. Use when writing load tests, optimizing test execution speed, or setting up pytest infrastructure.
```bash
npx skill4agent add yonatangross/orchestkit testing-perf
```

| Area | File | Purpose |
|---|---|---|
| k6 Load Testing | | Thresholds, stages, custom metrics, CI integration |
| Locust Testing | | Python load tests, task weighting, auth flows |
| Test Types | | Load, stress, spike, soak test patterns |
| Execution | | Coverage reporting, parallel execution, failure analysis |
| Pytest Markers | | Custom markers, xdist parallel, worker isolation |
| Pytest Plugins | | Factory fixtures, plugin hooks, anti-patterns |
| k6 Patterns | | Staged ramp-up, authenticated requests, test types |
| xdist Parallel | | Distribution modes, worker isolation, CI config |
| Custom Plugins | | conftest plugins, installable plugins, hook reference |
| Perf Checklist | | Planning, setup, metrics, load patterns, analysis |
| Pytest Checklist | | Config, markers, parallel, fixtures, CI/CD |
| Test Template | | Full test case documentation template |
```javascript
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  stages: [
    { duration: '30s', target: 20 }, // Ramp up
    { duration: '1m', target: 20 },  // Steady state
    { duration: '30s', target: 0 },  // Ramp down
  ],
  thresholds: {
    http_req_duration: ['p(95)<500'], // 95th percentile under 500ms
    http_req_failed: ['rate<0.01'],   // Less than 1% error rate
  },
};

export default function () {
  const res = http.get('http://localhost:8000/api/health');
  check(res, {
    'status is 200': (r) => r.status === 200,
    'response time < 200ms': (r) => r.timings.duration < 200,
  });
  sleep(1);
}
```

```bash
k6 run --out json=results.json tests/load/api.js
```

| Type | Duration | VUs | Purpose | When to Use |
|---|---|---|---|---|
| Load | 5-10 min | Expected traffic | Validate normal conditions | Every release |
| Stress | 10-20 min | 2-3x expected | Find breaking point | Pre-launch |
| Spike | 5 min | Sudden 10x surge | Test auto-scaling | Before events |
| Soak | 4-12 hours | Normal load | Detect memory leaks | Weekly/nightly |
```toml
# pyproject.toml
[tool.pytest.ini_options]
addopts = ["-n", "auto", "--dist", "loadscope"]
markers = [
    "slow: marks tests as slow",
    "smoke: critical path tests for CI/CD",
]
```

```bash
# Run with parallel workers and coverage
pytest -n auto --dist loadscope --cov=app --cov-report=term-missing --maxfail=3

# CI fast path — skip slow tests
pytest -m "not slow" -n auto

# Debug mode — single worker, stop on first failure
pytest -n 0 -x --tb=long
```

```python
import pytest
from sqlalchemy import create_engine


@pytest.fixture(scope="session")
def db_engine(worker_id):
    # pytest-xdist injects worker_id ("gw0", "gw1", ...); "master" means no parallelism.
    db_name = f"test_db_{worker_id}" if worker_id != "master" else "test_db"
    engine = create_engine(f"postgresql://localhost/{db_name}")
    yield engine
    engine.dispose()
```

| Metric | Target | Tool |
|---|---|---|
| p95 response time | < 500ms | k6 |
| p99 response time | < 1000ms | k6 |
| Error rate | < 1% | k6 / Locust |
| Business logic coverage | 90% | pytest-cov |
| Critical path coverage | 100% | pytest-cov |
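The p95/p99 targets above can also be checked offline from the `results.json` produced by `k6 run --out json=...`, which emits one JSON object per line with metric samples tagged `"type": "Point"`. A minimal sketch (the nearest-rank percentile here is an assumption; k6's own threshold engine may interpolate differently):

```python
import json


def percentile(values, pct):
    """Nearest-rank percentile of a sample (pct in 0..100)."""
    ordered = sorted(values)
    # Clamp the rank into the valid index range.
    k = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[k]


def load_durations(path):
    """Collect http_req_duration samples (ms) from k6's line-delimited JSON output."""
    durations = []
    with open(path) as f:
        for line in f:
            obj = json.loads(line)
            if obj.get("type") == "Point" and obj.get("metric") == "http_req_duration":
                durations.append(obj["data"]["value"])
    return durations
```

Usage: `p95 = percentile(load_durations("results.json"), 95)`, then compare against the 500 ms target from the table.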
| Scenario | Recommendation |
|---|---|
| JavaScript/TypeScript team | k6 for load testing |
| Python team | Locust for load testing |
| Need CI thresholds | k6 (built-in threshold support) |
| Need distributed testing | Locust (built-in distributed mode) |
| Slow test suite | pytest-xdist with |
| Flaky parallel tests | |
| DB-heavy tests | Worker-isolated databases with |
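For DB-heavy suites, the factory-fixture pattern mentioned in the plugins table pairs well with worker isolation: the factory records everything a test creates so teardown can remove it. A minimal sketch with hypothetical in-memory "items" standing in for real DB rows:

```python
import pytest


def item_factory(created):
    """Return a factory that records every object it creates for later cleanup."""
    def make(name="widget", price=100):
        item = {"name": name, "price": price}
        created.append(item)
        return item
    return make


@pytest.fixture
def make_item():
    created = []
    yield item_factory(created)
    # Teardown runs after each test: remove whatever the test created
    # (in a real suite, delete the corresponding DB rows here).
    created.clear()
```

Tests then call `make_item(name="gadget")` as many times as needed, and cleanup happens automatically per test, keeping parallel workers from leaking state into each other.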
Related skills: `ork:testing-unit`, `ork:testing-e2e`, `ork:performance`