---
name: pr-review
description: Perform a thorough quality review of a pull request or feature branch before merging. Use this skill whenever the user asks to review a PR, check if code is production-ready, assess quality, verify docs are updated, or asks "is this ready to merge?", "review this PR", "check quality", "is this production ready?", or similar. Also use when reviewing your own work before submitting.
---

# PR Quality Review
Perform a comprehensive quality review of a pull request or feature branch. This skill covers production readiness, code quality, UX, documentation, tests, and safety.
The review is structured as a checklist across seven dimensions. For each dimension, investigate the actual code and report specific findings -- not just "looks good" but concrete observations with file paths and line numbers.
## Getting Started

First, understand the scope of the change:

```bash
# What branch are we on, what's the base?
git branch --show-current
git log main..HEAD --oneline

# What files changed?
git diff main...HEAD --stat

# Full diff for review
git diff main...HEAD
```

If reviewing a remote PR, fetch it first:

```bash
gh pr view <number> --json title,body,files
gh pr diff <number>
```

Read the PR description and all commits to understand the intent before diving into code.
## Review Dimensions

Work through each dimension below. For each one, report a status:

- **Pass** -- meets the bar, no issues
- **Needs work** -- specific issues found (list them)
- **N/A** -- not applicable to this change
### 1. Production Readiness

Does this code behave correctly and handle failure gracefully?

- Error handling: Are all errors checked? Are they wrapped with context (`fmt.Errorf("context: %w", err)`)? Do they surface actionable messages to users?
- Edge cases: Empty inputs, nil values, missing config, network failures, API rate limits, large datasets, pagination boundaries
- Safety checks: All mutating commands (create, edit, apply, delete, update) must include safety checks after `LoadConfig()` and before client operations. Pattern:

  ```go
  checker, err := NewSafetyChecker(cfg)
  if err := checker.CheckError(safety.OperationXXX, safety.OwnershipUnknown); err != nil {
      return err
  }
  ```

  Verify the correct operation type. Skip only in dry-run paths.
- No stdout in library code: `pkg/` must return data, not print. Only `cmd/` handles output.
- No hardcoded secrets or customer data: No real names, env IDs, tokens, or emails in code or tests. Use `@example.invalid` for emails (RFC 2606).
### 2. Code Quality

Is the code clean, idiomatic, and maintainable?

- Go idioms: Follows Effective Go and Go Code Review Comments
- Naming: Descriptive names, Go conventions (camelCase unexported, PascalCase exported), `-er` suffix for interfaces
- File size: Files should be under 500 lines. Large files should be split.
- Imports: Standard library first, then third-party, then internal (`github.com/dynatrace-oss/dtctl`)
- Duplication: Look for copy-pasted code that should be extracted into helpers
- Comments: Exported functions/types documented. Comments explain "why", not "what".
- Consistent patterns: New code should follow existing patterns in the codebase. Check similar resources in `pkg/resources/` for reference implementations.
Run the linter to catch issues the eye might miss:

```bash
make lint-strict
```

### 3. User Experience
Does this feel right from the user's perspective?

- Command naming: Follows the `dtctl <verb> <resource>` pattern. No custom query flags -- use DQL passthrough.
- Output formatting: Table output is readable, columns make sense, no misalignment. JSON/YAML output is clean.
- Error messages: Clear, actionable, suggest next steps. In agent mode (`-A`), errors are structured JSON with machine-readable codes.
- Interactive behavior: Name resolution and disambiguation prompts work. `--plain` disables interactive behavior.
- Help text: Command has a `Short` description and a `Long` field. Parent verb commands have `Example` entries.
- Aliases: Resource has sensible aliases (e.g., `wf` for workflow, `dash` for dashboard).
- Agent mode: Commands support the `--agent` envelope with contextual suggestions. Test with `-A -o json`.
- Color control: Respects `NO_COLOR`, `FORCE_COLOR`, `--plain`, and TTY detection.
Try running the actual commands to see how they feel:

```bash
# Does the help text look good?
dtctl <command> --help

# Does table output look right?
dtctl <command> --plain

# Does agent mode work?
dtctl <command> -A
```

### 4. Test Coverage
Are the changes well-tested?

- Unit tests: New functions have tests. Table-driven tests preferred.
- Coverage targets: 70% minimum overall, 80% for new packages, 90% for critical packages (`pkg/client`, `pkg/config`).
- Edge case tests: Not just happy paths -- test error conditions, empty inputs, boundary values.
- Mock server guards: Paginated mock servers must reject invalid parameter combinations (e.g., `nextPageKey`/`page-key` + `page-size`). Settings API mocks must also reject `schemaIds` with `scopes`.
- Golden tests: If output formatting changed or a new resource was added, golden files must be updated. Golden tests use real production structs from `pkg/resources/*` -- never test-only duplicates. Check:

  ```bash
  go test ./pkg/output/ -run TestGolden
  ```

- E2E tests: Integration scenarios in `test/e2e/` for new resources or complex workflows.
```bash
# Run full suite
go test ./...

# Check coverage
make test-coverage
```

### 5. Documentation
Is the change properly documented for users and contributors?

Always required:

- CHANGELOG.md: Entry under `[Unreleased]` following Keep a Changelog format. Bold feature name with em dash and description.
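A sketch of what such an entry might look like (the feature name and wording are invented for illustration):

```markdown
## [Unreleased]

### Added

- **Workflow export** — export workflow definitions to a file for versioning and reuse.
```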
Required for new features:
- README.md: Updated if the feature is user-facing and significant (new resource type, new command category)
- docs/QUICK_START.md: Usage examples for major new features
- docs/dev/IMPLEMENTATION_STATUS.md: Feature matrix rows updated
- docs/dev/API_DESIGN.md: Design patterns documented if introducing new conventions
- docs/TOKEN_SCOPES.md: New scopes documented if the feature requires additional API permissions
Required for new resources:

- Resource-specific doc page in `docs/` or `docs/site/_docs/`
- Command reference updated in `docs/site/_docs/command-reference.md`

Required for new AI agent support:

- `README.md`, `CHANGELOG.md`, `docs/QUICK_START.md`, `docs/dev/API_DESIGN.md`, `docs/dev/IMPLEMENTATION_STATUS.md` (all five)
### 6. GitHub Pages

If the change adds a new user-facing feature, is the documentation site updated?

The site lives in `docs/site/` and deploys via GitHub Actions on pushes to main that touch `docs/site/**`. Check:

- New doc page: Does the feature need a page in `docs/site/_docs/`? Use YAML frontmatter with `layout: docs` and `title`.
- Navigation: Is the new page added to `docs/site/_includes/docs-nav.html` in the correct section (Getting Started / Resources / Reference)?
- Landing page: Does `docs/site/index.md` need updating? (e.g., new resource in the feature table, new capability mentioned)
- Existing pages: Are related pages updated to mention the new feature? (e.g., a new output format should appear on the output-formats page)
- Links: All links work, relative paths are correct, no broken references.
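For the frontmatter check above, a minimal page header might look like the following (the title is a placeholder):

```markdown
---
layout: docs
title: Workflows
---

Page content starts here.
```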
### 7. PR Description

Is the PR itself well-described?

- Title: Clear, follows conventional commit style (`feat: ...`, `fix: ...`)
- Summary: Explains what changed and why (not just what files were touched)
- Related issues: References issues with `Closes #NNN` or `Fixes #NNN`
- Breaking changes: Called out explicitly if any
- Testing section: Describes how the change was tested
- UX examples: Before/after CLI output for user-facing changes
## Review Output
After completing the review, provide a summary in this format:
```
## PR Review: <title>

| Dimension | Status | Notes |
|-----------|--------|-------|
| Production readiness | Pass/Needs work | ... |
| Code quality | Pass/Needs work | ... |
| User experience | Pass/Needs work | ... |
| Test coverage | Pass/Needs work | ... |
| Documentation | Pass/Needs work | ... |
| GitHub Pages | Pass/Needs work/N/A | ... |
| PR description | Pass/Needs work | ... |

### Issues Found

1. **[Dimension]** file:line -- description of issue
2. ...

### Suggestions (non-blocking)

1. ...

### Verdict

Ready to merge / Needs revisions (list blockers)
```

Be direct and specific. Reference exact file paths and line numbers. Distinguish between blocking issues (must fix) and suggestions (nice to have). Don't pad the review with praise -- focus on what needs attention.