# EHR Design Review
## Overview
Use this skill to inspect healthcare or EHR software screens, components, mockups, or code and produce a structured report of design issues mapped to established healthcare usability and safety standards. The review covers patient identity, layout, color, typography, data display, numeric formatting, units, dates, alerts, medication safety, forms, accessibility, workflow, audit logging, error prevention, terminology, interoperability, internationalization, security, and documentation.
## Operating Rules
- Never change code, designs, configurations, or documentation.
- Do not present the output as a formal certification or regulatory determination.
- Bias toward observable evidence from the artifacts under review and clearly separate:
  - confirmed violations from the code, markup, design, or config
  - likely inferences from surrounding implementation
  - areas that require runtime testing, user research, or policy validation
- When a guideline cannot be evaluated from the provided artifacts, mark it as not assessable rather than as passing or failing.
## Workflow
- Confirm the scope: which screens, components, modules, or code paths to review.
- Load references/style-guide.md to access the full design criteria.
- Walk through each review category (see categories below) against the artifacts in scope.
- Assign severity and confidence for each finding.
- Produce a report only. Do not draft fixes, patches, or redesigns unless explicitly asked.
## Review Categories
Each category maps to a section of the style guide reference.
- Patient Context and Identity — persistent header, required identifiers, patient-switch confirmation, environment indicators
- Layout and Information Hierarchy — summary order, navigation depth, click efficiency, cross-module consistency
- Color Standards — semantic color use, dual-coding (never color-only), WCAG contrast ratios
- Typography — font legibility, size minimums, avoidance of condensed or decorative fonts
- Data Tables and Clinical Data Display — alignment, column consistency, sorting, filtering, reference ranges, abnormal value marking
- Numeric Formatting — thousands separators, decimal precision, trailing zeros
- Units of Measure — units always displayed, UCUM preference, no bare numbers
- Date and Time Formatting — ISO 8601 storage, unambiguous display (DD Mon YYYY), 24-hour time
- Alerts and Clinical Decision Support — alert levels, fatigue prevention, override documentation, clear explanations
- Medication Safety — dangerous abbreviation avoidance, structured order display, trailing-zero prevention
- Forms and Data Entry — structured input preference, autocomplete, range display, immediate validation
- Accessibility — WCAG 2.1 AA compliance, keyboard navigation, screen reader support, focus indicators, no hover-only information
- Workflow Optimization — click reduction, persistent key data, minimal modals, quick patient navigation
- Audit Logging — user, timestamp, action, before/after data, location in audit records
- Error Prevention — proactive constraints, range warnings, input validation before submission
- Clinical Terminology Standards — SNOMED CT, LOINC, ICD-10 usage for coded concepts
- Interoperability and Data Exchange — HL7 FHIR resource alignment, structured data exchange
- Internationalization — multi-language support, locale-aware formatting, standardized internal representation
- Security and Privacy — RBAC, session timeouts, encryption, audit logs, HIPAA alignment
- Documentation and Help — contextual help, error explanations, training materials, workflow guides
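Several of the display categories above (numeric formatting, units of measure, date and time formatting, medication safety) reduce to mechanical rules a reviewer can check against rendered output. The sketch below illustrates what compliant output looks like; it is an illustrative example only, not part of the skill or the style guide, and the helper names are hypothetical.

```python
from datetime import datetime

def format_dose(value: float, unit: str) -> str:
    """Illustrative dose display per common medication-safety rules:
    always show a unit, never show a trailing zero after the decimal
    point (5 mg, not 5.0 mg), and keep a leading zero for values
    under 1 (0.5 mg, not .5 mg)."""
    text = f"{value:.10f}".rstrip("0").rstrip(".")  # drop trailing zeros and a dangling "."
    if text.startswith("."):
        text = "0" + text  # defensive: Python's formatting already emits a leading zero
    return f"{text} {unit}"

def format_timestamp(ts: datetime) -> str:
    """Unambiguous clinical timestamp: DD Mon YYYY with 24-hour time."""
    return ts.strftime("%d %b %Y %H:%M")

print(format_dose(5.0, "mg"))   # -> 5 mg   (never the hazardous "5.0 mg")
print(format_dose(0.5, "mg"))   # -> 0.5 mg (never the hazardous ".5 mg")
print(format_timestamp(datetime(2024, 3, 2, 14, 5)))  # -> 02 Mar 2024 14:05
```

A review finding against these categories would cite output such as "5.0 mg" or "03/02/2024" as the observed violation.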
## Constraints
- Review only — do not propose code changes unless the user explicitly requests remediation guidance.
- Stay within the scope described in the frontmatter.
- Surface patient-safety implications with the highest priority.
- Distinguish between must-fix safety issues and nice-to-have improvements.
- When artifacts are insufficient to evaluate a category, say so rather than guessing.
## Resources
- references/style-guide.md: full Healthcare Software Design Style Guide with criteria, examples, and source standards
- examples/example-report.md: example review report showing expected output shape, finding format, and coverage matrix
## Invocation Modes
### Standalone (default)
When invoked directly by a user or without the phrase "scoped review," operate normally: confirm scope interactively, load references, walk review categories, and produce the full report described in the Output Contract below.
### Scoped
When invoked with the phrase "scoped review" and a pre-determined list of file paths, operate in scoped mode:
- Input: a list of file paths to review. Scope is pre-determined — do not ask for confirmation.
- Behavior: skip interactive scope confirmation. Skip executive summary and coverage matrix generation. Review only the provided files against the review categories.
- Output: return a findings-only list. Each finding uses this format:
### [HF-{n}] {title}
- Severity: critical | major | minor | info
- Category: {category from the 20 review categories}
- File: {path}:{line}
- Detail: {what was observed}
- Guideline: {which standard or rule applies}
If no findings are discovered, return a single line: "No human-factors findings for the provided files."
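The finding template above can be expressed as a small formatter. This is a non-authoritative sketch of how a tool consuming scoped-mode output might render findings; the function and its parameters are hypothetical, not part of the skill.

```python
def render_finding(n: int, title: str, severity: str, category: str,
                   path: str, line: int, detail: str, guideline: str) -> str:
    """Render one scoped-mode finding in the HF-{n} format above."""
    # Severity must be one of the four levels the template allows.
    assert severity in {"critical", "major", "minor", "info"}
    return (
        f"### [HF-{n}] {title}\n"
        f"- Severity: {severity}\n"
        f"- Category: {category}\n"
        f"- File: {path}:{line}\n"
        f"- Detail: {detail}\n"
        f"- Guideline: {guideline}"
    )

print(render_finding(
    1, "Color-only abnormal flag", "major", "Color Standards",
    "results-panel.tsx", 42,
    "Abnormal lab values are indicated by red text alone",
    "Dual-coding: never convey meaning by color only",
))
```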
## Output Contract
When operating in standalone mode, return a review report with:
- Executive Summary: overall assessment, highest-risk findings, and scope of review
- Scope: artifacts reviewed, categories assessed, categories not assessable
- Findings Table with columns:
- ID
- Severity (critical / major / minor / info)
- Category (from the 20 review categories)
- Location (file, screen, component, or line reference)
- Finding (what was observed)
- Guideline (which standard or rule applies)
- Risk (patient-safety or usability impact)
- Confidence (confirmed / likely / unclear)
- Category Coverage Matrix: for each of the 20 categories — compliant, partial, non-compliant, or not assessable
- Positive Observations: areas where the design follows the guidelines well
- Open Questions: areas requiring runtime testing, user research, or additional artifacts
- Standards Basis: list of standards referenced in the review