code-analyzer

Code Analyzer Skill
Installation
No separate download: the skill runs the in-repo tool `.claude/tools/analysis/project-analyzer/analyzer.mjs`.
- Ensure Node.js (v18+) is installed: download it from nodejs.org, or use `winget install OpenJS.NodeJS.LTS` (Windows) or `brew install node` (macOS).
- From the project root, the script is invoked automatically; no extra install steps are needed.
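For a quick sanity check before relying on the analyzer, a minimal Node.js pre-flight might look like this (the v18 floor comes from the requirement above; the script itself is a sketch, not part of the skill):

```javascript
// Pre-flight check: confirm the running Node.js meets the v18+ requirement.
const major = Number(process.versions.node.split(".")[0]);

if (major < 18) {
  console.error(`Node.js v18+ required, found v${process.versions.node}`);
  process.exitCode = 1;
} else {
  console.log(`Node.js v${process.versions.node} OK`);
}
```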
Cheat Sheet & Best Practices
Metrics: Focus on cyclomatic complexity (decision paths), LOC, maintainability index, and duplicate blocks. Use the ESLint `complexity` rule (e.g. `"complexity": ["error", 15]`) for JS/TS; optional chaining and default parameters add branches.
Process: Analyze before refactoring; run project-wide analysis first, then drill into hotspots. Track trends over time (not one-off). Use `max-depth`, `max-lines`, `max-nested-callbacks`, `max-params`, and `max-statements` alongside `complexity`.
Hacks: Start with project-analyzer output; filter by file type and threshold. Prioritize files with both high complexity and high churn. Disable the complexity rule only if you cannot set a sensible limit; prefer lowering the threshold over disabling.
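The rules above can live together in one ESLint config. A minimal sketch in flat-config style — the thresholds are illustrative defaults, not values mandated by this skill:

```javascript
// Sketch of an ESLint flat-config entry; in a real eslint.config.js this
// array would be the default export. Thresholds are illustrative only.
const config = [
  {
    rules: {
      complexity: ["error", 15],           // cyclomatic complexity ceiling
      "max-depth": ["warn", 4],            // nesting depth
      "max-lines": ["warn", 300],          // file length
      "max-nested-callbacks": ["warn", 3],
      "max-params": ["warn", 4],
      "max-statements": ["warn", 20],
    },
  },
];
```

Per the practice above, prefer tightening a threshold over disabling a rule.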
Certifications & Training
No single certification; the skill aligns with static analysis practice and ESLint complexity rules. ESLint: `complexity` rule, `max-depth`, `max-lines`, `max-params`. Skill data: cyclomatic complexity, LOC, maintainability, duplicates; analyze before refactoring; track hotspots and trends.
Hooks & Workflows
Suggested hooks: Pre-commit or CI: run project-analyzer/doctor for a health check; optionally add a complexity gate. Use with code-reviewer (primary) and developer and qa (secondary).
Workflows: Use with code-reviewer (primary), developer/qa (secondary), and c4-code (primary). Flow: run analyzer → filter hotspots → refactor or add tests. See `code-review-workflow.md`.
Overview
Static code analysis and metrics. 90%+ context savings.
Tools (Progressive Disclosure)
Analysis
| Tool | Description |
|---|---|
| analyze-file | Analyze single file |
| analyze-project | Analyze entire project |
| complexity | Calculate complexity metrics |
Metrics
| Tool | Description |
|---|---|
| loc | Lines of code |
| cyclomatic | Cyclomatic complexity |
| maintainability | Maintainability index |
| duplicates | Find duplicate code |
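To make the cyclomatic metric concrete — complexity is 1 (the entry path) plus one per decision point, and, as the cheat sheet notes, optional chaining and default parameters also count as branches under ESLint's rule. Under that counting, this illustrative function scores 5:

```javascript
// Cyclomatic complexity = 1 (entry path) + 1 per decision point.
function describeUser(user, verbose = false) { // +1 default parameter
  if (user?.name && verbose) {                 // +1 if, +1 &&, +1 ?.
    return `User: ${user.name}`;
  }
  return "anonymous";
}
```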
Reporting
| Tool | Description |
|---|---|
| summary | Get analysis summary |
| hotspots | Find complexity hotspots |
| trends | Analyze metric trends |
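The hotspot idea — prioritize the intersection of high complexity and high churn — can be sketched as a simple ranking. The field names here are hypothetical, not the analyzer's actual output schema:

```javascript
// Rank files by complexity × churn, so a complex-but-frozen file sorts
// below a moderately complex file that changes every week.
const files = [
  { path: "src/billing.js", complexity: 24, commits90d: 31 },
  { path: "src/legacy.js",  complexity: 38, commits90d: 2 },
  { path: "src/utils.js",   complexity: 6,  commits90d: 40 },
];

const hotspots = files
  .map((f) => ({ ...f, score: f.complexity * f.commits90d }))
  .sort((a, b) => b.score - a.score);

console.log(hotspots.map((h) => h.path));
// billing.js (score 744) outranks legacy.js (76) despite lower complexity
```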
Agent Integration
- code-reviewer (primary): Code review
- refactoring-specialist (primary): Tech debt analysis
- architect (secondary): Architecture assessment
Iron Laws
- ALWAYS run project-wide analysis before drilling into individual files — local analysis without context misses which files are actually the highest-priority hotspots; start broad, then focus.
- ALWAYS focus on high-complexity AND high-churn files — a complex but rarely-changed file is lower priority than a moderately complex but frequently-changed one; intersection matters most.
- NEVER set complexity thresholds above 20 — cyclomatic complexity >20 is demonstrably correlated with defects; teams that allow >20 accumulate unmaintainable code without noticing.
- ALWAYS track metrics over time, not just once — a single analysis snapshot is meaningless; track trends weekly to detect gradual degradation before it becomes a crisis.
- NEVER report metrics without actionable next steps — complexity numbers without refactoring targets provide no value; every high-complexity finding must include a specific suggested improvement.
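The trend-tracking law above can be sketched as a comparison across weekly snapshots (the snapshot shape is hypothetical, for illustration only):

```javascript
// Flag files whose complexity rose in every consecutive weekly snapshot —
// the gradual degradation a single point-in-time analysis cannot see.
const weekly = [
  { week: "2024-W01", complexity: { "src/api.js": 12, "src/db.js": 9 } },
  { week: "2024-W02", complexity: { "src/api.js": 14, "src/db.js": 9 } },
  { week: "2024-W03", complexity: { "src/api.js": 17, "src/db.js": 8 } },
];

function isRising(path) {
  const series = weekly.map((s) => s.complexity[path]);
  return series.every((v, i) => i === 0 || v > series[i - 1]);
}

const rising = Object.keys(weekly[0].complexity).filter(isRising);
console.log(rising); // only src/api.js degrades week over week
```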
Anti-Patterns
| Anti-Pattern | Why It Fails | Correct Approach |
|---|---|---|
| Analyzing only changed files | Misses cross-file complexity accumulation | Run project-wide then filter to changed hot spots |
| Ignoring high-complexity files over time | Gradual degradation invisible in point-in-time analysis | Track weekly trends; alert on any increase |
| Complexity threshold >20 | Research shows defect rate spikes sharply above 20 | Set ESLint complexity rule to ≤15 for enforcement |
| Reporting metrics without action items | Metrics without remediation don't reduce complexity | Attach specific refactoring suggestion per hotspot |
| Running analysis once and ignoring results | Technical debt silently accumulates | Schedule automated weekly analysis with trend reports |
Memory Protocol (MANDATORY)
Before starting: read `.claude/context/memory/learnings.md`.
After completing:
- New pattern -> `.claude/context/memory/learnings.md`
- Issue found -> `.claude/context/memory/issues.md`
- Decision made -> `.claude/context/memory/decisions.md`
ASSUME INTERRUPTION: If it's not in memory, it didn't happen.