agent-supply-chain


Agent Supply Chain Integrity


Generate and verify integrity manifests for AI agent plugins and tools. Detect tampering, enforce version pinning, and establish supply chain provenance.

Overview


Agent plugins and MCP servers have the same supply chain risks as npm packages or container images — except the ecosystem has no equivalent of npm provenance, Sigstore, or SLSA. This skill fills that gap.
```
Plugin Directory → Hash All Files (SHA-256) → Generate INTEGRITY.json
Later: Plugin Directory → Re-Hash Files → Compare Against INTEGRITY.json
                                          Match? VERIFIED : TAMPERED
```
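The generate/verify flow above can be sketched end to end with nothing but `hashlib`. This is a toy roundtrip on a throwaway directory, independent of the full implementation below:

```python
import hashlib
import tempfile
from pathlib import Path

# Toy plugin directory with a single file
plugin = Path(tempfile.mkdtemp())
(plugin / "tool.py").write_text("print('hello')\n")

# Generate: record the SHA-256 of every file
manifest = {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in plugin.iterdir() if p.is_file()}

def verify(directory: Path, recorded: dict) -> str:
    """Re-hash each recorded file and compare against the manifest."""
    ok = all(hashlib.sha256((directory / name).read_bytes()).hexdigest() == h
             for name, h in recorded.items())
    return "VERIFIED" if ok else "TAMPERED"

print(verify(plugin, manifest))  # VERIFIED

# Any byte-level change flips the verdict
(plugin / "tool.py").write_text("print('backdoor')\n")
print(verify(plugin, manifest))  # TAMPERED
```

Because the digest depends only on file bytes, restoring the original content restores verification.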

When to Use


  • Before promoting a plugin from development to production
  • During code review of plugin PRs
  • As a CI step to verify no files were modified after review
  • When auditing third-party agent tools or MCP servers
  • Building a plugin marketplace with integrity requirements


Pattern 1: Generate Integrity Manifest


Create a deterministic `INTEGRITY.json` with SHA-256 hashes of all plugin files.
```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

EXCLUDE_DIRS = {".git", "__pycache__", "node_modules", ".venv", ".pytest_cache"}
EXCLUDE_FILES = {".DS_Store", "Thumbs.db", "INTEGRITY.json"}

def hash_file(path: Path) -> str:
    """Compute SHA-256 hex digest of a file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def generate_manifest(plugin_dir: str) -> dict:
    """Generate an integrity manifest for a plugin directory."""
    root = Path(plugin_dir)
    files = {}

    for path in sorted(root.rglob("*")):
        if not path.is_file():
            continue
        if path.name in EXCLUDE_FILES:
            continue
        if any(part in EXCLUDE_DIRS for part in path.relative_to(root).parts):
            continue
        rel = path.relative_to(root).as_posix()
        files[rel] = hash_file(path)

    # Chain hash: SHA-256 of all file hashes concatenated in sorted order
    chain = hashlib.sha256()
    for key in sorted(files.keys()):
        chain.update(files[key].encode("ascii"))

    manifest = {
        "plugin_name": root.name,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "algorithm": "sha256",
        "file_count": len(files),
        "files": files,
        "manifest_hash": chain.hexdigest(),
    }
    return manifest
```

Generate and save


```python
manifest = generate_manifest("my-plugin/")
Path("my-plugin/INTEGRITY.json").write_text(
    json.dumps(manifest, indent=2) + "\n"
)
print(f"Generated manifest: {manifest['file_count']} files, "
      f"hash: {manifest['manifest_hash'][:16]}...")
```

**Output (`INTEGRITY.json`):**
```json
{
  "plugin_name": "my-plugin",
  "generated_at": "2026-04-01T03:00:00+00:00",
  "algorithm": "sha256",
  "file_count": 12,
  "files": {
    ".claude-plugin/plugin.json": "a1b2c3d4...",
    "README.md": "e5f6a7b8...",
    "skills/search/SKILL.md": "c9d0e1f2...",
    "agency.json": "3a4b5c6d..."
  },
  "manifest_hash": "7e8f9a0b1c2d3e4f..."
}
```

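Because `manifest_hash` is derived purely from the `files` map, it can be recomputed at any time to check that the manifest itself was not edited after generation. A minimal sketch (the sample digests below are placeholders, not real hashes):

```python
import hashlib

def recompute_chain(files: dict) -> str:
    """Mirror generate_manifest(): hash the per-file digests in sorted key order."""
    chain = hashlib.sha256()
    for key in sorted(files):
        chain.update(files[key].encode("ascii"))
    return chain.hexdigest()

files = {
    ".claude-plugin/plugin.json": "a1b2c3d4",
    "README.md": "e5f6a7b8",
}
recorded = recompute_chain(files)

# Editing any per-file hash changes the chain hash
tampered = dict(files, **{"README.md": "deadbeef"})
print(recompute_chain(tampered) == recorded)  # False
```

This catches the case where an attacker modifies both a file and its recorded hash but forgets (or is unable) to update `manifest_hash` consistently.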

Pattern 2: Verify Integrity


Check that current files match the manifest.

Requires: hash_file() and generate_manifest() from Pattern 1 above


```python
import json
from pathlib import Path

def verify_manifest(plugin_dir: str) -> tuple[bool, list[str]]:
    """Verify plugin files against INTEGRITY.json."""
    root = Path(plugin_dir)
    manifest_path = root / "INTEGRITY.json"

    if not manifest_path.exists():
        return False, ["INTEGRITY.json not found"]

    manifest = json.loads(manifest_path.read_text())
    recorded = manifest.get("files", {})
    errors = []

    # Check recorded files
    for rel_path, expected_hash in recorded.items():
        full = root / rel_path
        if not full.exists():
            errors.append(f"MISSING: {rel_path}")
            continue
        actual = hash_file(full)
        if actual != expected_hash:
            errors.append(f"MODIFIED: {rel_path}")

    # Check for new untracked files
    current = generate_manifest(plugin_dir)
    for rel_path in current["files"]:
        if rel_path not in recorded:
            errors.append(f"UNTRACKED: {rel_path}")

    return len(errors) == 0, errors
```

Verify


```python
passed, errors = verify_manifest("my-plugin/")
if passed:
    print("VERIFIED: All files match manifest")
else:
    print(f"FAILED: {len(errors)} issue(s)")
    for e in errors:
        print(f"  {e}")
```

**Output on tampered plugin:**
```
FAILED: 3 issue(s)
  MODIFIED: skills/search/SKILL.md
  MISSING: agency.json
  UNTRACKED: backdoor.py
```

---

Pattern 3: Dependency Version Audit


Check that agent dependencies use pinned versions.
```python
import json
from pathlib import Path

def audit_versions(config_path: str) -> list[dict]:
    """Audit dependency version pinning in a config file."""
    findings = []
    path = Path(config_path)
    content = path.read_text()

    if path.name == "package.json":
        data = json.loads(content)
        for section in ("dependencies", "devDependencies"):
            for pkg, ver in data.get(section, {}).items():
                if ver.startswith(("^", "~")) or ver in ("*", "latest"):
                    findings.append({
                        "package": pkg,
                        "version": ver,
                        "severity": "HIGH" if ver in ("*", "latest") else "MEDIUM",
                        "fix": f'Pin to exact: "{pkg}": "{ver.lstrip("^~")}"'
                    })

    elif path.name in ("requirements.txt", "pyproject.toml"):
        # Simple line-based heuristic: flag lower bounds with no upper bound
        for line in content.splitlines():
            line = line.strip()
            if ">=" in line and "<" not in line:
                findings.append({
                    "package": line.split(">=")[0].strip(),
                    "version": line,
                    "severity": "MEDIUM",
                    "fix": f"Add upper bound: {line},<next_major"
                })

    return findings
```
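A quick demo of the same pinning check on a sample `package.json` (the package names and versions are made up for illustration; the check is inlined so the snippet stands alone):

```python
import json
import tempfile
from pathlib import Path

# Sample manifest with one floating and one pinned dependency
sample = {"dependencies": {"left-pad": "^1.3.0", "lodash": "4.17.21"}}
path = Path(tempfile.mkdtemp()) / "package.json"
path.write_text(json.dumps(sample))

# The core package.json pinning check from audit_versions(), inlined
data = json.loads(path.read_text())
findings = [
    (pkg, ver)
    for pkg, ver in data.get("dependencies", {}).items()
    if ver.startswith(("^", "~")) or ver in ("*", "latest")
]
print(findings)  # [('left-pad', '^1.3.0')]
```

Only the caret range is flagged; the exact version passes untouched.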


Pattern 4: Promotion Gate


Use integrity verification as a gate before promoting plugins.
```python
def promotion_check(plugin_dir: str) -> dict:
    """Check if a plugin is ready for production promotion."""
    checks = {}

    # 1. Integrity manifest exists and verifies
    passed, errors = verify_manifest(plugin_dir)
    checks["integrity"] = {
        "passed": passed,
        "errors": errors
    }

    # 2. Required files exist
    root = Path(plugin_dir)
    required = ["README.md"]
    missing = [f for f in required if not (root / f).exists()]

    # Require at least one plugin manifest (supports both layouts)
    manifest_paths = [
        root / ".github/plugin/plugin.json",
        root / ".claude-plugin/plugin.json",
    ]
    if not any(p.exists() for p in manifest_paths):
        missing.append(".github/plugin/plugin.json (or .claude-plugin/plugin.json)")

    checks["required_files"] = {
        "passed": len(missing) == 0,
        "missing": missing
    }

    # 3. No unpinned dependencies
    mcp_path = root / ".mcp.json"
    if mcp_path.exists():
        config = json.loads(mcp_path.read_text())
        unpinned = []
        for server in config.get("mcpServers", {}).values():
            if isinstance(server, dict):
                for arg in server.get("args", []):
                    if isinstance(arg, str) and "@latest" in arg:
                        unpinned.append(arg)
        checks["pinned_deps"] = {
            "passed": len(unpinned) == 0,
            "unpinned": unpinned
        }

    # Overall
    all_passed = all(c["passed"] for c in checks.values())
    return {"ready": all_passed, "checks": checks}

result = promotion_check("my-plugin/")
if result["ready"]:
    print("Plugin is ready for production promotion")
else:
    print("Plugin NOT ready:")
    for name, check in result["checks"].items():
        if not check["passed"]:
            print(f"  FAILED: {name}")
```


CI Integration


Add to your GitHub Actions workflow:
```yaml
- name: Verify plugin integrity
  run: |
    PLUGIN_DIR="${{ matrix.plugin || '.' }}"
    cd "$PLUGIN_DIR"
    python -c "
    from pathlib import Path
    import json, hashlib, sys

    def hash_file(p):
        h = hashlib.sha256()
        with open(p, 'rb') as f:
            for c in iter(lambda: f.read(8192), b''):
                h.update(c)
        return h.hexdigest()

    manifest = json.loads(Path('INTEGRITY.json').read_text())
    errors = []
    for rel, expected in manifest['files'].items():
        p = Path(rel)
        if not p.exists():
            errors.append(f'MISSING: {rel}')
        elif hash_file(p) != expected:
            errors.append(f'MODIFIED: {rel}')
    if errors:
        for e in errors:
            print(f'::error::{e}')
        sys.exit(1)
    print(f'Verified {len(manifest[\"files\"])} files')
    "
```


Best Practices


| Practice | Rationale |
| --- | --- |
| Generate manifest after code review | Ensures reviewed code matches production code |
| Include manifest in the PR | Reviewers can verify what was hashed |
| Verify in CI before deploy | Catches post-review modifications |
| Chain hash for tamper evidence | Single hash represents entire plugin state |
| Exclude build artifacts | Only hash source files; `.git`, `__pycache__`, `node_modules` excluded |
| Pin all dependency versions | Unpinned deps mean different code on every install |


Related Resources
