# Buildkite Pipelines

Pipeline YAML is the core of Buildkite CI/CD. This skill covers writing, optimizing, and troubleshooting `.buildkite/pipeline.yml` — step types, caching, parallelism, annotations, retry, dynamic pipelines, matrix builds, plugins, notifications, artifacts, and concurrency.

## Quick Start

Create `.buildkite/pipeline.yml` in the repository root:

```yaml
steps:
  - label: ":hammer: Tests"
    command: "npm test"
    artifact_paths: "coverage/**/*"
  - wait
  - label: ":rocket: Deploy"
    command: "scripts/deploy.sh"
    branches: "main"
```

Set the pipeline's initial command in Buildkite to upload this file:
```yaml
steps:
  - label: ":pipeline: Upload"
    command: buildkite-agent pipeline upload
```

The agent reads `.buildkite/pipeline.yml` and uploads the steps to Buildkite for execution. Buildkite looks for `.buildkite/pipeline.yml` by default; override the path with `buildkite-agent pipeline upload path/to/other.yml`.

For creating pipelines programmatically, see the buildkite-api skill. For agent and queue setup, see the buildkite-agent-infrastructure skill.
## Step Types

| Type | Purpose | Minimal syntax |
|---|---|---|
| command | Run a shell command | `- command: "make test"` |
| wait | Block until all previous steps pass | `- wait` |
| block | Pause for manual approval | `- block: ":rocket: Release"` |
| trigger | Start a build on another pipeline | `- trigger: "deploy-pipeline"` |
| group | Visually group steps (collapsible) | `- group: "Tests"` |
| input | Collect user input before continuing | `- input: "Release details"` |

For detailed attributes and advanced examples of each step type, see `references/step-types-reference.md`.
## Caching
Caching dependencies is the single highest-impact optimization. Use the cache plugin with manifest-based invalidation:
```yaml
steps:
  - label: ":nodejs: Test"
    command: "npm ci && npm test"
    plugins:
      - cache#v1.8.1:
          paths:
            - "node_modules/"
          manifest: "package-lock.json"
```

The cache key derives from the manifest file hash. When `package-lock.json` changes, the cache rebuilds.

Hosted agents also support a built-in `cache` key (no plugin needed):
```yaml
steps:
  - label: ":nodejs: Test"
    command: "npm ci && npm test"
    cache:
      paths:
        - "node_modules/"
      key: "v1-deps-{{ checksum 'package-lock.json' }}"
```

Hosted agent setup and instance shapes are covered by the buildkite-agent-infrastructure skill.
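The checksum-keyed behavior above can be sketched locally in shell (a sketch assuming `sha256sum` is available; the agent's own checksum implementation may differ):

```shell
#!/bin/bash
set -euo pipefail

# Derive a cache key from a lockfile's content hash, mirroring
# the "v1-deps-{{ checksum 'package-lock.json' }}" pattern.
cache_key() {
  local manifest="$1"
  printf 'v1-deps-%s\n' "$(sha256sum "$manifest" | cut -d' ' -f1)"
}
```

Identical lockfiles yield the same key, so builds keep hitting the same cache entry until the lockfile actually changes.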
## Fast-Fail and Non-Blocking Steps
Cancel remaining jobs immediately when any job fails:
```yaml
steps:
  - label: ":rspec: Tests"
    command: "bundle exec rspec"
    cancel_on_build_failing: true
```

Use `soft_fail` for steps that should not block the build (security scans, linting, coverage):
```yaml
steps:
  - label: ":shield: Security Scan"
    command: "scripts/security-scan.sh"
    soft_fail:
      - exit_status: 1
```

A soft-failed step shows as a warning in the UI but does not fail the build. Combine with `continue_on_failure: true` on a wait step to let downstream steps run regardless.
## Parallelism and Dependencies

### Parallel execution
Steps at the same level run in parallel by default. Use `parallelism` to fan out a single step:

```yaml
steps:
  - label: ":rspec: Tests %n"
    command: "bundle exec rspec"
    parallelism: 10
```

This creates 10 parallel jobs. Each receives `BUILDKITE_PARALLEL_JOB` (0-9) and `BUILDKITE_PARALLEL_JOB_COUNT` (10) as environment variables for splitting work.

For intelligent test splitting based on timing data, see the buildkite-test-engine skill.
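The split itself can be sketched in a few lines of shell; the `tests/*_test.sh` layout here is hypothetical:

```shell
#!/bin/bash
set -euo pipefail
shopt -s nullglob

# Run every COUNT-th test file, offset by this job's index.
run_shard() {
  local job="${BUILDKITE_PARALLEL_JOB:-0}"
  local count="${BUILDKITE_PARALLEL_JOB_COUNT:-1}"
  local i=0 f
  for f in tests/*_test.sh; do
    if [ $((i % count)) -eq "$job" ]; then
      echo "running $f"
      bash "$f"
    fi
    i=$((i + 1))
  done
}
```

Round-robin by index is simple but ignores test duration; timing-based splitting (see buildkite-test-engine) avoids one slow shard dominating the wall clock.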
### Explicit dependencies
Use `depends_on` to express step-level dependencies without `wait`:

```yaml
steps:
  - label: "Build"
    key: "build"
    command: "make build"
  - label: "Unit Tests"
    depends_on: "build"
    command: "make test-unit"
  - label: "Integration Tests"
    depends_on: "build"
    command: "make test-integration"
```

Unit and integration tests run in parallel after the `build` step completes — no `wait` step needed.
wait使用 定义步骤级依赖,无需使用 :
depends_onwaityaml
steps:
- label: "Build"
key: "build"
command: "make build"
- label: "Unit Tests"
depends_on: "build"
command: "make test-unit"
- label: "Integration Tests"
depends_on: "build"
command: "make test-integration"单元测试和集成测试会在构建完成后并行运行——无需 wait 步骤。
Annotations
注释
Surface build results directly on the build page using `buildkite-agent annotate`. Annotations support Markdown and HTML.

```yaml
steps:
  - label: ":test_tube: Tests"
    command: |
      if ! make test 2>&1 | tee test-output.txt; then
        buildkite-agent annotate --style "error" --context "test-failures" < test-output.txt
        exit 1
      fi
      buildkite-agent annotate "All tests passed :white_check_mark:" --style "success" --context "test-results"
```

| Flag | Default | Description |
|---|---|---|
| `--style` | `default` | Visual style: `success`, `info`, `warning`, `error` |
| `--context` | random | Unique ID — reusing a context replaces the annotation |
| `--append` | off | Append to existing annotation with the same context |
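The Markdown body can be built by any ordinary script before handing it to the agent. A minimal sketch (the `FAIL` log convention and file names are hypothetical):

```shell
#!/bin/bash
set -euo pipefail

# Summarize a test log as Markdown suitable for an annotation body.
annotation_body() {
  local log="$1"
  local fails
  fails=$(grep -c '^FAIL' "$log" || true)
  printf '## Test summary\n\n%s failing test(s)\n' "$fails"
}

# In CI you would pipe the result into the agent, e.g.:
#   annotation_body test-output.txt | buildkite-agent annotate --style "error" --context "tests"
```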
Link to uploaded artifacts in annotations:
```yaml
- command: |
    buildkite-agent artifact upload "coverage/*"
    buildkite-agent annotate --style "info" 'Coverage: <a href="artifact://coverage/index.html">view report</a>'
```

## Retry

### Automatic retry
Retry transient failures by exit status:
```yaml
steps:
  - label: ":hammer: Build"
    command: "make build"
    retry:
      automatic:
        - exit_status: -1    # Agent lost
          limit: 2
        - exit_status: 143   # SIGTERM (spot instance termination)
          limit: 2
        - exit_status: 255   # Timeout or SSH failure
          limit: 2
        - exit_status: "*"   # Any non-zero exit
          limit: 1
```

### Manual retry
Control whether manual retries are allowed:
```yaml
retry:
  manual:
    allowed: false
    reason: "Deployment steps cannot be retried"
```

For comprehensive exit code tables and retry strategy recommendations, see `references/retry-and-error-codes.md`.

## Dynamic Pipelines
Generate pipeline steps at runtime based on repository state. Upload the generated YAML with `buildkite-agent pipeline upload`:

```yaml
steps:
  - label: ":pipeline: Generate"
    command: |
      .buildkite/generate-pipeline.sh | buildkite-agent pipeline upload
```

### When to use what
Pipelines exist on a spectrum. Pick the simplest option that does the job:
| Situation | Approach |
|---|---|
| Same steps every build, branch-level filtering at most | Static YAML |
| Org-wide enforcement of pipeline structure, admin-controlled (Enterprise plan) | Pipeline templates |
| Reusable, vetted logic (caching, Docker, artifact transfer) shared across many pipelines | Pinned plugin |
| Skip steps when specific files haven't changed | `if_changed` |
| Monorepo with separate pipelines per service | Trigger steps |
| Combine several of the above approaches | Dynamic generation |
| Apply consistent retry / timeout / env config across many pipelines | Dynamic (shared config) |
| Calculate test shards or matrix combos at runtime | Dynamic (often SDK) |
| Monorepo with transitive dependencies between services | Dynamic (custom dep graph) |
| Recover from infra failures (OOM → bigger agent) | Dynamic (retry on larger agents) |
| Steps depend on output from previous steps (multi-stage) | Dynamic, often with meta-data |
| Cleanup / teardown step that must run regardless of earlier failures | Dynamic (`pre-exit` hook) |
| Fallback step only when the primary step fails | Dynamic (conditional upload) |
| Pipeline YAML has outgrown what the team can maintain | Dynamic (SDK in Python / TS / Go / Ruby) |
### Don't reach for dynamic pipelines for the wrong job
Dynamic generation is the right tool when the steps themselves need to change. For passing data between steps, simpler primitives exist:
- `buildkite-agent meta-data set/get` — small key-value pairs any later step in the same build can read (a version string, a commit SHA, a feature flag).
- Artifacts — files passed between steps (`buildkite-agent artifact upload/download`).
- Trigger step `env:` — environment variables passed to a build in a different pipeline.

If only data needs to move, meta-data or artifacts are simpler and safer. See the buildkite-agent-runtime skill.
### Bootstrap script
Always start generator scripts with `set -euo pipefail`. Without `pipefail`, a generator that fails while piping into `pipeline upload` returns the exit code of the last piped command, the build step reports success, and no generated steps appear — the most common dynamic pipeline failure mode.

Example generator that runs tests only for changed services:

```bash
#!/bin/bash
set -euo pipefail
CHANGED=$(git diff --name-only HEAD~1)
cat <<YAML
steps:
YAML
for dir in services/*/; do
  svc=$(basename "$dir")
  if echo "$CHANGED" | grep -q "^services/$svc/"; then
    cat <<YAML
  - label: ":test_tube: $svc"
    command: "cd services/$svc && make test"
    key: "test-$svc"
YAML
  fi
done
```

Set `key:` on every generated step. It enables `depends_on`, makes retries idempotent (`DuplicateKeyError` blocks silent duplication if the upload step re-runs), and gives stable identifiers across builds. Validate locally with `buildkite-agent pipeline upload --dry-run` before pushing.

Keep uploads under 500 steps per call and 4,000 jobs per build (platform defaults, raisable via support). For larger monorepos, use trigger steps to fan out across separate builds.

For type-checked, unit-testable generators, the Buildkite SDK supports JavaScript/TypeScript, Python, Go, and Ruby. Wrap related steps in group steps once a generator produces more than ~10 steps — adding any group enables DAG mode for the build, and `concurrency` attributes are rejected on groups (see `references/group-steps.md`).

A generator step can also read runtime state (meta-data, artifacts, git diff) and upload the next phase of the pipeline — the handler pattern used by multi-stage builds. For this, fan-out/fan-in, and finalizer / always-run steps via `pre-exit` hooks, see `references/dynamic-pipeline-patterns.md`. For failure modes, see `references/dynamic-pipeline-troubleshooting.md`. For advanced generator patterns (Python, monorepo, multi-stage), see `references/advanced-patterns.md`.
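The `pipefail` failure mode is easy to demonstrate in isolation:

```shell
#!/bin/bash

# Without pipefail, a pipe's exit status is the last command's,
# so a failing producer piped into a consumer is masked.
set +o pipefail
false | cat && echo "without pipefail: pipe reported success"

# With pipefail, any failing command fails the whole pipe.
set -o pipefail
if ! false | cat; then
  echo "with pipefail: pipe reported failure"
fi
```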
## Conditional Execution

### Step-level conditions
Use `if` to conditionally run steps based on build state:

```yaml
steps:
  - label: ":rocket: Deploy"
    command: "scripts/deploy.sh"
    if: build.branch == "main" && build.message !~ /\[skip deploy\]/
```

Buildkite also skips builds natively when the commit message contains `[skip ci]` or `[ci skip]`. For the full list of condition expressions, see Conditionals.
[skip ci][skip ci][ci skip]Directory-based step filtering (if_changed)
基于目录的步骤过滤(if_changed)
Skip steps when relevant files haven't changed. Only applied by the Buildkite agent when uploading a pipeline. See https://buildkite.com/docs/pipelines/configure/dynamic-pipelines/if-changed.md.
```yaml
steps:
  - label: ":nodejs: Frontend tests"
    command: "npm test"
    if_changed:
      - "src/frontend/**"
      - "package.json"
```

For exclude patterns and monorepo configurations, see `references/advanced-patterns.md`.

For large monorepos, use the Sparse Checkout plugin to check out only `.buildkite/` for the upload step — dramatically faster pipeline uploads.

### Conditionally running plugins
Step-level `if` does not prevent plugins from executing. Wrap steps in a `group` to skip plugins entirely:

```yaml
steps:
  - group: ":docker: Build"
    if: build.env("DOCKER_PASSWORD") != null
    steps:
      - label: "Build image"
        command: "docker build -t myapp ."
        plugins:
          - docker-login#v2.1.0:
              username: myuser
              password-env: DOCKER_PASSWORD
```

## Matrix Builds
Run the same step across multiple configurations:
```yaml
steps:
  - label: "Test {{matrix.ruby}} on {{matrix.os}}"
    command: "bundle exec rake test"
    matrix:
      setup:
        ruby:
          - "3.2"
          - "3.3"
        os:
          - "ubuntu"
          - "alpine"
      adjustments:
        - with:
            ruby: "3.2"
            os: "alpine"
          skip: true  # Known incompatible
```

Valid properties inside each `adjustments` entry: `with`, `skip`, `soft_fail`, `env`. The `agents:` key is not valid inside `adjustments` — Buildkite rejects the pipeline with "agents is not a valid property on the matrix.adjustments configuration". To route matrix combinations to different queues (e.g., Linux vs Windows agents), use separate steps or a dynamic pipeline generator.
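The expansion the agent performs, a cross product minus skipped adjustments, can be sketched in shell for intuition (illustration only, not agent behavior):

```shell
#!/bin/bash
set -euo pipefail

rubies=("3.2" "3.3")
oses=("ubuntu" "alpine")

# Emit one job label per matrix combination, honoring the
# skip adjustment for ruby 3.2 on alpine.
emit_jobs() {
  local ruby os
  for ruby in "${rubies[@]}"; do
    for os in "${oses[@]}"; do
      if [ "$ruby" = "3.2" ] && [ "$os" = "alpine" ]; then
        continue
      fi
      echo "Test $ruby on $os"
    done
  done
}
```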
adjustmentswithskipsoft_failenvagents:adjustments在多种配置下运行相同步骤:
yaml
steps:
- label: "Test {{matrix.ruby}} on {{matrix.os}}"
command: "bundle exec rake test"
matrix:
setup:
ruby:
- "3.2"
- "3.3"
os:
- "ubuntu"
- "alpine"
adjustments:
- with:
ruby: "3.2"
os: "alpine"
skip: true # 已知不兼容adjustmentswithskipsoft_failenvagents:adjustmentsPlugins
插件
Add capabilities with 3-line YAML blocks. Pin versions for reproducibility:
```yaml
plugins:
  - docker-compose#v5.5.0:
      run: app
      config: docker-compose.ci.yml
```

| Plugin | Purpose |
|---|---|
| `cache` | Dependency caching with manifest-based invalidation |
| `docker` | Run steps inside a Docker container |
| `docker-compose` | Build and run with Docker Compose |
| `artifacts` | Download artifacts between steps |
| `test-collector` | Upload test results to Test Engine |

Always pin plugin versions (e.g., `docker#v5.12.0`, not `docker#v5`). Unpinned versions can break builds when plugins release new major versions.

For private organizational plugins, use full Git URLs — the shorthand syntax only works for public plugins:

```yaml
plugins:
  - ssh://git@github.com/my-org/my-plugin.git#v1.0.0:
      config: value
```
## Notifications and Artifacts

Add pipeline-level `notify:` above `steps:` to send Slack, email, or webhook notifications on build state changes. See Notifications for syntax.

### Artifact upload and download

Upload artifacts from steps and download them in later steps:

```yaml
steps:
  - label: "Build"
    command: "make build"
    artifact_paths: "dist/**/*"
  - wait
  - label: "Package"
    command: |
      buildkite-agent artifact download "dist/*" .
      make package
```

When using artifacts in a Docker build, download them before running `docker build`, since `buildkite-agent` is not available inside the container:
```yaml
steps:
  - label: "Docker build"
    command: |
      buildkite-agent artifact download "dist/*" .
      docker build -t myapp .
```

## Concurrency
Limit parallel execution of steps sharing a resource. Always pair `concurrency` with `concurrency_group` — without a group name, the limit is silently ignored.

```yaml
steps:
  - label: ":rocket: Deploy"
    command: "scripts/deploy.sh"
    concurrency: 1
    concurrency_group: "deploy/production"
    concurrency_method: "eager"
```

Use `concurrency_method: "eager"` (next available) for independent jobs like deploys. Use the default `"ordered"` (FIFO) when execution order matters. Set `priority` (default `0`, higher runs first) to control which queued jobs run next.

For full concurrency configuration options, see Controlling Concurrency.

For triggering, watching, and debugging pipelines from the terminal, see the buildkite-cli skill.

## Common Mistakes
| Mistake | What happens | Fix |
|---|---|---|
| Missing `wait` between dependent steps | Steps run in parallel; the second step fails because the first hasn't finished | Add `wait` or `depends_on` |
| Using only `wait` for all dependencies | Valid but non-idiomatic; serializes unrelated steps | Give named steps a `key` and use `depends_on` |
| No `cache` on dependency-install steps | Dependencies reinstalled from scratch on every build, slowing builds and inflating costs | Add the `cache` plugin keyed on the lockfile |
| Using step-level `if` to skip plugins | Plugins still execute (they run before the `command`) | Wrap the step in a `group` with the `if` |
| Not pinning plugin versions | Builds break when a plugin releases a breaking change | Always use full semver: `docker#v5.12.0` |
| Forgetting `concurrency_group` | The `concurrency` limit is silently ignored | Always pair `concurrency` with `concurrency_group` |
| Incorrect `artifact_paths` glob | Artifacts silently not uploaded, downstream steps fail | Test the glob pattern locally; use `**` for nested directories |
| Hardcoding parallel job split logic | Uneven test distribution, one slow job blocks the build | Use timing-based splitting (buildkite-test-engine) |
| Inline secrets in pipeline YAML | Secrets visible in build logs and Buildkite UI | Use cluster secrets or agent environment hooks |
| Using `exit_status: "*"` with a high retry limit | Genuine bugs retry repeatedly, wasting compute | Target specific exit codes; keep the wildcard limit at 1 |
| Using `agents` inside `matrix.adjustments` | Pipeline upload fails: "agents is not a valid property on the matrix.adjustments configuration" | Remove `agents` from `adjustments`; use separate steps or a dynamic generator |
| Build fails but all visible steps passed | A trigger step started a child pipeline that failed, or a step was cancelled rather than unblocked | Check the triggered pipeline's build status; inspect block steps for cancellations |
| Pipeline upload fails with no clear error | YAML syntax error or agent-side issue not shown in build logs | Validate YAML locally; check agent logs on the host machine; run `buildkite-agent pipeline upload --dry-run` |
| Fork builds enabled on public pipelines | Contributors can modify `.buildkite/pipeline.yml` in a PR to run arbitrary commands | Disable fork builds in pipeline settings for public repos; use a separate pipeline for external PRs with no secret access |
| Docker Compose steps produce artifacts but agent can't find them | Files created inside containers are invisible to the host agent | Mount the working directory as a volume in the compose file |
| Dynamic pipeline generates 1000+ steps | UI becomes slow, pipeline processing degrades | Keep generated pipelines under ~500 steps; use orchestrator pipelines with trigger steps for larger monorepos |
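The `artifact_paths` glob row above can be sanity-checked locally. Bash's `globstar` approximates (but does not exactly match) the agent's `**` semantics:

```shell
#!/bin/bash
set -euo pipefail
shopt -s globstar nullglob

# Expand the glob the same way you would write it in artifact_paths.
matches=(dist/**/*)
echo "matched ${#matches[@]} paths"
```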
## Additional Resources

### Reference Files
- `references/step-types-reference.md` — Detailed attribute tables for all step types
- `references/advanced-patterns.md` — Dynamic pipeline generators, matrix adjustments, monorepo patterns, multi-stage pipelines
- `references/retry-and-error-codes.md` — Comprehensive exit code table, retry strategies by failure type
- `references/group-steps.md` — Group step attributes, DAG mode, merging across uploads, no-nesting workaround, job-limit impact
- `references/dynamic-pipeline-troubleshooting.md` — Silent upload failures, quota limits, env var interpolation, duplicate-on-retry, retry storms
- `references/dynamic-pipeline-patterns.md` — Fan-out/fan-in, SDK generation, the handler pattern, finalizer steps, trigger-based fan-out

### Examples
- `examples/basic-pipeline.yml` — Minimal working pipeline (test, wait, deploy)
- `examples/optimized-pipeline.yml` — Full-featured pipeline with caching, parallelism, annotations, retry, artifacts, and notifications
For migrating pipelines from other CI systems, see the buildkite-migration skill.