# Buildkite Pipelines

Pipeline YAML is the core of Buildkite CI/CD. This skill covers writing, optimizing, and troubleshooting `.buildkite/pipeline.yml` — step types, caching, parallelism, annotations, retry, dynamic pipelines, matrix builds, plugins, notifications, artifacts, and concurrency.

## Quick Start

Create `.buildkite/pipeline.yml` in the repository root:

```yaml
steps:
  - label: ":hammer: Tests"
    command: "npm test"
    artifact_paths: "coverage/**/*"

  - wait

  - label: ":rocket: Deploy"
    command: "scripts/deploy.sh"
    branches: "main"
```
Set the pipeline's initial command in Buildkite to upload this file:

```yaml
steps:
  - label: ":pipeline: Upload"
    command: buildkite-agent pipeline upload
```
The agent reads `.buildkite/pipeline.yml` and uploads the steps to Buildkite for execution. Buildkite looks for `.buildkite/pipeline.yml` by default; override the path with `buildkite-agent pipeline upload path/to/other.yml`.

For creating pipelines programmatically, see the buildkite-api skill. For agent and queue setup, see the buildkite-agent-infrastructure skill.

## Step Types

| Type | Purpose | Minimal syntax |
| --- | --- | --- |
| `command` | Run a shell command | `- command: "make test"` |
| `wait` | Block until all previous steps pass | `- wait` |
| `block` | Pause for manual approval | `- block: ":shipit: Release"` |
| `trigger` | Start a build on another pipeline | `- trigger: "deploy-pipeline"` |
| `group` | Visually group steps (collapsible) | `- group: "Tests"` with nested `steps:` |
| `input` | Collect user input before continuing | `- input: "Release version"` with `fields:` |

For detailed attributes and advanced examples of each step type, see `references/step-types-reference.md`.
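
These step types compose. A minimal sketch of a release flow that chains a group, a manual gate, and a trigger (the pipeline slug and labels are illustrative):

```yaml
steps:
  - group: ":test_tube: Tests"
    steps:
      - command: "make test-unit"
      - command: "make test-integration"

  - block: ":shipit: Release"
    prompt: "Ship this build?"

  # Start a build on a hypothetical downstream pipeline at the same commit.
  - trigger: "deploy-pipeline"
    build:
      commit: "${BUILDKITE_COMMIT}"
      branch: "${BUILDKITE_BRANCH}"
```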

## Caching

Caching dependencies is the single highest-impact optimization. Use the cache plugin with manifest-based invalidation:

```yaml
steps:
  - label: ":nodejs: Test"
    command: "npm ci && npm test"
    plugins:
      - cache#v1.8.1:
          paths:
            - "node_modules/"
          manifest: "package-lock.json"
```
The cache key derives from the manifest file hash. When `package-lock.json` changes, the cache rebuilds.

Hosted agents also support a built-in `cache` key (no plugin needed):

```yaml
steps:
  - label: ":nodejs: Test"
    command: "npm ci && npm test"
    cache:
      paths:
        - "node_modules/"
      key: "v1-deps-{{ checksum 'package-lock.json' }}"
```
Hosted agent setup and instance shapes are covered by the buildkite-agent-infrastructure skill.

## Fast-Fail and Non-Blocking Steps

Cancel remaining jobs immediately when any job fails:

```yaml
steps:
  - label: ":rspec: Tests"
    command: "bundle exec rspec"
    cancel_on_build_failing: true
```
Use `soft_fail` for steps that should not block the build (security scans, linting, coverage):

```yaml
steps:
  - label: ":shield: Security Scan"
    command: "scripts/security-scan.sh"
    soft_fail:
      - exit_status: 1
```

A soft-failed step shows as a warning in the UI but does not fail the build. Combine with `continue_on_failure: true` on a wait step to let downstream steps run regardless.
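
A minimal sketch of that combination (the labels and scripts are illustrative): the cleanup step runs even when the tests fail.

```yaml
steps:
  - label: ":test_tube: Tests"
    command: "make test"

  # Let steps after this barrier run even if earlier steps failed.
  - wait: ~
    continue_on_failure: true

  - label: ":broom: Cleanup"
    command: "scripts/cleanup.sh"
```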

## Parallelism and Dependencies

### Parallel execution

Steps at the same level run in parallel by default. Use `parallelism` to fan out a single step:

```yaml
steps:
  - label: ":rspec: Tests %n"
    command: "bundle exec rspec"
    parallelism: 10
```
This creates 10 parallel jobs. Each receives `BUILDKITE_PARALLEL_JOB` (0-9) and `BUILDKITE_PARALLEL_JOB_COUNT` (10) as environment variables for splitting work.

For intelligent test splitting based on timing data, see the buildkite-test-engine skill.
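
A naive even split over those two variables can be sketched in shell (the `split_files` helper and the file names are illustrative, not a Buildkite API):

```shell
#!/bin/sh
# Zero-based round-robin split: job i of n takes every nth line of the file list.
split_files() {
  job_index="$1"; job_count="$2"
  awk -v n="$job_count" -v i="$job_index" '(NR - 1) % n == i'
}

# In a real step you would feed it the agent-provided variables:
#   find spec -name '*_spec.rb' | sort \
#     | split_files "$BUILDKITE_PARALLEL_JOB" "$BUILDKITE_PARALLEL_JOB_COUNT" \
#     | xargs bundle exec rspec
printf '%s\n' a_spec.rb b_spec.rb c_spec.rb d_spec.rb | split_files 0 2
```

Sorting the file list first matters: every job must see the same ordering for the modulo split to partition the files without overlap.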

### Explicit dependencies

Use `depends_on` to express step-level dependencies without `wait`:

```yaml
steps:
  - label: "Build"
    key: "build"
    command: "make build"

  - label: "Unit Tests"
    depends_on: "build"
    command: "make test-unit"

  - label: "Integration Tests"
    depends_on: "build"
    command: "make test-integration"
```
Unit and integration tests run in parallel after the build completes — no `wait` step needed.

## Annotations

Surface build results directly on the build page using `buildkite-agent annotate`. It supports Markdown and HTML.

```yaml
steps:
  - label: ":test_tube: Tests"
    command: |
      if ! make test 2>&1 | tee test-output.txt; then
        buildkite-agent annotate --style "error" --context "test-failures" < test-output.txt
        exit 1
      fi
      buildkite-agent annotate "All tests passed :white_check_mark:" --style "success" --context "test-results"
```
| Flag | Default | Description |
| --- | --- | --- |
| `--style` | `default` | Visual style: `default`, `info`, `warning`, `error`, `success` |
| `--context` | random | Unique ID — reusing a context replaces the annotation |
| `--append` | `false` | Append to the existing annotation with the same context |
Link to uploaded artifacts in annotations:

```yaml
- command: |
    buildkite-agent artifact upload "coverage/*"
    buildkite-agent annotate --style "info" 'Coverage: <a href="artifact://coverage/index.html">view report</a>'
```
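
`--append` is useful when parallel jobs contribute to one annotation: each job adds its findings under a shared context. A sketch (labels and commands are illustrative):

```yaml
steps:
  - label: ":mag: Lint %n"
    parallelism: 4
    command: |
      if ! make lint > lint-output.txt 2>&1; then
        # Each failing job appends its output to the shared "lint" annotation
        # instead of replacing what other jobs already wrote.
        buildkite-agent annotate --append --style "warning" --context "lint" < lint-output.txt
        exit 1
      fi
```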

## Retry

### Automatic retry

Retry transient failures by exit status:

```yaml
steps:
  - label: ":hammer: Build"
    command: "make build"
    retry:
      automatic:
        - exit_status: -1    # Agent lost
          limit: 2
        - exit_status: 143   # SIGTERM (spot instance termination)
          limit: 2
        - exit_status: 255   # Timeout or SSH failure
          limit: 2
        - exit_status: "*"   # Any non-zero exit
          limit: 1
```

### Manual retry

Control whether manual retries are allowed:

```yaml
retry:
  manual:
    allowed: false
    reason: "Deployment steps cannot be retried"
```

For comprehensive exit code tables and retry strategy recommendations, see `references/retry-and-error-codes.md`.

## Dynamic Pipelines

Generate pipeline steps at runtime based on repository state. Upload generated YAML with `buildkite-agent pipeline upload`:

```yaml
steps:
  - label: ":pipeline: Generate"
    command: |
      .buildkite/generate-pipeline.sh | buildkite-agent pipeline upload
```

### When to use what

Pipelines exist on a spectrum. Pick the simplest option that does the job:

| Situation | Approach |
| --- | --- |
| Same steps every build, branch-level filtering at most | Static YAML |
| Org-wide enforcement of pipeline structure, admin-controlled (Enterprise plan) | Pipeline templates |
| Reusable, vetted logic (caching, Docker, artifact transfer) shared across many pipelines | Pinned plugin |
| Skip steps when specific files haven't changed | `if_changed` |
| Monorepo with separate pipelines per service | `monorepo-diff` plugin |
| Combine `if` and `if_changed` with OR logic | Dynamic generation |
| Apply consistent retry / timeout / env config across many pipelines | Dynamic (shared config) |
| Calculate test shards, matrix combos, `parallelism × matrix` at runtime | Dynamic (often SDK) |
| Monorepo with transitive dependencies between services | Dynamic (custom dep graph) |
| Recover from infra failures (OOM → bigger agent) | Dynamic (`pre-exit` hook) |
| Steps depend on output from previous steps (multi-stage) | Dynamic, often `--replace` or chained uploads |
| Cleanup / teardown step that must run regardless of earlier failures | Dynamic (`pre-exit` uploads a finalizer) |
| Fallback step only when the primary step fails | Dynamic (`pre-exit` checking exit status) |
| Pipeline YAML has outgrown what the team can maintain | Dynamic (SDK in Python / TS / Go / Ruby) |

### Don't reach for dynamic pipelines for the wrong job

Dynamic generation is the right tool when the steps themselves need to change. For passing data between steps, simpler primitives exist:

- `buildkite-agent meta-data set/get` — small key-value pairs any later step in the same build can read (a version string, a commit SHA, a feature flag).
- Artifacts — files passed between steps (`buildkite-agent artifact upload/download`).
- Trigger step `env:` — env vars passed to a build in a different pipeline.

If only data needs to move, metadata or artifacts are simpler and safer. See the buildkite-agent-runtime skill.
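
For example, passing a computed version between steps needs only meta-data (the labels and the version command are illustrative):

```yaml
steps:
  - label: ":1234: Compute version"
    command: |
      buildkite-agent meta-data set "release-version" "$(git describe --tags)"

  - wait

  - label: ":package: Package"
    command: |
      # Read the value written by the earlier step in this build.
      VERSION=$(buildkite-agent meta-data get "release-version")
      make package VERSION="$VERSION"
```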

### Bootstrap script

Always start generator scripts with `set -euo pipefail`. Without `pipefail`, a pipe's exit code is that of its last command (`pipeline upload`), so a failing generator earlier in the pipe is masked: the build step reports success and no generated steps appear — the most common dynamic pipeline failure mode.
Example generator that runs tests only for changed services:

```bash
#!/bin/bash
set -euo pipefail
CHANGED=$(git diff --name-only HEAD~1)
cat <<YAML
steps:
YAML
for dir in services/*/; do
  svc=$(basename "$dir")
  if echo "$CHANGED" | grep -q "^services/$svc/"; then
    cat <<YAML
  - label: ":test_tube: $svc"
    command: "cd services/$svc && make test"
    key: "test-$svc"
YAML
  fi
done
```
Set `key:` on every generated step. It enables `depends_on`, makes retries idempotent (`DuplicateKeyError` blocks silent duplication if the upload step re-runs), and gives stable identifiers across builds. Validate locally with `buildkite-agent pipeline upload --dry-run` before pushing.

Keep uploads under 500 steps per call and 4,000 jobs per build (platform defaults, raisable via support). For larger monorepos, use trigger steps to fan out across separate builds.
For type-checked, unit-testable generators, the Buildkite SDK supports JavaScript/TypeScript, Python, Go, and Ruby. Wrap related steps in group steps once a generator produces more than ~10 steps — adding any group enables DAG mode for the build, and `concurrency` attributes are rejected on groups (see `references/group-steps.md`).

A generator step can also read runtime state (meta-data, artifacts, git diff) and upload the next phase of the pipeline — the handler pattern used by multi-stage builds. For this pattern, fan-out/fan-in, and finalizer / always-run steps via `pre-exit` hooks, see `references/dynamic-pipeline-patterns.md`. For failure modes, see `references/dynamic-pipeline-troubleshooting.md`. For advanced generator patterns (Python, monorepo, multi-stage), see `references/advanced-patterns.md`.
## Conditional Execution

### Step-level conditions

Use `if` to conditionally run steps based on build state:

```yaml
steps:
  - label: ":rocket: Deploy"
    command: "scripts/deploy.sh"
    if: build.branch == "main" && build.message !~ /\[skip deploy\]/
```
For the full list of condition expressions, see Conditionals.

`[skip ci]` gotcha: Buildkite only checks the HEAD commit message for `[skip ci]` / `[ci skip]`. If the tag is in an earlier commit of a multi-commit push, the build still triggers.

### Directory-based step filtering (if_changed)

Skip steps when relevant files haven't changed. `if_changed` is only applied by the Buildkite agent when uploading a pipeline. See https://buildkite.com/docs/pipelines/configure/dynamic-pipelines/if-changed.md.

```yaml
steps:
  - label: ":nodejs: Frontend tests"
    command: "npm test"
    if_changed:
      - "src/frontend/**"
      - "package.json"
```
For exclude patterns and monorepo configurations, see `references/advanced-patterns.md`.

For large monorepos, use the Sparse Checkout plugin to check out only `.buildkite/` for the upload step — dramatically faster pipeline uploads.

### Conditionally running plugins

Step-level `if` does not prevent plugins from executing. Wrap steps in a `group` to skip plugins entirely:

```yaml
steps:
  - group: ":docker: Build"
    if: build.env("DOCKER_PASSWORD") != null
    steps:
      - label: "Build image"
        command: "docker build -t myapp ."
        plugins:
          - docker-login#v2.1.0:
              username: myuser
              password-env: DOCKER_PASSWORD
```

## Matrix Builds

Run the same step across multiple configurations:

```yaml
steps:
  - label: "Test {{matrix.ruby}} on {{matrix.os}}"
    command: "bundle exec rake test"
    matrix:
      setup:
        ruby:
          - "3.2"
          - "3.3"
        os:
          - "ubuntu"
          - "alpine"
      adjustments:
        - with:
            ruby: "3.2"
            os: "alpine"
          skip: true  # Known incompatible
```
Valid properties inside each `adjustments` entry: `with`, `skip`, `soft_fail`, `env`. The `agents:` key is not valid inside `adjustments` — Buildkite rejects the pipeline with "agents is not a valid property on the matrix.adjustments configuration". To route matrix combinations to different queues (e.g., Linux vs Windows agents), use separate steps or a dynamic pipeline generator.
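
Per-platform routing with separate steps might look like this sketch (the queue names are illustrative; each step keeps its own single-dimension matrix):

```yaml
steps:
  - label: ":linux: Test {{matrix}}"
    command: "bundle exec rake test"
    agents:
      queue: "linux"      # hypothetical queue name
    matrix:
      - "3.2"
      - "3.3"

  - label: ":windows: Test {{matrix}}"
    command: "bundle exec rake test"
    agents:
      queue: "windows"    # hypothetical queue name
    matrix:
      - "3.2"
      - "3.3"
```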

## Plugins

Add capabilities with 3-line YAML blocks. Pin versions for reproducibility:

```yaml
plugins:
  - docker-compose#v5.5.0:
      run: app
      config: docker-compose.ci.yml
```
| Plugin | Purpose |
| --- | --- |
| `cache#v1.8.1` | Dependency caching with manifest-based invalidation |
| `docker#v5.12.0` | Run steps inside a Docker container |
| `docker-compose#v5.5.0` | Build and run with Docker Compose |
| `artifacts#v1.9.4` | Download artifacts between steps |
| `test-collector#v2.0.0` | Upload test results to Test Engine |
Always pin plugin versions (e.g., `docker#v5.12.0`, not `docker#v5`). Unpinned versions can break builds when plugins release new major versions.

For private organizational plugins, use full Git URLs — the shorthand syntax only works for public plugins:

```yaml
plugins:
  - ssh://git@github.com/my-org/my-plugin.git#v1.0.0:
      config: value
```

## Notifications and Artifacts

Add pipeline-level `notify:` above `steps:` to send Slack, email, or webhook notifications on build state changes. See Notifications for syntax.
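
A sketch of the shape (the channel and address are illustrative, and notification targets generally must be configured in Buildkite first):

```yaml
notify:
  - slack: "#deployments"       # hypothetical channel
    if: build.state == "failed"
  - email: "dev@example.com"    # hypothetical address

steps:
  - command: "make test"
```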

### Artifact upload and download

Upload artifacts from steps, download them in later steps:

```yaml
steps:
  - label: "Build"
    command: "make build"
    artifact_paths: "dist/**/*"

  - wait

  - label: "Package"
    command: |
      buildkite-agent artifact download "dist/*" .
      make package
```
When using artifacts in a Docker build, download them before starting the Docker build, since `buildkite-agent` is not available inside the container:

```yaml
steps:
  - label: "Docker build"
    command: |
      buildkite-agent artifact download "dist/*" .
      docker build -t myapp .
```

## Concurrency

Limit parallel execution of steps sharing a resource. Always pair `concurrency` with `concurrency_group` — without a group name, the limit is silently ignored.

```yaml
steps:
  - label: ":rocket: Deploy"
    command: "scripts/deploy.sh"
    concurrency: 1
    concurrency_group: "deploy/production"
    concurrency_method: "eager"
```
Use `concurrency_method: "eager"` (next available) for independent jobs like deploys. Use the default `"ordered"` (FIFO) when execution order matters. Set `priority` (default `0`, higher = first) to control which queued jobs run next.

For full concurrency configuration options, see Controlling Concurrency. For triggering, watching, and debugging pipelines from the terminal, see the buildkite-cli skill.

## Common Mistakes

| Mistake | What happens | Fix |
| --- | --- | --- |
| Missing `wait` between dependent steps | Steps run in parallel; the second fails because the first hasn't finished | Add `- wait` or use `depends_on:` |
| Using only `wait` steps for all dependencies | Valid but non-idiomatic; `wait` blocks ALL prior steps, making it impossible to run independent steps in parallel | Give named steps a `key:` and use `depends_on: "key"` to express fine-grained dependencies; reserve `wait` for unconditional barriers |
| No `plugins:` in pipeline for package install steps | Dependencies reinstalled from scratch on every build, slowing builds and inflating costs | Add the `cache` plugin (or the built-in `cache:` key for hosted agents) to cache `node_modules/`, `.gradle/`, etc. See the Caching section above |
| Using step-level `if` to skip plugins | Plugins still execute (they run before `if` is evaluated) | Wrap in a `group` with the `if` condition |
| Not pinning plugin versions | Builds break when a plugin releases a breaking change | Always use full semver: `plugin#v1.2.3` |
| Forgetting `concurrency_group` with `concurrency` | `concurrency` is ignored without a group name | Always pair `concurrency` with `concurrency_group` |
| `artifact_paths` glob doesn't match output | Artifacts silently not uploaded; downstream steps fail | Test the glob pattern locally; use `**/*` for nested directories |
| Hardcoding parallel job split logic | Uneven test distribution; one slow job blocks the build | Use `parallelism: N` with timing-based splitting via Test Engine |
| Inline secrets in pipeline YAML | Secrets visible in build logs and the Buildkite UI | Use cluster secrets or agent environment hooks |
| Using `retry.automatic` with `exit_status: "*"` and a high limit | Genuine bugs retry repeatedly, wasting compute | Target specific exit codes; keep the wildcard limit at 1 |
| Using `agents:` inside `matrix.adjustments` | Pipeline upload fails: "agents is not a valid property on the matrix.adjustments configuration" | Remove `agents:` from `adjustments`; use separate steps per platform or a dynamic pipeline generator for per-combination queue routing |
| Build fails but all visible steps passed | A trigger step started a child pipeline that failed, or a step was cancelled rather than unblocked | Check the triggered pipeline's build status; inspect block steps for cancellations |
| Pipeline upload fails with no clear error | YAML syntax error or agent-side issue not shown in build logs | Validate YAML locally; check agent logs on the host machine for detailed upload errors; run `buildkite-agent pipeline upload --debug` |
| Fork builds enabled on public pipelines | Contributors can modify `pipeline.yml` to extract secrets | Disable fork builds in pipeline settings for public repos; use a separate pipeline with no secret access for external PRs |
| Docker Compose steps produce artifacts but the agent can't find them | Files created inside containers are invisible to the host agent | Mount the working directory as a volume in `docker-compose.yml` so container outputs are visible for `artifact_paths:` |
| Dynamic pipeline generates 1000+ steps | UI becomes slow; pipeline processing degrades | Keep generated pipelines under ~500 steps; use orchestrator pipelines with trigger steps for larger monorepos |

## Additional Resources

### Reference Files

- `references/step-types-reference.md` — Detailed attribute tables for all step types
- `references/advanced-patterns.md` — Dynamic pipeline generators, matrix adjustments, monorepo patterns, multi-stage pipelines
- `references/retry-and-error-codes.md` — Comprehensive exit code table, retry strategies by failure type
- `references/group-steps.md` — Group step attributes, DAG mode, merging across uploads, no-nesting workaround, job-limit impact
- `references/dynamic-pipeline-troubleshooting.md` — Silent upload failures, quota limits, env var interpolation, duplicate-on-retry, retry storms
- `references/dynamic-pipeline-patterns.md` — Fan-out/fan-in, SDK generation, the handler pattern, finalizer steps, trigger-based fan-out

### Examples

- `examples/basic-pipeline.yml` — Minimal working pipeline (test, wait, deploy)
- `examples/optimized-pipeline.yml` — Full-featured pipeline with caching, parallelism, annotations, retry, artifacts, and notifications

For migrating pipelines from other CI systems, see the buildkite-migration skill.

## Further Reading