# orchestrating-datacloud: Salesforce Data Cloud Orchestrator
Use this skill when the user needs product-level Data Cloud workflow guidance rather than a single isolated command family: pipeline setup, cross-phase troubleshooting, data spaces, data kits, or deciding whether a task belongs in Connect, Prepare, Harmonize, Segment, Act, or Retrieve.
This skill intentionally follows sf-skills house style while using the external `sf data360` command surface as the runtime. The plugin is not vendored into this repo.
## When This Skill Owns the Task
Use when the work involves:
- multi-phase Data Cloud setup or remediation
- data spaces (`sf data360 data-space *`)
- data kits (`sf data360 data-kit *`)
- health checks (`sf data360 doctor`)
- CRM-to-unified-profile pipeline design
- deciding how to move from ingestion → harmonization → segmentation → activation
- cross-phase troubleshooting where the root cause is not yet clear
Delegate to a phase-specific skill when the user is focused on one area:
| Phase | Use this skill | Typical scope |
|---|---|---|
| Connect | connecting-datacloud | connections, connectors, source discovery |
| Prepare | preparing-datacloud | data streams, DLOs, transforms, DocAI |
| Harmonize | harmonizing-datacloud | DMOs, mappings, identity resolution, data graphs |
| Segment | segmenting-datacloud | segments, calculated insights |
| Act | activating-datacloud | activations, activation targets, data actions |
| Retrieve | retrieving-datacloud | SQL, search indexes, vector search, async query |
Delegate outside the family when the user is:
- extracting Session Tracing / STDM telemetry → observing-agentforce
- writing CRM SOQL only → querying-soql
- loading CRM source data → handling-sf-data
- creating missing CRM schema → generating-custom-object or generating-custom-field
- implementing downstream Apex or Flow logic → generating-apex, generating-flow
## Required Context to Gather First
Ask for or infer:
- target org alias
- whether the plugin is already installed and linked
- whether the user wants design guidance, read-only inspection, or live mutation
- data sources involved: CRM objects, external databases, file ingestion, knowledge, etc.
- desired outcome: unified profiles, segments, activations, vector search, analytics, or troubleshooting
- whether the user is working in the default data space or a custom one
- whether the org has already been classified with `scripts/diagnose-org.mjs`
- which command family is failing today, if any
If plugin availability or org readiness is uncertain, start with:
- references/plugin-setup.md
- references/feature-readiness.md
- scripts/verify-plugin.sh
- scripts/diagnose-org.mjs
- scripts/bootstrap-plugin.sh
## Core Operating Rules
- Use the external `sf data360` plugin runtime; do not reimplement or vendor the command layer.
- Prefer the smallest phase-specific skill once the task is localized.
- Run readiness classification before mutation-heavy work. Prefer `scripts/diagnose-org.mjs` over guessing from one failing command.
- For `sf data360` commands, suppress linked-plugin warning noise with `2>/dev/null` unless the stderr output is needed for debugging.
- Distinguish Data Cloud SQL from CRM SOQL.
- Do not treat `sf data360 doctor` as a full-product readiness check; the current upstream command only checks the search-index surface.
- Do not treat `query describe` as a universal tenant probe; only use it with a known DMO/DLO table after broader readiness is confirmed.
- Preserve Data Cloud-specific API-version workarounds when they matter.
- Prefer generic, reusable JSON definition files over org-specific workshop payloads.
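The stderr-suppression rule is easy to capture in a small wrapper so every invocation stays quiet by default. A minimal sketch; the `run_dc` name, the `DC_ORG` variable, and the `myorg` default alias are placeholders, not part of the plugin:

```shell
# Minimal sketch: run any sf data360 subcommand with linked-plugin
# warning noise suppressed. Drop the redirect when you need stderr
# for debugging. "run_dc" and the default alias are placeholders.
run_dc() {
  local alias="${DC_ORG:-myorg}"
  sf data360 "$@" -o "$alias" 2>/dev/null
}

# Usage (kept as a comment so the sketch stays side-effect free):
# run_dc dmo list
```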
## Recommended Workflow
### 1. Verify the runtime and auth
Confirm:

- `sf` is installed
- the community Data Cloud plugin is linked
- the target org is authenticated

Recommended checks:

```bash
sf data360 man
sf org display -o <alias>
bash ~/.claude/skills/orchestrating-datacloud/scripts/verify-plugin.sh <alias>
```

Treat `sf data360 doctor` as a broad health signal, not the sole gate. On partially provisioned orgs it can fail even when read-only command families like connectors, DMOs, or segments still work.
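A fail-fast sketch of these checks. It assumes that `sf data360 man` exiting non-zero is a reliable "plugin not linked" signal, which may not hold on every plugin version:

```shell
# Minimal sketch: verify the runtime before touching Data Cloud.
# Returns non-zero with a reason instead of exiting, so it is safe
# to source into a larger script.
check_dc_runtime() {
  command -v sf >/dev/null 2>&1 \
    || { echo "sf CLI not installed" >&2; return 1; }
  sf data360 man >/dev/null 2>&1 \
    || { echo "data360 plugin not linked" >&2; return 1; }
  echo "runtime ok"
}

# Usage: check_dc_runtime && sf org display -o <alias>
```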
### 2. Classify readiness before changing anything
Run the shared classifier first:

```bash
node ~/.claude/skills/orchestrating-datacloud/scripts/diagnose-org.mjs -o <org> --json
```

Only use a query-plane probe after you know the table name is real:

```bash
node ~/.claude/skills/orchestrating-datacloud/scripts/diagnose-org.mjs -o <org> --phase retrieve --describe-table MyDMO__dlm --json
```

Use the classifier to distinguish:

- empty-but-enabled modules
- feature-gated modules
- query-plane issues
- runtime/auth failures
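The classifier's verdict can drive a simple gate. This sketch uses the readiness vocabulary from the Output Format section (`ready` / `ready_empty` / `partial` / `feature_gated` / `blocked`); how you extract the state from the `--json` output is left to you, since the output shape is not shown here:

```shell
# Minimal sketch: map a readiness classification to a next action.
# The state names come from this skill's Output Format section.
route_readiness() {
  case "$1" in
    ready)         echo "proceed with mutations" ;;
    ready_empty)   echo "proceed: modules enabled but unpopulated" ;;
    partial)       echo "localize the failing phase first" ;;
    feature_gated) echo "stop: feature not provisioned in this org" ;;
    blocked)       echo "fix runtime/auth before anything else" ;;
    *)             echo "unknown state: $1" ;;
  esac
}
```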
### 3. Discover existing state with read-only commands
Use targeted inspection after classification:

```bash
sf data360 doctor -o <org> 2>/dev/null
sf data360 data-space list -o <org> 2>/dev/null
sf data360 data-stream list -o <org> 2>/dev/null
sf data360 dmo list -o <org> 2>/dev/null
sf data360 identity-resolution list -o <org> 2>/dev/null
sf data360 segment list -o <org> 2>/dev/null
sf data360 activation platforms -o <org> 2>/dev/null
```

### 4. Localize the phase
Route the task:
- source/connector issue → Connect
- ingestion/DLO/stream issue → Prepare
- mapping/IR/unified profile issue → Harmonize
- audience or insight issue → Segment
- downstream push issue → Act
- SQL/search/index issue → Retrieve
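The routing rules above can be expressed as a lookup. A sketch only; the symptom keywords are illustrative shorthand, not an exhaustive taxonomy:

```shell
# Minimal sketch: symptom keyword -> phase skill, mirroring the routing
# list above. Keywords are illustrative, not exhaustive.
route_phase() {
  case "$1" in
    source|connector)     echo connecting-datacloud ;;
    ingestion|dlo|stream) echo preparing-datacloud ;;
    mapping|ir|profile)   echo harmonizing-datacloud ;;
    audience|insight)     echo segmenting-datacloud ;;
    push|activation)      echo activating-datacloud ;;
    sql|search|index)     echo retrieving-datacloud ;;
    *)                    echo orchestrating-datacloud ;;
  esac
}
```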
### 5. Choose deterministic artifacts when possible
Prefer JSON definition files and repeatable scripts over one-off manual steps. Generic templates live in:
- assets/definitions/data-stream.template.json
- assets/definitions/dmo.template.json
- assets/definitions/mapping.template.json
- assets/definitions/relationship.template.json
- assets/definitions/identity-resolution.template.json
- assets/definitions/data-graph.template.json
- assets/definitions/calculated-insight.template.json
- assets/definitions/segment.template.json
- assets/definitions/activation-target.template.json
- assets/definitions/activation.template.json
- assets/definitions/data-action-target.template.json
- assets/definitions/data-action.template.json
- assets/definitions/search-index.template.json
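One way to keep these templates generic is to instantiate them at use time. This sketch assumes the templates carry `{{...}}` placeholder tokens, which is a convention chosen here purely for illustration; inspect the actual template files before relying on it:

```shell
# Minimal sketch: render a generic template into an org-specific
# definition. The "{{DATA_SPACE}}" placeholder is hypothetical.
render_template() {
  local template="$1" space="$2"
  sed "s/{{DATA_SPACE}}/${space}/g" "$template"
}

# Usage:
# render_template assets/definitions/segment.template.json default > segment.json
```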
### 6. Verify after each phase
Typical verification:
- stream/DLO exists
- DMO/mapping exists
- identity resolution run completed
- unified records or segment counts look correct
- activation/search index status is healthy
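These checks map directly onto the read-only list commands from step 3. A hedged sketch; the phase-to-noun mapping is a convenience and the alias handling is a placeholder:

```shell
# Minimal sketch: run the read-only list command for a given noun as a
# post-phase smoke check, with warning noise suppressed.
verify_phase() {
  local noun="$1" alias="$2"
  sf data360 "$noun" list -o "$alias" 2>/dev/null
}

# Usage after Prepare / Harmonize / Segment, respectively:
# verify_phase data-stream <alias>
# verify_phase dmo <alias>
# verify_phase segment <alias>
```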
## High-Signal Gotchas
- `connection list` requires `--connector-type`.
- `dmo list --all` is useful when you need the full catalog, but first-page `dmo list` is often enough for readiness checks and much faster.
- Segment creation may need `--api-version 64.0`.
- `segment members` returns opaque IDs; use SQL joins for human-readable details.
- `sf data360 doctor` can fail on partially provisioned orgs even when some read-only commands still work; fall back to targeted smoke checks.
- `query describe` errors such as `Couldn't find CDP tenant ID` or `DataModelEntity ... not found` are query-plane clues, not automatic proof that the whole product is disabled.
- Many long-running jobs are asynchronous in practice even when the command returns quickly.
- Some Data Cloud operations still require UI setup outside the CLI runtime.
## Output Format
When finishing, report in this order:
- Task classification
- Runtime status
- Readiness classification
- Phase(s) involved
- Commands or artifacts used
- Verification result
- Next recommended step
Suggested shape:
```text
Data Cloud task: <setup / inspect / troubleshoot / migrate>
Runtime: <plugin ready / missing / partially verified>
Readiness: <ready / ready_empty / partial / feature_gated / blocked>
Phases: <connect / prepare / harmonize / segment / act / retrieve>
Artifacts: <json files, commands, scripts>
Verification: <passed / partial / blocked>
Next step: <next phase, setup guidance, or cross-skill handoff>
```
## Cross-Skill Integration
| Need | Delegate to | Reason |
|---|---|---|
| load or clean CRM source data | handling-sf-data | seed or fix source records before ingestion |
| create missing CRM schema | generating-custom-object, generating-custom-field | Data Cloud expects existing objects/fields |
| deploy permissions or bundles | deploying-metadata | environment preparation |
| write Apex against Data Cloud outputs | generating-apex | code implementation |
| Flow automation after segmentation/activation | generating-flow | declarative orchestration |
| session tracing / STDM / parquet analysis | observing-agentforce | different Data Cloud use case |
## Reference Map
### Start here
- README.md
- references/plugin-setup.md
- references/feature-readiness.md
- UPSTREAM.md
### Phase skills
- connecting-datacloud
- preparing-datacloud
- harmonizing-datacloud
- segmenting-datacloud
- activating-datacloud
- retrieving-datacloud
### Deterministic helpers
- scripts/bootstrap-plugin.sh
- scripts/verify-plugin.sh
- scripts/diagnose-org.mjs
- assets/definitions/