connecting-datacloud: Data Cloud Connect Phase
Use this skill when the user needs source connection work: connector discovery, connection metadata, connection testing, source-object browsing, connector schema inspection, or connector-specific setup payloads for external sources.
When This Skill Owns the Task
Use when the work involves:
- `sf data360 connection *` commands
- connector catalog inspection
- connection creation, update, test, or delete
- browsing source objects, fields, databases, or schemas
- identifying connector types already in use
- preparing connector definitions for Snowflake, SharePoint Unstructured, or Ingestion API sources
Delegate elsewhere when the user is:
- creating data streams or DLOs → preparing-datacloud
- creating DMOs, mappings, IR rulesets, or data graphs → harmonizing-datacloud
- writing Data Cloud SQL or search-index workflows → retrieving-datacloud
Required Context to Gather First
Ask for or infer:
- target org alias
- connector type or source system
- whether the user wants inspection only or live mutation
- connection name or ID if one already exists
- whether credentials are already configured outside the CLI
- whether the user also expects stream creation right after connection setup
- whether the source is a database, an unstructured document source, or an Ingestion API feed
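As a minimal sketch, the checklist above can be captured as shell variables before any command runs. Every value below is an illustrative placeholder, not a real org or connector:

```shell
# Illustrative placeholders only -- substitute real values for your org.
ORG="myorg"                   # target org alias
CONNECTOR_TYPE="SNOWFLAKE"    # connector type or source system
MUTATE="no"                   # "no" = inspection only, "yes" = live mutation
CONNECTION_NAME=""            # existing connection name or ID, if any
echo "org=$ORG type=$CONNECTOR_TYPE mutate=$MUTATE name=${CONNECTION_NAME:-<new>}"
```

Pinning these down first keeps the later discovery and mutation steps unambiguous.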
Core Operating Rules
- Verify the plugin runtime first; see ../orchestrating-datacloud/references/plugin-setup.md.
- Run the shared readiness classifier before mutating connections: `node ~/.claude/skills/orchestrating-datacloud/scripts/diagnose-org.mjs -o <org> --phase connect --json`.
- Prefer read-only discovery before connection creation.
- Suppress linked-plugin warning noise with `2>/dev/null` for standard usage.
- Remember that `connection list` requires `--connector-type`.
- For `connection test`, pass `--connector-type` when resolving a non-Salesforce connection by name.
- Discover existing connector types from streams first when the org is unfamiliar.
- Use curated example payloads before inventing connector-specific credentials or parameters.
- For connector types outside the curated examples, inspect a known-good UI-created connection via REST before building JSON.
- Do not promise API-based stream creation for every connector type just because connection creation succeeds.
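The readiness-gate rule can be sketched as below. The `ready` field name is an assumption (this section does not document the classifier's JSON shape), and the stubbed string stands in for real `diagnose-org.mjs` output; inspect actual output before relying on any field name:

```shell
# Stand-in for `node .../diagnose-org.mjs -o <org> --phase connect --json`;
# the "ready" field is an assumed name, not a documented contract.
READINESS='{"phase":"connect","ready":true}'
if echo "$READINESS" | grep -q '"ready":true'; then
  echo "proceed: safe to run connection create/update"
else
  echo "blocked: resolve readiness findings before mutating"
fi
```

The point is the gate itself: no `connection create`, `update`, or `delete` until the classifier reports the connect phase as ready.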
Recommended Workflow
1. Classify readiness for connect work
```bash
node ~/.claude/skills/orchestrating-datacloud/scripts/diagnose-org.mjs -o <org> --phase connect --json
```

2. Discover connector types
```bash
sf data360 connection connector-list -o <org> 2>/dev/null
sf data360 data-stream list -o <org> 2>/dev/null
```

3. Inspect connections by type
```bash
sf data360 connection list -o <org> --connector-type SalesforceDotCom 2>/dev/null
sf data360 connection list -o <org> --connector-type REDSHIFT 2>/dev/null
sf data360 connection list -o <org> --connector-type SNOWFLAKE 2>/dev/null
```

4. Inspect a specific connection or uploaded schema
```bash
sf data360 connection get -o <org> --name <connection> 2>/dev/null
sf data360 connection objects -o <org> --name <connection> 2>/dev/null
sf data360 connection fields -o <org> --name <connection> 2>/dev/null
sf data360 connection schema-get -o <org> --name <connection-id> 2>/dev/null
```

5. Test or create only after discovery
```bash
sf data360 connection test -o <org> --name <connection> --connector-type <type> 2>/dev/null
sf data360 connection create -o <org> -f connection.json 2>/dev/null
```

6. Start from curated example payloads for external connectors
Use the phase-owned examples before inventing a payload from scratch:
- examples/connections/heroku-postgres.json
- examples/connections/redshift.json
- examples/connections/sharepoint-unstructured.json
- examples/connections/snowflake-connection.json
- examples/connections/ingest-api-connection.json
- examples/connections/ingest-api-schema.json
Typical Ingestion API setup flow:
```bash
sf data360 connection create -o <org> -f examples/connections/ingest-api-connection.json 2>/dev/null
sf data360 connection schema-upsert -o <org> --name <connector-id> -f examples/connections/ingest-api-schema.json 2>/dev/null
sf data360 connection schema-get -o <org> --name <connector-id> 2>/dev/null
```

7. Discover payload fields for unknown connector types
Create one in the UI, then inspect it directly:
```bash
sf api request rest "/services/data/v66.0/ssot/connections/<id>" -o <org>
```

High-Signal Gotchas
- `connection list` has no true global "list all" mode; query by connector type.
- The connector catalog name and connection connector type are not always the same label.
- `connection test` may need `--connector-type` for name resolution when the source is not a default Salesforce connector.
- An empty connection list usually means "enabled but not configured yet", not "feature disabled".
- Heroku Postgres, Redshift, Snowflake, SharePoint Unstructured, and Ingestion API all use different credential and parameter shapes; reuse the curated examples instead of guessing.
- SharePoint Unstructured uses `clientId`, `clientSecret`, and `tokenEndpoint` in the `credentials` array and does not require a `parameters` array.
- Snowflake uses key-pair auth and can often be created through the API, but downstream stream creation can still remain UI-only.
- Ingestion API connector setup is incomplete until `connection schema-upsert` has uploaded the object schema.
- Some external connector credential setup still depends on UI-side configuration or external-system permissions.
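Because `connection list` requires a connector type, a loop over the types of interest can approximate a global view. This is a sketch: `myorg` is a placeholder alias, and the type list should be trimmed to the connectors your org actually uses:

```shell
# Approximate a "list all" by iterating over connector types (placeholders).
ORG="myorg"
for t in SalesforceDotCom REDSHIFT SNOWFLAKE SPUnstructuredDocument IngestApi; do
  echo "== $t =="
  sf data360 connection list -o "$ORG" --connector-type "$t" 2>/dev/null || true
done
```

Seeding the type list from `sf data360 data-stream list` output keeps the loop short in unfamiliar orgs.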
Output Format
```text
Connect task: <inspect / create / test / update>
Connector type: <SalesforceDotCom / REDSHIFT / SNOWFLAKE / SPUnstructuredDocument / IngestApi / ...>
Target org: <alias>
Commands: <key commands run>
Verification: <passed / partial / blocked>
Next step: <prepare phase or connector follow-up>
```
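For example, a completed report for a hypothetical read-only Snowflake inspection (all values illustrative) might read:

```text
Connect task: inspect
Connector type: SNOWFLAKE
Target org: myorg
Commands: sf data360 connection list, sf data360 connection get
Verification: passed
Next step: hand off to preparing-datacloud for stream creation
```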
References
- README.md
- examples/connections/heroku-postgres.json
- examples/connections/redshift.json
- examples/connections/sharepoint-unstructured.json
- examples/connections/snowflake-connection.json
- examples/connections/ingest-api-connection.json
- examples/connections/ingest-api-schema.json
- ../orchestrating-datacloud/references/plugin-setup.md
- ../orchestrating-datacloud/references/feature-readiness.md
- ../orchestrating-datacloud/UPSTREAM.md