# Migrate data into Dovetail
Use the CLI to import content from an existing tool into your Dovetail workspace. Always preview before importing and wait for user approval before running the actual migration.
## Prerequisites

- `dt` is installed — if not, run `npm install -g @heydovetail/dt`
- Workspace is configured — if not, run `dt init`
- `DOVETAIL_API_KEY` is set (from workspace Settings > API)
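If you want to confirm these from a script before starting, here is a minimal sketch (not part of the `dt` CLI) that checks exactly the three items above; the `config_path` default assumes `dt init` wrote `dt.json` to the current directory:

```python
import os
import shutil

def missing_prereqs(config_path="dt.json"):
    """Return which of the three prerequisites are not yet satisfied."""
    missing = []
    if shutil.which("dt") is None:
        # the CLI binary is not on PATH
        missing.append("dt CLI (npm install -g @heydovetail/dt)")
    if not os.path.exists(config_path):
        # no workspace config file
        missing.append("workspace config (dt init)")
    if not os.environ.get("DOVETAIL_API_KEY"):
        missing.append("DOVETAIL_API_KEY environment variable")
    return missing

print(missing_prereqs())  # [] when everything is ready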
## Standard workflow

```
install → init → source add → validate → preview → [user approves] → migrate run
```
1. **Connect a source**

   ```bash
   dt source add <source>
   ```

2. **Test connectivity**

   ```bash
   dt source validate <source>
   ```

3. **Preview what will be imported**

   ```bash
   dt migrate run --source <source> --preview
   ```

4. **[Wait for user approval]**

5. **Run the import**

   ```bash
   dt migrate run --source <source>
   ```

**Always show the preview output to the user and ask for explicit approval before step 5.**
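The sequence above can also be driven from a script wrapped around the CLI; this sketch (not a dt feature) just builds the commands in workflow order, with the approval pause marked as a comment:

```python
def migration_commands(source):
    """dt commands for one source, in the standard workflow order."""
    return [
        f"dt source add {source}",
        f"dt source validate {source}",
        f"dt migrate run --source {source} --preview",
        # a human reviews the preview output here, before the real import
        f"dt migrate run --source {source}",
    ]

print(migration_commands("notion"))
```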
---
## `dt.json` config structure

When writing `dt.json` manually, migrations follow this structure:

```json
{
  "dovetail": {
    "api_key": "your-api-key",
    "base_url": "https://mycompany.dovetail.com"
  },
  "sources": {
    "<source>": {
      "<option-key>": "<value>"
    }
  },
  "migrations": [
    {
      "source": "<source>",
      "content": ["<url-id-or-type>"],
      "destination": {
        "type": "data",
        "parent": {
          "type": "project",
          "name": "Destination Project Name"
        }
      },
      "options": {}
    }
  ]
}
```

- `content` — array of URLs, IDs, space keys, or type names passed to the source
- `destination.type` — `"data"` (recordings, transcripts, files) or `"doc"` (reports, notes)
- `destination.parent.type` — `"project"` or `"folder"`
- `options` — source-specific extras; rarely needed (see Marvin and Local below)
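A quick way to catch typos before a run is to check each `migrations[]` entry against the field rules above. A minimal sketch, not part of dt; it assumes `destination.parent.name` is required, as it appears in every example in this guide:

```python
def check_migration(entry):
    """Return a list of problems with one migrations[] entry (empty list = OK)."""
    problems = []
    if not entry.get("content"):
        problems.append("content must be a non-empty array")
    dest = entry.get("destination", {})
    if dest.get("type") not in ("data", "doc"):
        problems.append('destination.type must be "data" or "doc"')
    parent = dest.get("parent", {})
    if parent.get("type") not in ("project", "folder"):
        problems.append('destination.parent.type must be "project" or "folder"')
    if not parent.get("name"):
        problems.append("destination.parent.name is required")
    return problems

entry = {
    "source": "notion",
    "content": ["notion-database-id-here"],
    "destination": {"type": "data", "parent": {"type": "project", "name": "Customer Research"}},
}
print(check_migration(entry))  # []
```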
## Sources

### Notion
**What migrates:** Pages and database entries, formatting, attachments, metadata, child pages (up to 3 levels deep).

**Prerequisites:** Create a Notion integration at https://www.notion.so/my-integrations and share your databases with it.
```bash
export NOTION_TOKEN="ntn_xxxxxxxxxxxx"
dt source add notion
dt source validate notion
dt migrate run --source notion --preview

# wait for approval
dt migrate run --source notion
```
**Manual `dt.json`:**

```json
{
  "sources": {
    "notion": {
      "token": "ntn_xxxxxxxxxxxx"
    }
  },
  "migrations": [
    {
      "source": "notion",
      "content": ["notion-database-id-here"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Customer Research" }
      }
    }
  ]
}
```
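The `content` entry takes the database ID. If you only have the database's URL, the ID is the 32-character hex string at the end; a small convenience sketch, assuming the usual Notion URL shape (this helper is hypothetical, not part of dt):

```python
import re

def notion_database_id(url):
    """Extract the 32-char hex database ID from a Notion URL, if present."""
    # Notion sometimes hyphenates the ID, so normalize before matching.
    m = re.search(r"[0-9a-f]{32}", url.replace("-", ""))
    return m.group(0) if m else None

print(notion_database_id(
    "https://www.notion.so/myteam/Customer-Research-0123456789abcdef0123456789abcdef"
))  # 0123456789abcdef0123456789abcdef
```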
### Confluence
**What migrates:** Cloud pages and blog posts, attachments, labels, author info, creation/modification dates.

**Prerequisites:** Create a Confluence API token at https://id.atlassian.com/manage-profile/security/api-tokens.
```bash
export CONFLUENCE_API_TOKEN="your-api-token"
dt source add confluence
dt source validate confluence
dt migrate run --source confluence --preview

# wait for approval
dt migrate run --source confluence
```
**Manual `dt.json`:**

```json
{
  "sources": {
    "confluence": {
      "base_url": "https://mycompany.atlassian.net/wiki",
      "email": "you@company.com",
      "token": ""
    }
  },
  "migrations": [
    {
      "source": "confluence",
      "content": ["PROD"],
      "destination": {
        "type": "doc",
        "parent": { "type": "project", "name": "Product Knowledge Base" }
      }
    }
  ]
}
```

The `content` value is the Confluence space key (e.g. `"PROD"`).

### Google Drive
**What migrates:** Google Docs (as HTML), optionally Sheets (as CSV), PDFs, images, and other files. Folder structure is preserved.

**Prerequisites (choose one):**

- OAuth: Create a Google Cloud project, enable the Drive API, create an OAuth Desktop client ID. The wizard opens a browser sign-in.
- Service account: Create a service account, download the JSON key, share your Drive folder with the service account email.
```bash
# Service account only:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key.json"

dt source add gdrive
dt source validate gdrive
dt migrate run --source gdrive --preview

# wait for approval
dt migrate run --source gdrive
```
**Manual `dt.json`:**

```json
{
  "sources": {
    "gdrive": {
      "credentials_file": "/path/to/key.json",
      "recursive": "true",
      "include_binary": "true"
    }
  },
  "migrations": [
    {
      "source": "gdrive",
      "content": ["1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "UX Research Archive" }
      }
    }
  ]
}
```

The `content` value is the Google Drive folder ID.

### Airtable
**What migrates:** Records with all field values, attachments, linked records, and metadata.

**Prerequisites:** Create a personal access token at https://airtable.com/create/tokens with `data.records:read` and `schema.bases:read` scopes.
```bash
export AIRTABLE_TOKEN="your-token"
dt source add airtable
dt source validate airtable
dt migrate run --source airtable --preview

# wait for approval
dt migrate run --source airtable
```
**Manual `dt.json`:**

```json
{
  "sources": {
    "airtable": {
      "token": ""
    }
  },
  "migrations": [
    {
      "source": "airtable",
      "content": ["https://airtable.com/appXXXXXXXXXXXXXX/tblXXXXXXXXXXXXXX"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "User Interview Tracker" }
      }
    }
  ]
}
```

The `content` value is the full URL to the Airtable table.

### Marvin
**What migrates:** Video/audio recordings (Dovetail auto-transcribes) and insights as docs.

Marvin has no public export API. You must first export your data locally:

**Step 1: Export from Marvin**

Use the `/marvin-download` skill to automate this with browser automation, or manually:

- Download recordings from each project: All Actions > Download Video
- Copy insight content into markdown files

Expected export structure:

```
marvin-export/
  Project Name/
    files/
      Interview 1.mp4
    insights/
      My Report/
        content.md
        metadata.json
```

**Step 2: Import with dt**
```bash
dt source add marvin
dt migrate run --source marvin --preview

# wait for approval
dt migrate run --source marvin
```
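Before importing, the export directory can be checked against the layout shown in step 1; a minimal sketch (not part of dt), assuming the `files/`, `insights/`, and `content.md` names above:

```python
import tempfile
from pathlib import Path

def check_marvin_export(project_dir):
    """Report deviations from the expected marvin-export project layout."""
    root = Path(project_dir)
    problems = []
    if not (root / "files").is_dir():
        problems.append("missing files/ directory")
    insights = root / "insights"
    if not insights.is_dir():
        problems.append("missing insights/ directory")
    else:
        for report in insights.iterdir():
            # each report folder should carry its markdown body
            if report.is_dir() and not (report / "content.md").is_file():
                problems.append(f"{report.name}: missing content.md")
    return problems

# Demo against a throwaway directory that matches the expected layout.
with tempfile.TemporaryDirectory() as tmp:
    proj = Path(tmp) / "Project Name"
    (proj / "files").mkdir(parents=True)
    (proj / "insights" / "My Report").mkdir(parents=True)
    (proj / "insights" / "My Report" / "content.md").write_text("# My Report")
    print(check_marvin_export(proj))  # []
```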
**Manual `dt.json`:**

```json
{
  "migrations": [
    {
      "source": "marvin",
      "content": ["./marvin-export/Customer Interviews"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Customer Interviews" }
      },
      "options": { "types": "files" }
    },
    {
      "source": "marvin",
      "content": ["./marvin-export/Customer Interviews"],
      "destination": {
        "type": "doc",
        "parent": { "type": "project", "name": "Customer Interviews" }
      },
      "options": { "types": "insights" }
    }
  ]
}
```

The `content` value is the path to the exported project directory. `options.types` is `"files"` or `"insights"`.

### Condens
**What migrates:** Sessions (notes, transcripts, video) as data; artifacts (reports) as docs.

**Step 1: Export from Condens**

In Condens: Project > Settings > Export data — download the `.zip` file.

**Step 2: Import with dt**
```bash
dt source add condens
dt migrate run --source condens --preview

# wait for approval
dt migrate run --source condens
```
**Manual `dt.json`:**

```json
{
  "migrations": [
    {
      "source": "condens",
      "content": ["/path/to/Onboarding Research.abc123.zip"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Onboarding Research" }
      }
    }
  ]
}
```

The `content` value is the path to the Condens export zip file.

### EnjoyHQ
**What migrates:** Stories, projects, and documents with labels, state, customer info, and tags.

**Prerequisites:** Get your API token from EnjoyHQ workspace settings.
```bash
export ENJOYHQ_API_TOKEN="your-token"
dt source add enjoyhq
dt source validate enjoyhq
dt migrate run --source enjoyhq --preview

# wait for approval
dt migrate run --source enjoyhq
```

---

### Productboard
**What migrates:** Notes (customer feedback) and features (with descriptions and metadata).

**Prerequisites:** Go to Settings → Integrations → Public API → Generate Access Token.
```bash
export PRODUCTBOARD_API_TOKEN="your-token"
dt source add productboard
dt source validate productboard
dt migrate run --source productboard --preview

# wait for approval
dt migrate run --source productboard
```
**Manual `dt.json`:**

```json
{
  "sources": {
    "productboard": {
      "token": ""
    }
  },
  "migrations": [
    {
      "source": "productboard",
      "content": ["notes"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Productboard Notes" }
      }
    },
    {
      "source": "productboard",
      "content": ["features"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Productboard Features" }
      }
    }
  ]
}
```

The `content` value is the entity type: `"notes"` or `"features"`. Both import as data.

### Local files
**What migrates:** `.md`, `.txt`, and `.html` files become the record body. Everything else (PDFs, videos, images, Word docs) is uploaded as an attachment.

```bash
dt source add local
dt migrate run --source local --preview

# wait for approval
dt migrate run --source local
```
**Manual `dt.json`:**

```json
{
  "migrations": [
    {
      "source": "local",
      "content": ["/path/to/your/folder"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Research Files" }
      },
      "options": { "extensions": ".md,.pdf,.mp4" }
    }
  ]
}
```

The `content` value is the path to the local folder. `options.extensions` is optional; omit to include all files.
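The `extensions` option acts as a simple suffix filter on the folder's contents. The selection logic is roughly this (an illustration of the option's behavior, not the CLI's actual implementation):

```python
import tempfile
from pathlib import Path

def select_files(folder, extensions=None):
    """Files under folder, optionally restricted to a comma-separated suffix list."""
    allowed = None if extensions is None else {e.strip() for e in extensions.split(",")}
    return sorted(
        p for p in Path(folder).rglob("*")
        if p.is_file() and (allowed is None or p.suffix in allowed)
    )

# Demo against a throwaway folder.
with tempfile.TemporaryDirectory() as tmp:
    for name in ("notes.md", "session.mp4", "debug.log"):
        (Path(tmp) / name).write_text("x")
    print([p.name for p in select_files(tmp, ".md,.mp4")])  # ['notes.md', 'session.mp4']
```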
## Useful flags
| Flag | Description |
|---|---|
| `--preview` | Show what would be migrated without fetching |
| | Fetch and transform but skip upload |
| | Re-run without creating duplicates |
| | Process at most N records per migration |
| | Filter by title |
| | Filter by creation date |
| | Filter by update date |
| | Detailed per-record progress logs |
| `--source <source>` | Run only migrations for source X |
## Destination types

When using the `--project` and `--type` flags on the CLI, use the legacy type strings:

| `--type` | `destination.type` in `dt.json` | `destination.parent.type` | Description |
|---|---|---|---|
| | `"data"` | `"project"` | Data (recordings, transcripts, files) inside a project |
| | `"doc"` | `"project"` | Docs inside a project |
| | `"doc"` | `"folder"` | Docs directly in a folder |
## Running all migrations at once

If multiple sources are configured in `dt.json`:

```bash
dt migrate run                   # run everything
dt migrate run --source notion   # run one source only
```
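With several sources configured, it can help to see what a full run will cover before starting. This sketch (not a dt command) lists one per-source invocation from a parsed `dt.json`:

```python
import json

def per_source_commands(config):
    """One `dt migrate run --source X` line per source configured in dt.json."""
    return [f"dt migrate run --source {name}" for name in config.get("sources", {})]

# Parsed from a dt.json with two sources configured.
config = json.loads('{"sources": {"notion": {}, "confluence": {}}}')
print(per_source_commands(config))
# ['dt migrate run --source notion', 'dt migrate run --source confluence']
```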
## Troubleshooting
| Problem | Fix |
|---|---|
| "no records fetched" | Check source access: Notion database shared with integration, Drive folder shared with service account, correct Confluence space key, valid Airtable token scopes |
| "403 Forbidden" | Dovetail API key may lack permissions, or source account doesn't have read access |
| "401 Unauthorized" | API key or token is invalid or expired |
| Rate limited | Handled automatically — the CLI retries with exponential backoff |
| Migration interrupted | Run |
| Migration seems stuck | Run |

Run `dt doctor` for a full environment and connectivity diagnostic.