migrate-to-dovetail


Migrate data into Dovetail


Use the `dt` CLI to import content from an existing tool into your Dovetail workspace. Always preview before importing and wait for user approval before running the actual migration.

Prerequisites


  1. `dt` is installed — if not, run `npm install -g @heydovetail/dt`
  2. Workspace is configured — if not, run `dt init`
  3. `DOVETAIL_API_KEY` is set (from workspace Settings > API)
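The prerequisites above can be checked in one pass. The sketch below is unofficial and assumes `dt init` writes a `dt.json` to the working directory; it only reports what is missing and uses no `dt` subcommands.

```shell
# Preflight sketch mirroring the three prerequisites above.
# Prints one line per missing item; no output means you are ready.
preflight() {
  command -v dt >/dev/null 2>&1 \
    || echo "missing: dt (npm install -g @heydovetail/dt)"
  [ -f dt.json ] \
    || echo "missing: dt.json (run 'dt init')"
  [ -n "$DOVETAIL_API_KEY" ] \
    || echo "missing: DOVETAIL_API_KEY (workspace Settings > API)"
}
preflight
```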

Standard workflow


```
install → init → source add → validate → preview → [user approves] → migrate run
```

1. Connect a source


```bash
dt source add <source>
```

2. Test connectivity


```bash
dt source validate <source>
```

3. Preview what will be imported


```bash
dt migrate run --source <source> --preview
```

4. [Wait for user approval]


5. Run the import


```bash
dt migrate run --source <source>
```

**Always show the preview output to the user and ask for explicit approval before step 5.**

---

dt.json config structure


When writing `dt.json` manually, migrations follow this structure:

```json
{
  "dovetail": {
    "api_key": "your-api-key",
    "base_url": "https://mycompany.dovetail.com"
  },
  "sources": {
    "<source>": {
      "<option-key>": "<value>"
    }
  },
  "migrations": [
    {
      "source": "<source>",
      "content": ["<url-id-or-type>"],
      "destination": {
        "type": "data",
        "parent": {
          "type": "project",
          "name": "Destination Project Name"
        }
      },
      "options": {}
    }
  ]
}
```
  • `content` — array of URLs, IDs, space keys, or type names passed to the source
  • `destination.type` — `"data"` (recordings, transcripts, files) or `"doc"` (reports, notes)
  • `destination.parent.type` — `"project"` or `"folder"`
  • `options` — source-specific extras; rarely needed (see Marvin and Local files below)

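Before handing a hand-written `dt.json` to `dt`, it can be sanity-checked for the top-level keys described above. This is a rough, unofficial sketch using `grep` (a real JSON parser such as `jq` is more reliable, and some source-specific examples below legitimately omit `dovetail` and `sources`):

```shell
# Rough check that a hand-written dt.json mentions the expected top-level keys.
check_dtjson() {
  for key in '"dovetail"' '"sources"' '"migrations"'; do
    grep -q "$key" "$1" || echo "dt.json may be missing top-level key: $key"
  done
}
```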

Sources


Notion

**What migrates:** Pages and database entries, formatting, attachments, metadata, child pages (up to 3 levels deep).
**Prerequisites:** Create a Notion integration at https://www.notion.so/my-integrations and share your databases with it.
```bash
export NOTION_TOKEN="ntn_xxxxxxxxxxxx"
dt source add notion
dt source validate notion
dt migrate run --source notion --preview
# wait for approval
dt migrate run --source notion
```

**Manual `dt.json`:**
```json
{
  "sources": {
    "notion": {
      "token": "ntn_xxxxxxxxxxxx"
    }
  },
  "migrations": [
    {
      "source": "notion",
      "content": ["notion-database-id-here"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Customer Research" }
      }
    }
  ]
}
```

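The `content` entry for Notion is a database ID. If you only have a URL, the ID is usually the 32-character hex string embedded in it; a small unofficial helper, assuming that URL shape (and ignoring the dashed UUID form):

```shell
# Extract the first 32-char hex ID from a Notion URL (sketch).
notion_db_id() {
  echo "$1" | grep -oE '[0-9a-f]{32}' | head -n 1
}
```

For example, `notion_db_id "https://www.notion.so/acme/Research-0123456789abcdef0123456789abcdef"` prints the bare ID to paste into `content`.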

Confluence

**What migrates:** Cloud pages and blog posts, attachments, labels, author info, creation/modification dates.
**Prerequisites:** Create a Confluence API token at https://id.atlassian.com/manage-profile/security/api-tokens.
```bash
export CONFLUENCE_API_TOKEN="your-api-token"
dt source add confluence
dt source validate confluence
dt migrate run --source confluence --preview
# wait for approval
dt migrate run --source confluence
```

**Manual `dt.json`:**
```json
{
  "sources": {
    "confluence": {
      "base_url": "https://mycompany.atlassian.net/wiki",
      "email": "you@company.com",
      "token": ""
    }
  },
  "migrations": [
    {
      "source": "confluence",
      "content": ["PROD"],
      "destination": {
        "type": "doc",
        "parent": { "type": "project", "name": "Product Knowledge Base" }
      }
    }
  ]
}
```
The `content` value is the Confluence space key (e.g. `"PROD"`).

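The space key also appears in Confluence page URLs. A small unofficial helper, assuming the standard `/wiki/spaces/<KEY>/...` URL shape:

```shell
# Pull the space key out of a Confluence page URL (sketch).
space_key() {
  echo "$1" | sed -n 's|.*/spaces/\([^/]*\)/.*|\1|p'
}
```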

Google Drive

**What migrates:** Google Docs (as HTML), optionally Sheets (as CSV), PDFs, images, and other files. Folder structure is preserved.
**Prerequisites** (choose one):
  • OAuth: Create a Google Cloud project, enable the Drive API, and create an OAuth Desktop client ID. The wizard opens a browser sign-in.
  • Service account: Create a service account, download the JSON key, and share your Drive folder with the service account email.

```bash
# Service account only:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key.json"

dt source add gdrive
dt source validate gdrive
dt migrate run --source gdrive --preview
# wait for approval
dt migrate run --source gdrive
```

**Manual `dt.json`:**
```json
{
  "sources": {
    "gdrive": {
      "credentials_file": "/path/to/key.json",
      "recursive": "true",
      "include_binary": "true"
    }
  },
  "migrations": [
    {
      "source": "gdrive",
      "content": ["1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "UX Research Archive" }
      }
    }
  ]
}
```
The `content` value is the Google Drive folder ID.

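The folder ID is the last path segment of a `drive.google.com/drive/folders/...` URL. A small unofficial helper, assuming that URL shape:

```shell
# Extract the folder ID from a Drive folders URL (sketch).
gdrive_folder_id() {
  echo "$1" | sed -n 's|.*/folders/\([A-Za-z0-9_-]*\).*|\1|p'
}
```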

Airtable

**What migrates:** Records with all field values, attachments, linked records, and metadata.
**Prerequisites:** Create a personal access token at https://airtable.com/create/tokens with `data.records:read` and `schema.bases:read` scopes.
```bash
export AIRTABLE_TOKEN="your-token"
dt source add airtable
dt source validate airtable
dt migrate run --source airtable --preview
# wait for approval
dt migrate run --source airtable
```

**Manual `dt.json`:**
```json
{
  "sources": {
    "airtable": {
      "token": ""
    }
  },
  "migrations": [
    {
      "source": "airtable",
      "content": ["https://airtable.com/appXXXXXXXXXXXXXX/tblXXXXXXXXXXXXXX"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "User Interview Tracker" }
      }
    }
  ]
}
```
The `content` value is the full URL to the Airtable table.

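A well-formed Airtable table URL carries both a base ID (`app…`) and a table ID (`tbl…`). A quick unofficial check that both are present before pasting the URL into `content`, assuming the standard 17-character ID shape:

```shell
# List the app/tbl IDs found in an Airtable URL (sketch).
airtable_ids() {
  echo "$1" | grep -oE '(app|tbl)[A-Za-z0-9]{14}'
}
```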

Marvin

**What migrates:** Video/audio recordings (Dovetail auto-transcribes) and insights as docs.

Marvin has no public export API. You must first export your data locally.

**Step 1: Export from Marvin**

Use the `/marvin-download` skill to automate this with browser automation, or manually:
  • Download recordings from each project: All Actions > Download Video
  • Copy insight content into markdown files

Expected export structure:

```
marvin-export/
  Project Name/
    files/
      Interview 1.mp4
    insights/
      My Report/
        content.md
        metadata.json
```

**Step 2: Import with dt**

```bash
dt source add marvin
dt migrate run --source marvin --preview
# wait for approval
dt migrate run --source marvin
```

**Manual `dt.json`:**
```json
{
  "migrations": [
    {
      "source": "marvin",
      "content": ["./marvin-export/Customer Interviews"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Customer Interviews" }
      },
      "options": { "types": "files" }
    },
    {
      "source": "marvin",
      "content": ["./marvin-export/Customer Interviews"],
      "destination": {
        "type": "doc",
        "parent": { "type": "project", "name": "Customer Interviews" }
      },
      "options": { "types": "insights" }
    }
  ]
}
```
The `content` value is the path to the exported project directory. `options.types` is `"files"` or `"insights"`.


Condens

**What migrates:** Sessions (notes, transcripts, video) as data; artifacts (reports) as docs.

**Step 1: Export from Condens**

In Condens: Project > Settings > Export data — download the `.zip` file.

**Step 2: Import with dt**

```bash
dt source add condens
dt migrate run --source condens --preview
# wait for approval
dt migrate run --source condens
```

**Manual `dt.json`:**
```json
{
  "migrations": [
    {
      "source": "condens",
      "content": ["/path/to/Onboarding Research.abc123.zip"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Onboarding Research" }
      }
    }
  ]
}
```
The `content` value is the path to the Condens export zip file.


EnjoyHQ

**What migrates:** Stories, projects, and documents with labels, state, customer info, and tags.
**Prerequisites:** Get your API token from EnjoyHQ workspace settings.
```bash
export ENJOYHQ_API_TOKEN="your-token"
dt source add enjoyhq
dt source validate enjoyhq
dt migrate run --source enjoyhq --preview
# wait for approval
dt migrate run --source enjoyhq
```

---

Productboard

**What migrates:** Notes (customer feedback) and features (with descriptions and metadata).
**Prerequisites:** Go to Settings → Integrations → Public API → Generate Access Token.
```bash
export PRODUCTBOARD_API_TOKEN="your-token"
dt source add productboard
dt source validate productboard
dt migrate run --source productboard --preview
# wait for approval
dt migrate run --source productboard
```

**Manual `dt.json`:**
```json
{
  "sources": {
    "productboard": {
      "token": ""
    }
  },
  "migrations": [
    {
      "source": "productboard",
      "content": ["notes"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Productboard Notes" }
      }
    },
    {
      "source": "productboard",
      "content": ["features"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Productboard Features" }
      }
    }
  ]
}
```
The `content` value is the entity type: `"notes"` or `"features"`. Both import as data.


Local files

**What migrates:** `.md`, `.txt`, and `.html` files become the record body. Everything else (PDFs, videos, images, Word docs) is uploaded as an attachment.
```bash
dt source add local
dt migrate run --source local --preview
# wait for approval
dt migrate run --source local
```

**Manual `dt.json`:**
```json
{
  "migrations": [
    {
      "source": "local",
      "content": ["/path/to/your/folder"],
      "destination": {
        "type": "data",
        "parent": { "type": "project", "name": "Research Files" }
      },
      "options": { "extensions": ".md,.pdf,.mp4" }
    }
  ]
}
```
The `content` value is the path to the local folder. `options.extensions` is optional; omit it to include all files.

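The body-vs-attachment rule above can be expressed as a tiny function, handy for previewing how a folder will be split. This is a sketch of the documented rule, not dt's actual implementation:

```shell
# Mirror the documented rule: .md/.txt/.html become the record body,
# everything else is uploaded as an attachment.
classify() {
  case "$1" in
    *.md|*.txt|*.html) echo "record body" ;;
    *)                 echo "attachment" ;;
  esac
}
```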

Useful flags

| Flag | Description |
| --- | --- |
| `--preview` | Show what would be migrated without fetching |
| `--dry-run` | Fetch and transform but skip upload |
| `--upsert` | Re-run without creating duplicates |
| `--limit N` | Process at most N records per migration |
| `--filter "title:X"` | Filter by title |
| `--filter "created_after:2024-01-01"` | Filter by creation date |
| `--filter "updated_after:2024-01-01"` | Filter by update date |
| `--verbose` | Detailed per-record progress logs |
| `--source X` | Run only migrations for source X |
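These flags combine naturally for incremental syncs. The helper below composes a re-run from the documented `--upsert` and `updated_after` flags; the wrapper function itself is illustrative, only the flags are from the table above:

```shell
# Compose an incremental re-run command (sketch).
# --upsert avoids duplicates; updated_after narrows the fetch.
incremental_cmd() {
  # $1 = source name, $2 = ISO date
  echo "dt migrate run --source $1 --upsert --filter updated_after:$2"
}
```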

Destination types

When using `--project` and `--type` flags on the CLI, use the legacy type strings:

| `--type` | `destination.type` in dt.json | `destination.parent.type` | Description |
| --- | --- | --- | --- |
| `project-data` | `"data"` | `"project"` | Data (recordings, transcripts, files) inside a project |
| `project-doc` | `"doc"` | `"project"` | Docs inside a project |
| `folder-doc` | `"doc"` | `"folder"` | Docs directly in a folder |
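The mapping above can be captured in a small lookup, handy when converting old CLI invocations into `dt.json` entries. This is an illustrative sketch, not part of `dt`:

```shell
# Map a legacy --type string to "destination.type destination.parent.type".
legacy_type() {
  case "$1" in
    project-data) echo "data project" ;;
    project-doc)  echo "doc project" ;;
    folder-doc)   echo "doc folder" ;;
    *)            echo "unknown"; return 1 ;;
  esac
}
```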

Running all migrations at once

If multiple sources are configured in `dt.json`:

```bash
dt migrate run                  # run everything
dt migrate run --source notion  # run one source only
```

Troubleshooting

| Problem | Fix |
| --- | --- |
| "no records fetched" | Check source access: Notion database shared with the integration, Drive folder shared with the service account, correct Confluence space key, valid Airtable token scopes |
| "403 Forbidden" | Dovetail API key may lack permissions, or the source account doesn't have read access |
| "401 Unauthorized" | API key or token is invalid or expired |
| Rate limited | Automatic — the CLI retries with exponential backoff |
| Migration interrupted | Run `dt migrate resume` to continue from where it stopped |
| Migration seems stuck | Run `dt migrate run --verbose` for detailed progress |

Run `dt doctor` for a full environment and connectivity diagnostic.