# Deploy to Databricks Apps

## App Naming Convention

Unless the user specifies a different name, apps should use the `agent-*` prefix:

- `agent-data-analyst`
- `agent-customer-support`
- `agent-code-helper`

Update the app name in `databricks.yml`:

```yaml
resources:
  apps:
    agent_langgraph_long_term_memory:
      name: "agent-your-app-name"  # Use agent-* prefix
```
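For scripts that create apps, the convention above can be checked mechanically. A minimal sketch; the helper name and the character rules are assumptions made here for illustration, not the official Databricks Apps name validation:

```python
import re

# Assumed rule: "agent-" prefix followed by lowercase alphanumeric words
# separated by hyphens. Check the Databricks Apps docs for the real limits.
APP_NAME_RE = re.compile(r"^agent-[a-z0-9]+(?:-[a-z0-9]+)*$")

def follows_convention(name: str) -> bool:
    """True if `name` follows the agent-* naming convention from this guide."""
    return bool(APP_NAME_RE.fullmatch(name))
```
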

## Deploy Commands

**IMPORTANT:** Always run ALL of the following commands to deploy and start your app:

```bash
# 1. Validate bundle configuration (catches errors before deploy)
databricks bundle validate

# 2. Deploy the bundle (creates/updates resources, uploads files)
databricks bundle deploy

# 3. Run the app (starts/restarts with uploaded source code) - REQUIRED!
databricks bundle run agent_langgraph_long_term_memory
```

> **Note:** `bundle deploy` only uploads files and configures resources. `bundle run` is **required** to actually start/restart the app with the new code. If you only run `deploy`, the app will continue running old code!

The resource key `agent_langgraph_long_term_memory` matches the app name in `databricks.yml` under `resources.apps`.

## Handling "App Already Exists" Error

If `databricks bundle deploy` fails with:

```
Error: failed to create app
Failed to create app <app-name>. An app with the same name already exists.
```

Ask the user: "Would you like to bind the existing app to this bundle, or delete it and create a new one?"

### Option 1: Bind Existing App (Recommended)

**Step 1:** Get the existing app's full configuration:

```bash
# Get app config including budget_policy_id and other server-side settings
databricks apps get <existing-app-name> --output json | jq '{name, budget_policy_id, description}'
```

**Step 2:** Update `databricks.yml` to match the existing app's configuration exactly:

```yaml
resources:
  apps:
    agent_langgraph_long_term_memory:
      name: "existing-app-name"  # Must match exactly
      budget_policy_id: "xxx-xxx-xxx"  # Copy from step 1 if present
```

**Why this matters:** Existing apps may have server-side configuration (like `budget_policy_id`) that isn't in your bundle. If these don't match, Terraform will fail with "Provider produced inconsistent result after apply". Always sync the app's current config to `databricks.yml` before binding.

**Step 3:** If deploying to a `mode: production` target, set `workspace.root_path`:

```yaml
targets:
  prod:
    mode: production
    workspace:
      root_path: /Workspace/Users/${workspace.current_user.userName}/.bundle/${bundle.name}/${bundle.target}
```

**Why this matters:** Production mode requires an explicit root path to ensure only one copy of the bundle is deployed. Without it, the deploy will fail with a recommendation to set `workspace.root_path`.

**Step 4:** Check if already bound, then bind if needed:

```bash
# Check if resource is already managed by this bundle
databricks bundle summary --output json | jq '.resources.apps'

# If the app appears in the summary, skip binding and go to Step 5.
# If NOT in summary, bind the resource:
databricks bundle deployment bind agent_langgraph_long_term_memory <existing-app-name> --auto-approve
```

> **Note:** If bind fails with "Resource already managed by Terraform", the app is already bound to this bundle. Skip to Step 5 and deploy directly.

**Step 5:** Deploy:

```bash
databricks bundle deploy
databricks bundle run agent_langgraph_long_term_memory
```

### Option 2: Delete and Recreate

```bash
databricks apps delete <app-name>
databricks bundle deploy
```

**Warning:** This permanently deletes the app's URL, OAuth credentials, and service principal.

## Unbinding an App

To remove the link between the bundle and the deployed app:

```bash
databricks bundle deployment unbind agent_langgraph_long_term_memory
```

Use when:

- Switching to a different app
- Letting the bundle create a new app
- Switching between deployed instances

**Note:** Unbinding doesn't delete the deployed app.

## Query Deployed App

**IMPORTANT:** Databricks Apps are only queryable via OAuth token. You cannot use a Personal Access Token (PAT) to query your agent; attempting to use a PAT will result in a 302 redirect error.

Get an OAuth token:

```bash
databricks auth token | jq -r '.access_token'
```

Send a request:

```bash
curl -X POST <app-url>/invocations \
  -H "Authorization: Bearer <oauth-token>" \
  -H "Content-Type: application/json" \
  -d '{ "input": [{ "role": "user", "content": "hi" }], "stream": true }'
```

If using memory, include `user_id` to scope memories per user:

```bash
curl -X POST <app-url>/invocations \
  -H "Authorization: Bearer <oauth-token>" \
  -H "Content-Type: application/json" \
  -d '{
      "input": [{"role": "user", "content": "What do you remember about me?"}],
      "custom_inputs": {"user_id": "user@example.com"}
  }'
```
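If you prefer scripting over raw curl, the request bodies above can be assembled in Python. A minimal sketch of a hypothetical helper that mirrors the payload shape shown in the curl examples (the HTTP call itself is omitted):

```python
import json

def build_invocation_payload(message, user_id=None, stream=False):
    """Build the JSON body for the /invocations endpoint.

    `user_id` travels in custom_inputs so the agent can scope
    long-term memories per user, as in the curl example above.
    """
    payload = {"input": [{"role": "user", "content": message}]}
    if stream:
        payload["stream"] = True
    if user_id is not None:
        payload["custom_inputs"] = {"user_id": user_id}
    return json.dumps(payload)
```
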

## On-Behalf-Of (OBO) User Authentication

To authenticate as the requesting user instead of the app service principal:

```python
from agent_server.utils import get_user_workspace_client

# In your agent code
user_client = get_user_workspace_client()

# Use user_client for operations that should run as the user
```

This is useful when you want the agent to access resources with the user's permissions rather than the app's service principal permissions.

See: [OBO authentication documentation](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/auth#retrieve-user-authorization-credentials)

## Debug Deployed Apps

```bash
# View logs (follow mode)
databricks apps logs <app-name> --follow

# Check app status
databricks apps get <app-name> --output json | jq '{app_status, compute_status}'

# Get app URL
databricks apps get <app-name> --output json | jq -r '.url'
```
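When polling status from a script, the `databricks apps get` output can be reduced to the two fields of interest. A sketch of a hypothetical helper; it assumes `app_status` and `compute_status` each carry a `state` key, so verify against your CLI version's actual JSON:

```python
import json

def summarize_app_status(raw_json):
    """Reduce `databricks apps get <app-name> --output json` to its status fields.

    Assumption: app_status/compute_status objects each have a `state` key.
    Returns None for a field when it is absent from the output.
    """
    app = json.loads(raw_json)
    return {
        "app": app.get("app_status", {}).get("state"),
        "compute": app.get("compute_status", {}).get("state"),
    }
```
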

## Important Notes

- **App naming convention:** App names must be prefixed with `agent-` (e.g., `agent-my-assistant`, `agent-data-analyst`)
- **Name is immutable:** Changing the `name` field in `databricks.yml` forces app replacement (destroy + create)
- **Remote Terraform state:** Databricks stores state remotely; the same app is detected across directories
- **Review the plan:** Look for `# forces replacement` in the Terraform output before confirming

## FAQ

**Q: I see a 200 OK in the logs, but get an error in the actual stream. What's going on?**

This is expected behavior. The initial 200 OK confirms stream setup was successful; errors that occur during streaming don't affect the initial HTTP status code. Check the stream content for the actual error message.

**Q: When querying my agent, I get a 302 redirect error. What's wrong?**

You're likely using a Personal Access Token (PAT). Databricks Apps only support OAuth tokens. Generate one with:

```bash
databricks auth token
```

**Q: How do I add dependencies to my agent?**

Use `uv add`:

```bash
uv add <package_name>

# Example: uv add "mlflow-skinny[databricks]"
```

## Troubleshooting

| Issue | Solution |
|---|---|
| Validation errors | Run `databricks bundle validate` to see detailed errors before deploying |
| Permission errors at runtime | Grant resources in `databricks.yml` (see add-tools skill) |
| Lakebase access errors | See lakebase-setup skill for permissions (if using memory) |
| App not starting | Check `databricks apps logs <app-name>` |
| Auth token expired | Run `databricks auth token` again |
| 302 redirect error | Use OAuth token, not PAT |
| "Provider produced inconsistent result" | Sync app config to `databricks.yml` |
| "should set workspace.root_path" | Add `root_path` to production target |
| App running old code after deploy | Run `databricks bundle run agent_langgraph_long_term_memory` after deploy |
| Env var is None in deployed app | Check `valueFrom` in app.yaml matches resource `name` in databricks.yml |