# Deploy to Databricks Apps

## App Naming Convention
Unless the user specifies a different name, apps should use the prefix `agent-*`:

- `agent-data-analyst`
- `agent-customer-support`
- `agent-code-helper`
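As a minimal sketch, the convention above can be checked programmatically. The helper name and the allowed suffix characters are assumptions for illustration; the document only mandates the `agent-` prefix.

```python
import re

# Hypothetical helper: only the agent- prefix is required by the convention
# above; restricting suffixes to lowercase letters, digits, and hyphens is
# an assumption for this sketch.
def follows_naming_convention(app_name: str) -> bool:
    """True if the app name uses the required agent- prefix."""
    return re.fullmatch(r"agent-[a-z0-9-]+", app_name) is not None

for name in ("agent-data-analyst", "agent-customer-support", "my-cool-app"):
    print(name, follows_naming_convention(name))
```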
Update the app name in `databricks.yml`:

```yaml
resources:
  apps:
    agent_langgraph_long_term_memory:
      name: "agent-your-app-name"  # Use agent-* prefix
```

## Deploy Commands
**IMPORTANT:** Always run BOTH commands to deploy and start your app:

```bash
# 1. Validate bundle configuration (catches errors before deploy)
databricks bundle validate

# 2. Deploy the bundle (creates/updates resources, uploads files)
databricks bundle deploy

# 3. Run the app (starts/restarts with uploaded source code) - REQUIRED!
databricks bundle run agent_langgraph_long_term_memory
```
> **Note:** `bundle deploy` only uploads files and configures resources. `bundle run` is **required** to actually start/restart the app with the new code. If you only run `deploy`, the app will continue running old code!
The resource key `agent_langgraph_long_term_memory` matches the app name in `databricks.yml` under `resources.apps`.
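The three-step flow above can be sketched as a script. The resource key comes from this bundle's `databricks.yml`; running the commands for real requires the databricks CLI and an authenticated workspace, so the execution loop is shown commented out.

```python
# Sketch only: builds the exact validate -> deploy -> run sequence described
# above. `run` must come last, or the app keeps serving old code.
def deploy_steps(resource_key: str) -> list[list[str]]:
    return [
        ["databricks", "bundle", "validate"],
        ["databricks", "bundle", "deploy"],
        ["databricks", "bundle", "run", resource_key],
    ]

steps = deploy_steps("agent_langgraph_long_term_memory")
# import subprocess
# for cmd in steps:
#     subprocess.run(cmd, check=True)  # stop at the first failure
print([" ".join(cmd) for cmd in steps])
```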
## Handling "App Already Exists" Error
If `databricks bundle deploy` fails with:

```
Error: failed to create app
Failed to create app <app-name>. An app with the same name already exists.
```

Ask the user: "Would you like to bind the existing app to this bundle, or delete it and create a new one?"
### Option 1: Bind Existing App (Recommended)
**Step 1:** Get the existing app's full configuration:

```bash
# Get app config including budget_policy_id and other server-side settings
databricks apps get <existing-app-name> --output json | jq '{name, budget_policy_id, description}'
```
**Step 2:** Update `databricks.yml` to match the existing app's configuration exactly:

```yaml
resources:
  apps:
    agent_langgraph_long_term_memory:
      name: "existing-app-name"  # Must match exactly
      budget_policy_id: "xxx-xxx-xxx"  # Copy from step 1 if present
```

**Why this matters:** Existing apps may have server-side configuration (like `budget_policy_id`) that isn't in your bundle. If these don't match, Terraform will fail with "Provider produced inconsistent result after apply". Always sync the app's current config to `databricks.yml` before binding.

**Step 3:** If deploying to a `mode: production` target, set `workspace.root_path`:

```yaml
targets:
  prod:
    mode: production
    workspace:
      root_path: /Workspace/Users/${workspace.current_user.userName}/.bundle/${bundle.name}/${bundle.target}
```

**Why this matters:** Production mode requires an explicit root path to ensure only one copy of the bundle is deployed. Without this, the deploy will fail with a recommendation to set `workspace.root_path`.

**Step 4:** Check if already bound, then bind if needed:
undefinedCheck if resource is already managed by this bundle
Check if resource is already managed by this bundle
databricks bundle summary --output json | jq '.resources.apps'
databricks bundle summary --output json | jq '.resources.apps'
If the app appears in the summary, skip binding and go to Step 5
If the app appears in the summary, skip binding and go to Step 5
If NOT in summary, bind the resource:
If NOT in summary, bind the resource:
databricks bundle deployment bind agent_langgraph_long_term_memory <existing-app-name> --auto-approve
> **Note:** If bind fails with "Resource already managed by Terraform", the app is already bound to this bundle. Skip to Step 5 and deploy directly.
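The Step 4 decision can also be made programmatically from the summary JSON. The exact schema of `databricks bundle summary --output json` is an assumption here (`resources.apps` keyed by resource key, matching the `jq` filter above); adjust to what your CLI version actually emits.

```python
import json

# Sketch: parse the output of `databricks bundle summary --output json`
# and decide whether the bind command is needed. The summary schema is
# an assumption based on the jq filter used above.
def should_bind(summary_json: str, resource_key: str) -> bool:
    summary = json.loads(summary_json)
    apps = summary.get("resources", {}).get("apps", {}) or {}
    return resource_key not in apps  # bind only if not already managed

already_bound = '{"resources": {"apps": {"agent_langgraph_long_term_memory": {}}}}'
print(should_bind(already_bound, "agent_langgraph_long_term_memory"))  # False
```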
**Step 5:** Deploy:

```bash
databricks bundle deploy
databricks bundle run agent_langgraph_long_term_memory
```

### Option 2: Delete and Recreate
```bash
databricks apps delete <app-name>
databricks bundle deploy
```

**Warning:** This permanently deletes the app's URL, OAuth credentials, and service principal.

## Unbinding an App
To remove the link between the bundle and the deployed app:

```bash
databricks bundle deployment unbind agent_langgraph_long_term_memory
```

Use when:
- Switching to a different app
- Letting the bundle create a new app
- Switching between deployed instances

**Note:** Unbinding doesn't delete the deployed app.

## Query Deployed App
**IMPORTANT:** Databricks Apps are only queryable via OAuth token. You cannot use a Personal Access Token (PAT) to query your agent. Attempting to use a PAT will result in a 302 redirect error.

Get an OAuth token:

```bash
databricks auth token | jq -r '.access_token'
```

Send a request:

```bash
curl -X POST <app-url>/invocations \
  -H "Authorization: Bearer <oauth-token>" \
  -H "Content-Type: application/json" \
  -d '{ "input": [{ "role": "user", "content": "hi" }], "stream": true }'
```

If using memory, include `user_id` to scope memories per user:

```bash
curl -X POST <app-url>/invocations \
  -H "Authorization: Bearer <oauth-token>" \
  -H "Content-Type: application/json" \
  -d '{
    "input": [{"role": "user", "content": "What do you remember about me?"}],
    "custom_inputs": {"user_id": "user@example.com"}
  }'
```
## On-Behalf-Of (OBO) User Authentication
To authenticate as the requesting user instead of the app service principal:

```python
from agent_server.utils import get_user_workspace_client

# In your agent code
user_client = get_user_workspace_client()
# Use user_client for operations that should run as the user
```

This is useful when you want the agent to access resources with the user's permissions rather than the app's service principal permissions.

See: [OBO authentication documentation](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/auth#retrieve-user-authorization-credentials)

## Debug Deployed Apps
```bash
# View logs (follow mode)
databricks apps logs <app-name> --follow

# Check app status
databricks apps get <app-name> --output json | jq '{app_status, compute_status}'

# Get app URL
databricks apps get <app-name> --output json | jq -r '.url'
```

## Important Notes
- **App naming convention:** App names must be prefixed with `agent-` (e.g., `agent-my-assistant`, `agent-data-analyst`)
- **Name is immutable:** Changing the `name` field in `databricks.yml` forces app replacement (destroy + create)
- **Remote Terraform state:** Databricks stores state remotely; the same app is detected across directories
- **Review the plan:** Look for `# forces replacement` in the Terraform output before confirming

## FAQ
**Q: I see a 200 OK in the logs, but get an error in the actual stream. What's going on?**

This is expected behavior. The initial 200 OK confirms stream setup was successful. Errors that occur during streaming don't affect the initial HTTP status code. Check the stream content for the actual error message.
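A client therefore has to look inside the stream for errors. This sketch scans streamed event lines for an error event; the `data:` line format and the `"error"` key are assumptions for illustration, not a documented Databricks schema.

```python
import json

# Sketch: the 200 OK only covers stream setup, so errors arrive as events
# inside the stream. The `data:` prefix and "error" key are assumed here.
def first_stream_error(stream_lines):
    for line in stream_lines:
        if not line.startswith("data:"):
            continue
        event = json.loads(line[len("data:"):].strip())
        if "error" in event:
            return event["error"]
    return None

lines = [
    'data: {"delta": "Hello"}',
    'data: {"error": "context length exceeded"}',
]
print(first_stream_error(lines))  # context length exceeded
```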
**Q: When querying my agent, I get a 302 redirect error. What's wrong?**

You're likely using a Personal Access Token (PAT). Databricks Apps only support OAuth tokens. Generate one with:

```bash
databricks auth token
```

**Q: How do I add dependencies to my agent?**

Use `uv add`:

```bash
uv add <package_name>

# Example: uv add "mlflow-skinny[databricks]"
```

## Troubleshooting
| Issue | Solution |
|---|---|
| Validation errors | Run `databricks bundle validate` before deploying |
| Permission errors at runtime | Grant the app access to the required resources in `databricks.yml` |
| Lakebase access errors | See the lakebase-setup skill for permissions (if using memory) |
| App not starting | Check `databricks apps logs <app-name> --follow` |
| Auth token expired | Run `databricks auth token` again |
| 302 redirect error | Use an OAuth token, not a PAT |
| "Provider produced inconsistent result" | Sync the app config to `databricks.yml` before binding |
| "should set workspace.root_path" | Add `workspace.root_path` to the production target |
| App running old code after deploy | Run `databricks bundle run agent_langgraph_long_term_memory` after `bundle deploy` |
| Env var is None in deployed app | Check the `env` entries in `app.yaml` |