jetty-setup

# Jetty Setup Wizard
You are guiding a user through first-time Jetty setup. The goal is to get them from zero to running their first AI workflow in under 5 minutes. Follow these steps IN ORDER. Be friendly and concise.
## Security Guidelines
- Never echo, print, or log API tokens or keys in output. Use redacted forms (e.g., `mlc_...xxxx`) when referring to tokens in messages to the user.
- Never store tokens in project files like `CLAUDE.md` that may be committed to version control. Use the user-scoped config directory `~/.config/jetty/`.
- Read secrets interactively with `read -rs` so the raw value never appears in generated commands, tool-call logs, or shell history. Pipe the value directly from the variable into `curl` and `unset` it immediately after.
- Confirm with the user before each API call that sends credentials to an external service.
- Never store provider API keys locally: they are sent directly to the Jetty API for server-side storage and are not written to any local file.
## Step 1: Check for Existing Token
Check if a Jetty API token already exists:
- Check for a stored token: `~/.config/jetty/token`
- Also check the project's `CLAUDE.md` file (for backward compatibility) for a token starting with `mlc_`
- If found in `CLAUDE.md` but not in `~/.config/jetty/token`, migrate it (see "Save the Token" below) and remove it from `CLAUDE.md`
- If found, validate it:

```bash
TOKEN="$(cat ~/.config/jetty/token 2>/dev/null)"
curl -s -H "Authorization: Bearer $TOKEN" "https://flows-api.jetty.io/api/v1/collections/" | head -c 200
```

If the response contains collection data (not an error), the token is valid. Tell the user (with token redacted):

"Found a valid Jetty token (`mlc_...{last 4 chars}`). You're already connected!"
Then use AskUserQuestion:
- Header: "Setup"
- Question: "You already have a Jetty token configured. What would you like to do?"
- Options:
- "Create my first runbook" / "Learn about runbooks and build one"
- "Reconfigure" / "Start fresh with a new token or provider"
- "I'm good" / "No further setup needed"
If they choose "Create my first runbook", skip to Step 4.
If they choose "Reconfigure", continue to Step 2 but skip the signup part.
If they choose "I'm good", end the setup.
If no valid token is found, continue to Step 2.
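The detection-and-migration order above can be sketched as a single shell pass. This is an illustrative sketch only: it runs against a throwaway directory (`JETTY_HOME` stands in for `$HOME`) with a dummy token `mlc_demo1234`, so it has no side effects on a real setup.

```bash
# Sketch: locate an existing Jetty token, preferring the user-scoped file,
# migrating a legacy token out of CLAUDE.md if that is the only copy.
JETTY_HOME="$(mktemp -d)"                       # stand-in for "$HOME" in this sketch
mkdir -p "$JETTY_HOME/project"
printf 'notes\nI have a production jetty api token mlc_demo1234\n' \
  > "$JETTY_HOME/project/CLAUDE.md"             # simulate a legacy token line

TOKEN_FILE="$JETTY_HOME/.config/jetty/token"
CLAUDE_MD="$JETTY_HOME/project/CLAUDE.md"

if [ -s "$TOKEN_FILE" ]; then                   # 1. prefer the user-scoped file
  TOKEN="$(cat "$TOKEN_FILE")"
elif grep -q 'mlc_' "$CLAUDE_MD" 2>/dev/null; then
  # 2. backward compatibility: migrate the token out of the project file
  TOKEN="$(grep -o 'mlc_[A-Za-z0-9]*' "$CLAUDE_MD" | head -n1)"
  mkdir -p "$(dirname "$TOKEN_FILE")" && chmod 700 "$(dirname "$TOKEN_FILE")"
  printf '%s' "$TOKEN" > "$TOKEN_FILE" && chmod 600 "$TOKEN_FILE"
  grep -v 'mlc_' "$CLAUDE_MD" > "$CLAUDE_MD.tmp" && mv "$CLAUDE_MD.tmp" "$CLAUDE_MD"
else
  TOKEN=""                                      # 3. nothing found: continue to Step 2
fi

# Only ever print the redacted form, never the full token.
echo "redacted: mlc_...$(printf '%s' "$TOKEN" | tail -c 4)"
```

After the run, the token lives only in the config file and the credential line is gone from `CLAUDE.md`.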
## Step 2: Account Creation
Use AskUserQuestion:
- Header: "Account"
- Question: "Do you already have a Jetty account?"
- Options:
- "Yes, I have an API key" / "I have a Jetty account and can paste my API key"
- "No, I need to sign up" / "Open the Jetty signup page in my browser"
If "Yes, I have an API key":
Ask the user to paste their API key using AskUserQuestion:
- Header: "API Key"
- Question: "Please paste your Jetty API key (starts with mlc_):"
- Options:
- "I'll type it in" / "Let me enter my API key" (they will use the "Other" option to type it)
- "I need to find it" / "Open flows.jetty.io so I can get my key"
If they need to find it, open the browser:
```bash
open "https://flows.jetty.io/settings" 2>/dev/null || xdg-open "https://flows.jetty.io/settings" 2>/dev/null
```

If "No, I need to sign up":
Tell the user:
"Opening Jetty in your browser. Here's what to do:
- Click Get started free to create your account
- Complete the onboarding (pick a collection name — this is your workspace)
- Once you're on the dashboard, go to Settings to find your API key
- Copy the API key and come back here to paste it"
Open the signup page:
```bash
open "https://flows.jetty.io/sign-up" 2>/dev/null || xdg-open "https://flows.jetty.io/sign-up" 2>/dev/null
```

Then wait for them to come back and paste the key. Use AskUserQuestion:
- Header: "API Key"
- Question: "Once you've signed up, paste your Jetty API key here (starts with mlc_):"
- Options:
- "I'll type it in" / "Let me paste my API key" (they will use the "Other" option)
- "I'm stuck" / "I need help finding my API key"
If they're stuck, provide guidance:
"Your API key is at flows.jetty.io → Settings → API Tokens. Click Create Token, copy it, and paste it here."
### Validate the Key
Once you have the key, save it to the secure config location first, then validate using the stored file. Never embed the raw token in a generated command, heredoc, or variable assignment. Instead, prompt the user to paste it interactively via `read -rs`:

```bash
mkdir -p ~/.config/jetty && chmod 700 ~/.config/jetty
echo "Paste your Jetty API token and press Enter:"
read -rs JETTY_TOKEN && printf '%s' "$JETTY_TOKEN" > ~/.config/jetty/token && unset JETTY_TOKEN
chmod 600 ~/.config/jetty/token
```

Now validate using the stored file:

```bash
curl -s -H "Authorization: Bearer $(cat ~/.config/jetty/token)" "https://flows-api.jetty.io/api/v1/collections/"
```
**Important:** The `read -rs` command reads input silently (no echo) directly from the terminal. The token value never appears in the generated command, shell history, or tool-call logs.
**If validation fails (401 or error):**
Tell the user the key didn't work and let them try again (up to 3 attempts). After 3 failures, suggest visiting https://flows.jetty.io/settings to verify.
**If validation succeeds:**
1. Parse the response to find the collection name(s)
2. Tell the user which collections they have access to
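Parsing the collection names can be sketched as below. The response shape here (`{"collections": [{"name": ...}]}`) is a hypothetical sample for illustration; adapt the field access to whatever the real endpoint returns.

```bash
# Hypothetical success response from the collections endpoint.
RESPONSE='{"collections":[{"name":"my-workspace"},{"name":"experiments"}]}'

# Pull out just the collection names to report to the user.
NAMES=$(echo "$RESPONSE" | python3 -c '
import sys, json
d = json.load(sys.stdin)
items = d.get("collections", [])
print(", ".join(c.get("name", "?") for c in items))
')
echo "Collections: $NAMES"
```

The same pattern (pipe the body into `python3 -c` and print only names, never secrets) is used throughout this document.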
### Save the Token
The token was already saved during validation above. If validation failed and the user provided a corrected key, overwrite the file the same way (use `read -rs` so the raw token never appears in generated commands).

If `CLAUDE.md` contains an old token line (`I have a production jetty api token mlc_...`), remove that line from `CLAUDE.md` to avoid leaving credentials in project files.

Tell the user:

"Your API token is saved to `~/.config/jetty/token` (user-scoped, outside your project directory). It won't be accidentally committed to git."
## Step 3: Choose Provider & Store API Key
### Step 3a: Check for Existing Keys & Offer Trial
First, check whether the collection already has AI provider keys configured:

```bash
TOKEN="$(cat ~/.config/jetty/token)"
RESPONSE=$(curl -s "https://flows-api.jetty.io/api/v1/collections/$COLLECTION" \
  -H "Authorization: Bearer $TOKEN")
echo "$RESPONSE" | python3 -c "import sys,json; d=json.load(sys.stdin); evars=d.get('environment_variables',{}); print('Configured keys:', list(evars.keys()) if evars else 'none')"
```

Parse `environment_variables` from the response. If all four of `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GEMINI_API_KEY`, and `REPLICATE_API_TOKEN` are missing (i.e., none of them are present), offer the trial option below. If any of those keys are already configured, skip this check entirely and proceed to the provider selection prompt below.

Use AskUserQuestion:
- Header: "Getting Started"
- Question: "Your collection doesn't have any AI provider keys configured yet.\n\nWould you like to:"
- Options:
- "Try Jetty free" / "Get 10 free runs (up to 60 minutes) using Jetty-provided AI keys. No third-party signup needed."
- "Add your own keys" / "Configure your OpenAI, Anthropic, Gemini, or Replicate API keys now."
If the user chooses "Try Jetty free":
Activate the trial:
```bash
TOKEN="$(cat ~/.config/jetty/token)"
curl -s -X POST "https://flows-api.jetty.io/api/v1/trial/$COLLECTION/activate" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" | python3 -c "
import sys, json
d = json.load(sys.stdin)
if d.get('active') or d.get('status') == 'active':
    print(f'Trial activated! Runs remaining: {d.get(\"runs_remaining\", \"?\")}, Minutes remaining: {d.get(\"minutes_remaining\", \"?\")}')
else:
    print('Error:', json.dumps(d))
"
```

- On success: Tell the user their trial is activated, show remaining runs and minutes, then skip directly to Step 4 (Deploy the Demo Workflow). The trial provides all necessary AI keys server-side.
- On error: Inform the user the trial could not be activated, then fall through to the "Add your own keys" flow below.
If the user chooses "Add your own keys":
Continue with the provider selection prompt below.
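The "all four missing" decision above can be sketched in isolation. The response body here is a hypothetical sample (a collection with one unrelated variable and no provider keys); the set intersection is what matters.

```bash
# Hypothetical collection response: no provider keys configured.
RESPONSE='{"environment_variables": {"SOME_OTHER_VAR": "x"}}'

DECISION=$(echo "$RESPONSE" | python3 -c '
import sys, json
evars = json.load(sys.stdin).get("environment_variables") or {}
providers = {"OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GEMINI_API_KEY", "REPLICATE_API_TOKEN"}
# Offer the trial only when NONE of the four provider keys are present.
print("offer-trial" if not providers & set(evars) else "skip-to-provider-choice")
')
echo "$DECISION"
```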
### Step 3b: Choose Provider
Use AskUserQuestion:
- Header: "Provider"
- Question: "Which AI provider would you like to configure for your workflows?"
- Options:
- "OpenAI" / "GPT models, DALL-E image generation, and more"
- "Google Gemini" / "Gemini models for text, vision, and image generation"
Based on their choice, ask for the provider API key using AskUserQuestion:
- Header: "Provider Key"
- Question: "Paste your {OpenAI/Google} API key:"
- Options:
- "I'll type it in" / "Let me paste my API key" (they will use the "Other" option)
- "Where do I get one?" / "Help me find or create an API key"
If they need help getting a key:
- OpenAI: "Get your API key at https://platform.openai.com/api-keys"
  ```bash
  open "https://platform.openai.com/api-keys" 2>/dev/null || xdg-open "https://platform.openai.com/api-keys" 2>/dev/null
  ```
- Gemini: "Get your API key at https://aistudio.google.com/apikey"
  ```bash
  open "https://aistudio.google.com/apikey" 2>/dev/null || xdg-open "https://aistudio.google.com/apikey" 2>/dev/null
  ```
### Step 3c: Store the Provider Key in Collection Environment Variables
First, identify which collection to use. If the user has multiple collections, ask them to choose. If they have one, use it automatically.
Before storing, confirm with the user using AskUserQuestion:
- Header: "Confirm"
- Question: "I'll now send your {provider} API key to Jetty's server so your workflows can use it. The key is stored server-side in your collection's environment variables and is NOT saved locally. Proceed?"
- Options:
- "Yes, store it" / "Send my API key to Jetty"
- "Cancel" / "Don't store the key"
If the user cancels, skip this step and warn them the demo won't work without a provider key.
Then store the key by reading it interactively via `read -rs` and piping it directly to the API. Never embed the provider key in a generated command, heredoc, temp file, or variable assignment visible in tool-call output.

For OpenAI:
```bash
COLLECTION="the-collection-name"
echo "Paste your OpenAI API key and press Enter:"
read -rs PROVIDER_KEY && \
printf '{"environment_variables": {"OPENAI_API_KEY": "%s"}}' "$PROVIDER_KEY" | \
curl -s -X PATCH -H "Authorization: Bearer $(cat ~/.config/jetty/token)" \
  -H "Content-Type: application/json" \
  "https://flows-api.jetty.io/api/v1/collections/$COLLECTION/environment" \
  --data-binary @- && \
unset PROVIDER_KEY
```

For Gemini:
```bash
COLLECTION="the-collection-name"
echo "Paste your Gemini API key and press Enter:"
read -rs PROVIDER_KEY && \
printf '{"environment_variables": {"GEMINI_API_KEY": "%s"}}' "$PROVIDER_KEY" | \
curl -s -X PATCH -H "Authorization: Bearer $(cat ~/.config/jetty/token)" \
  -H "Content-Type: application/json" \
  "https://flows-api.jetty.io/api/v1/collections/$COLLECTION/environment" \
  --data-binary @- && \
unset PROVIDER_KEY
```

Important: The `read -rs` command reads input silently (no echo). The key flows from stdin directly into `printf` and then into `curl` via pipe, so it never appears in any generated command text, temp file, or shell history.

Verify the key was stored (only print key names, never values):
```bash
TOKEN="$(cat ~/.config/jetty/token)"
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://flows-api.jetty.io/api/v1/collections/$COLLECTION" | python3 -c "import sys,json; d=json.load(sys.stdin); evars=d.get('environment_variables',{}); print('Stored keys:', list(evars.keys()) if evars else 'none')"
```

Tell the user:

"Your {provider} API key has been stored in your Jetty collection's server-side environment. Workflows will use it automatically. The key was not saved to any local file."
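One caveat about the `printf` JSON template used above: a key containing `"`, `\`, or `%` would produce invalid JSON or be misinterpreted by `printf`. A more robust sketch builds the payload with Python's `json.dumps`, which escapes everything correctly. The key value below is a dummy chosen to exercise the awkward characters; in the real flow it would come from `read -rs`.

```bash
# Dummy key with characters that would break a naive printf template.
PROVIDER_KEY='sk-demo"quote%percent\backslash'

# json.dumps handles all escaping, so the payload is always valid JSON.
PAYLOAD=$(python3 -c 'import json, sys; print(json.dumps({"environment_variables": {"OPENAI_API_KEY": sys.argv[1]}}))' "$PROVIDER_KEY")
echo "$PAYLOAD"
# The payload would then be piped to: curl ... --data-binary @-
```

Passing the secret as `sys.argv[1]` keeps it out of the command text itself only if the variable was populated via `read -rs`; the same `unset PROVIDER_KEY` hygiene still applies afterward.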
### Step 3d: Agent Runtime Key (for Runbooks)
Runbooks execute inside a coding agent on Jetty. The agent needs its own API key (separate from the image generation provider key above).
Use AskUserQuestion:
- Header: "Agent Runtime"
- Question: "Jetty runbooks run inside a coding agent. Which will you use?"
- Options:
- "Claude Code" / "Anthropic's claude-sonnet-4-6. Needs an Anthropic API key (~$3/MTok input)"
- "Codex" / "OpenAI's gpt-5.4. Needs an OpenAI API key"
- "Gemini CLI" / "Google's gemini-3.1-pro-preview. Needs a Google AI API key"
- "Skip for now" / "I'll configure this later when I need runbooks"
If the user chooses "Skip", move on to Step 4.
Otherwise, check if the required key already exists in the collection env vars:
- Claude Code → `ANTHROPIC_API_KEY`
- Codex → `OPENAI_API_KEY` (may already exist from the provider step above)
- Gemini CLI → `GOOGLE_API_KEY` (may already exist from the provider step above)
```bash
TOKEN="$(cat ~/.config/jetty/token)"
COLLECTION="the-collection-name"
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://flows-api.jetty.io/api/v1/collections/$COLLECTION" \
  | python3 -c "import sys,json; d=json.load(sys.stdin); evars=d.get('environment_variables',{}); print('Stored keys:', list(evars.keys()) if evars else 'none')"
```

If the key exists, tell the user:
"Your {agent} API key is already configured. You're ready to run runbooks!"
If the key is missing, ask the user to paste it and store it using the same secure pattern as the provider key (`read -rs` → pipe to `curl` PATCH → `unset`).

Help links if they need a key:
- Anthropic: "Get your key at https://console.anthropic.com/settings/keys"
  ```bash
  open "https://console.anthropic.com/settings/keys" 2>/dev/null || xdg-open "https://console.anthropic.com/settings/keys" 2>/dev/null
  ```
- OpenAI: "Get your key at https://platform.openai.com/api-keys"
- Google: "Get your key at https://aistudio.google.com/apikey"
## Step 4: Introduce Runbooks
Now that the user has a working Jetty account and API keys, introduce the concept of runbooks.
Tell the user:
**What's a runbook?** A runbook is a human-readable markdown file that describes a series of steps for a coding agent to follow — like a recipe for automation. Here's what makes them powerful:
- Plain markdown — You can read, edit, and version-control them just like any other document
- Agent-executed — A coding agent (Claude Code, Codex, Gemini CLI) reads the runbook and carries out each step autonomously
- Measurable outcomes — Every runbook ends with a concrete, verifiable result (a report, a dataset, a set of passing tests)
- Multi-step with judgment — Runbooks can include evaluation loops where the agent checks its own work and iterates until the result meets a quality bar
- API-connected — Tasks can interact with any system you give them access to via API keys stored in your Jetty collection. They can call external APIs, query databases, process files, and more
- Long-running — Unlike a quick chat response, runbook tasks typically run for several minutes (up to 60), working through complex multi-step processes end to end
Think of a runbook as the difference between asking someone a question and handing them a detailed project brief.
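To make this concrete, a minimal runbook might look like the sketch below. The structure, filenames, and wording are purely illustrative assumptions; the `/create-runbook` wizard in Step 6 produces the real format.

```markdown
# Translate Product Descriptions

## Goal
Translate every product description in `products.csv` to Spanish,
preserving brand names and key specs.

## Steps
1. Read `products.csv` and collect the `description` column.
2. Translate each description to Spanish.
3. Check each translation: the brand name and all numeric specs must
   appear unchanged. If a check fails, re-translate that row (up to 3 tries).
4. Write the results to `products_es.csv`.

## Outcome
`products_es.csv` with one validated Spanish description per input row,
plus a short report of any rows that needed retries.
```

Note how it ends with a concrete, verifiable outcome and includes an evaluation loop (step 3), the two traits highlighted above.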
## Step 5: Suggest a Starter Runbook
Use AskUserQuestion:
- Header: "Your First Runbook"
- Question: "What kind of task would you like to automate? Pick a starter template or describe your own."
- Options:
- "Data extraction" / "Extract structured data from documents, validate against a schema, and produce a quality report"
- "Content generation" / "Generate content from a brief, score it against a rubric, and iterate until it meets a quality bar"
- "Testing & regression" / "Run a test suite or replay queries against an API, evaluate pass/fail, and produce a regression report"
- "Something else" / "I'll describe what I want to automate"
If the user picks a template:
Briefly describe what the chosen template does:
Data extraction:
"This runbook will pull data from a source you specify (documents, APIs, web pages), extract structured fields, validate them against a schema, and iterate on any errors — then produce a summary report."
Content generation:
"This runbook will take a brief or prompt, generate content (text, images, code — whatever you need), evaluate the output against quality criteria you define, and refine it until it's good enough."
Testing & regression:
"This runbook will run a set of test cases against an API or system, compare results to expected outcomes, and produce a pass/fail regression report with details on any failures."
Then ask for specifics using AskUserQuestion:
- Header: "Describe Your Task"
- Question: "Now describe your specific use case in a sentence or two. What goes in, what processing happens, and what comes out? For example: 'Pull product descriptions from our CSV, translate them to Spanish, and check that each translation preserves the brand name and key specs.'"
- Options:
- "I'll describe it" / "Let me type my use case" (user types in the text field)
If the user chose "Something else":
Use AskUserQuestion:
- Header: "Describe Your Task"
- Question: "Describe the task you'd like to automate in simple terms. What goes in, what processing happens, and what should come out at the end? Remember — any system you can reach via an API key, the agent can interact with too."
- Options:
- "I'll describe it" / "Let me type a description" (user types in the text field)
- "Show me more examples" / "I'd like to see more ideas first"
If "Show me more examples", display:
Example runbook tasks people have built:
- NL-to-SQL Regression — Pull failed queries from a log, replay them against an NL-to-SQL API, execute on a database, evaluate pass/fail, produce a regression report
- PDF-to-Metadata Conversion — Extract metadata from academic PDFs, generate structured JSON-LD, validate against a schema, iterate on errors
- Branded Social Graphics — Parse a text script, generate AI images, compose HTML with overlays, judge against a brand rubric, iterate until on-brand
- Clinical Training Content — Parse competency documents, generate training scenarios, score with a rubric, produce learning plans
- API Health Monitor — Hit a list of endpoints, compare response shapes to expected schemas, flag regressions, produce a status report
Then re-ask the description question.
Save the user's task description for use in the next step.
## Step 6: Hand Off to Create-Runbook
Now that you have the user's task description, hand off to the create-runbook skill to scaffold their runbook.
Tell the user:
"Great — I have enough to get started. I'm going to hand you off to the runbook creation wizard, which will walk you through building your runbook step by step."
Then invoke the `/create-runbook` skill with the user's task description as the argument. If the agent platform doesn't support skill invocation, tell the user:

"Run `/create-runbook <their task description>` to start building your runbook."
## Step 7: Next Steps
After the runbook is created (or if the user wants to come back later), tell the user:
"You're all set! Here's what you can do next:

- Run your runbook on Jetty: `/jetty run-runbook <path-to-your-runbook>`
- Create another runbook: `/create-runbook` — the wizard will guide you through it
- Optimize a runbook after a few runs: `/optimize-runbook` — analyzes past executions and suggests improvements
- Manage your workflows and tasks: `/jetty list tasks` — see everything in your collection
- Check execution history: `/jetty show trajectories` — see all past runs and their results

The `/jetty` command is your gateway to the full Jetty platform. Just describe what you want in natural language."
## Important Notes
- Read the token from file: Use `TOKEN="$(cat ~/.config/jetty/token)"` at the start of each bash command block. Environment variables do not persist between bash invocations.
- Never log credentials: Do not echo, print, or include tokens/keys in output shown to the user. Use redacted forms like `mlc_...xxxx`.
- Read secrets interactively via `read -rs`: Never embed secrets in generated commands, heredocs, or temp files. Use `read -rs VAR && printf ... "$VAR" | curl --data-binary @-` so the secret flows from the user's terminal directly into the API call without appearing in tool-call output or shell history. Always `unset` the variable immediately after use.
- URL disambiguation: Use `flows-api.jetty.io` for all API calls (workflows, collections, tasks, trajectories, files). NEVER use `flows.jetty.io` for API calls (it's the web frontend).
- Trajectories response shape: The list endpoint returns `{"trajectories": [...]}` — always access via `.trajectories[]`.
- Steps are objects, not arrays: Trajectory steps are keyed by step name (e.g., `.steps.expand_prompt`), not by index.
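The last two notes can be illustrated together on a hypothetical trajectories response (the sample body below is invented; only the `.trajectories[]` wrapper and name-keyed `steps` shape are taken from the notes above):

```bash
# Hypothetical list response: trajectories under a wrapper key,
# steps keyed by step name rather than array index.
SAMPLE='{"trajectories":[{"id":"t1","steps":{"expand_prompt":{"status":"ok"}}}]}'

RESULT=$(echo "$SAMPLE" | python3 -c '
import sys, json
d = json.load(sys.stdin)
for t in d["trajectories"]:             # access via .trajectories[], not the top level
    step = t["steps"]["expand_prompt"]  # steps is an object keyed by name
    print(t["id"], step["status"])
')
echo "$RESULT"
```

With `jq` the equivalent accessors would be `.trajectories[]` and `.steps.expand_prompt`.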