llm-router

LLM Router

Overview

Route AI requests to different LLM providers using SwiftOpenAI-CLI's agent mode. This skill automatically configures the CLI to use the requested provider (OpenAI, Grok, Groq, DeepSeek, or OpenRouter), ensures the tool is installed and up-to-date, and executes one-shot agentic tasks.

Core Workflow

When a user requests to use a specific LLM provider (e.g., "use grok to explain quantum computing"), follow this workflow:

Step 1: Ensure SwiftOpenAI-CLI is Ready

Check if SwiftOpenAI-CLI is installed and up-to-date:
```bash
scripts/check_install_cli.sh
```
This script will:
  • Check if `swiftopenai` is installed
  • Verify the version (minimum 1.4.4)
  • Install or update it if necessary
  • Report the current installation status
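The minimum-version gate in such a script can be done with `sort -V`; the following is only an illustrative sketch (the bundled `check_install_cli.sh` may work differently):

```bash
# Illustrative sketch of a version gate; the bundled script may differ.
# Relies on a `sort` that supports -V (version sort).
MIN_VERSION="1.4.4"

# version_ge A B -> succeeds if version A >= version B
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n 1)" = "$2" ]
}

if version_ge "1.5.0" "$MIN_VERSION"; then
  echo "swiftopenai is up to date"
else
  echo "swiftopenai needs an update"
fi
```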

Step 2: Configure the Provider

Based on the user's request, identify the target provider and configure SwiftOpenAI-CLI:
```bash
scripts/configure_provider.sh <provider> [model]
```
Supported providers:
  • `openai` - OpenAI (GPT-4, GPT-5, etc.)
  • `grok` - xAI Grok models
  • `groq` - Groq (Llama, Mixtral, etc.)
  • `deepseek` - DeepSeek models
  • `openrouter` - OpenRouter (300+ models)
Examples:
```bash
# Configure for Grok
scripts/configure_provider.sh grok grok-4-0709

# Configure for Groq with Llama
scripts/configure_provider.sh groq llama-3.3-70b-versatile

# Configure for DeepSeek Reasoner
scripts/configure_provider.sh deepseek deepseek-reasoner

# Configure for OpenAI GPT-5
scripts/configure_provider.sh openai gpt-5
```

The script automatically:
- Sets the provider configuration
- Sets the appropriate base URL
- Sets the default model
- Provides guidance on API key configuration
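Internally, the provider-to-endpoint mapping can be pictured as a simple case statement. This is only a sketch: the URLs shown are the providers' public OpenAI-compatible endpoints, and the bundled script (and the exact config keys it sets) may differ:

```bash
# Hypothetical sketch of the mapping configure_provider.sh performs;
# the real script may differ.
provider_base_url() {
  case "$1" in
    openai)     echo "https://api.openai.com/v1" ;;
    grok)       echo "https://api.x.ai/v1" ;;
    groq)       echo "https://api.groq.com/openai/v1" ;;
    deepseek)   echo "https://api.deepseek.com" ;;
    openrouter) echo "https://openrouter.ai/api/v1" ;;
    *)          echo "unknown" ;;
  esac
}

provider_base_url grok
# The script would then persist the choice (e.g. via
# `swiftopenai config set ...`) and pick a default model.
```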

Step 3: Verify API Key

The configuration script automatically checks whether an API key is set and stops with clear instructions if none is found.
**If the API key is missing:** the script exits with error code 1 and displays:
  • ⚠️ A warning that the API key is not set
  • Instructions for setting it via an environment variable
  • Instructions for setting it via the config file (persistent)
Do not proceed to Step 4 if the configuration script fails due to a missing API key. Instead, inform the user that they need to set their API key first:
Option 1 - Environment variable (session only):
```bash
export XAI_API_KEY=xai-...           # for Grok
export GROQ_API_KEY=gsk_...          # for Groq
export DEEPSEEK_API_KEY=sk-...       # for DeepSeek
export OPENROUTER_API_KEY=sk-or-...  # for OpenRouter
export OPENAI_API_KEY=sk-...         # for OpenAI
```
Option 2 - Config file (persistent):
```bash
swiftopenai config set api-key <api-key-value>
```
After the user sets their API key, re-run the configuration script to verify.
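The missing-key guard behaves roughly like the following sketch (a hypothetical helper, not the bundled script's actual code):

```bash
# Hypothetical sketch of the API-key guard in configure_provider.sh;
# the bundled script may differ. Fails (non-zero) when the variable
# named by $1 is empty or unset.
require_key() {
  var_name="$1"
  eval "value=\${$var_name:-}"
  if [ -z "$value" ]; then
    echo "⚠️  $var_name is not set" >&2
    echo "Set it with: export $var_name=..." >&2
    return 1
  fi
}

XAI_API_KEY="xai-example"
require_key XAI_API_KEY && echo "API key present, safe to proceed"
```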

Step 4: Execute the Agentic Task

Run the user's request using agent mode:
```bash
swiftopenai agent "<user's question or task>"
```
Agent mode features:
  • One-shot task execution
  • Built-in tool calling
  • MCP (Model Context Protocol) integration support
  • Conversation memory with session IDs
  • Multiple output formats
Examples:
```bash
# Simple question
swiftopenai agent "What is quantum entanglement?"

# With a specific model override
swiftopenai agent "Write a Python function" --model grok-3

# With a session for conversation continuity
swiftopenai agent "Remember my name is Alice" --session-id chat-123
swiftopenai agent "What's my name?" --session-id chat-123

# With MCP tools (filesystem example)
swiftopenai agent "Read the README.md file" \
  --mcp-servers filesystem \
  --allowed-tools "mcp__filesystem__*"
```

Usage Patterns

Pattern 1: Simple Provider Routing

User Request: "Use grok to explain quantum computing"
Execution:
```bash
# 1. Check CLI installation
scripts/check_install_cli.sh

# 2. Configure for Grok
scripts/configure_provider.sh grok grok-4-0709

# 3. Execute the task
swiftopenai agent "Explain quantum computing"
```

Pattern 2: Specific Model Selection

User Request: "Ask DeepSeek Reasoner to solve this math problem step by step"
Execution:
```bash
# 1. Check CLI installation
scripts/check_install_cli.sh

# 2. Configure for DeepSeek with the Reasoner model
scripts/configure_provider.sh deepseek deepseek-reasoner

# 3. Execute with explicit model
swiftopenai agent "Solve x^2 + 5x + 6 = 0 step by step" --model deepseek-reasoner
```

Pattern 3: Fast Inference with Groq

User Request: "Use groq to generate code quickly"
Execution:
```bash
# 1. Check CLI installation
scripts/check_install_cli.sh

# 2. Configure for Groq (known for fast inference)
scripts/configure_provider.sh groq llama-3.3-70b-versatile

# 3. Execute the task
swiftopenai agent "Write a function to calculate Fibonacci numbers"
```

Pattern 4: Access Multiple Models via OpenRouter

User Request: "Use OpenRouter to access Claude"
Execution:
```bash
# 1. Check CLI installation
scripts/check_install_cli.sh

# 2. Configure for OpenRouter
scripts/configure_provider.sh openrouter anthropic/claude-3.5-sonnet

# 3. Execute with Claude via OpenRouter
swiftopenai agent "Explain the benefits of functional programming"
```

Provider-Specific Considerations

OpenAI (GPT-5 Models)

GPT-5 models support advanced parameters:
```bash
# Minimal reasoning for fast coding tasks
swiftopenai agent "Write a sort function" \
  --model gpt-5 \
  --reasoning minimal \
  --verbose low

# High reasoning for complex problems
swiftopenai agent "Explain quantum mechanics" \
  --model gpt-5 \
  --reasoning high \
  --verbose high
```

**Verbosity levels:** `low`, `medium`, `high`
**Reasoning effort:** `minimal`, `low`, `medium`, `high`

Grok (xAI)

Grok models are optimized for real-time information and coding:
  • `grok-4-0709` - Latest, with enhanced reasoning
  • `grok-3` - General purpose
  • `grok-code-fast-1` - Optimized for code generation

Groq

Known for ultra-fast inference with open-source models:
  • `llama-3.3-70b-versatile` - Best general purpose
  • `mixtral-8x7b-32768` - Mixture-of-experts model

DeepSeek

Specialized in reasoning and coding:
  • `deepseek-reasoner` - Advanced step-by-step reasoning
  • `deepseek-coder` - Coding specialist
  • `deepseek-chat` - General chat

OpenRouter

Provides access to 300+ models:
  • Anthropic Claude models
  • OpenAI models
  • Google Gemini models
  • Meta Llama models
  • And many more

API Key Management

Recommended: Use Environment Variables for Multiple Providers

The best practice for using multiple providers is to set all API keys as environment variables. This allows seamless switching between providers without reconfiguring keys.
Add to your shell profile (`~/.zshrc` or `~/.bashrc`):

```bash
# API keys for LLM providers
export OPENAI_API_KEY=sk-...
export XAI_API_KEY=xai-...
export GROQ_API_KEY=gsk_...
export DEEPSEEK_API_KEY=sk-...
export OPENROUTER_API_KEY=sk-or-v1-...
```

After adding these, reload your shell:

```bash
source ~/.zshrc  # or source ~/.bashrc
```

How it works:
  • SwiftOpenAI-CLI automatically uses the correct provider-specific key based on the configured provider
  • When you switch to Grok, it uses `XAI_API_KEY`
  • When you switch to OpenAI, it uses `OPENAI_API_KEY`
  • There is no need to reconfigure keys each time
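The per-provider key lookup can be sketched as a provider-to-variable map. This is hypothetical: the CLI's real resolution logic is internal and may differ.

```bash
# Hypothetical sketch of how a provider maps to its API-key variable;
# SwiftOpenAI-CLI's actual resolution logic may differ.
key_var_for_provider() {
  case "$1" in
    openai)     echo "OPENAI_API_KEY" ;;
    grok)       echo "XAI_API_KEY" ;;
    groq)       echo "GROQ_API_KEY" ;;
    deepseek)   echo "DEEPSEEK_API_KEY" ;;
    openrouter) echo "OPENROUTER_API_KEY" ;;
  esac
}

key_var_for_provider grok   # prints XAI_API_KEY
```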

Alternative: Single API Key via Config (Not Recommended for Multiple Providers)

If you only use one provider, you can store the key in the config file:
```bash
swiftopenai config set api-key <your-key>
```
**Limitation:** the config file stores only ONE API key. If you switch providers, you need to reconfigure the key each time.

Checking Current API Key

```bash
# View current configuration (API key is masked)
swiftopenai config list

# Get the specific API key setting
swiftopenai config get api-key
```

**Priority:** Provider-specific environment variables take precedence over config file settings.

Advanced Features

Interactive Configuration

For complex setups, use the interactive wizard:
```bash
swiftopenai config setup
```
This launches a guided setup that walks through:
  • Provider selection
  • API key entry
  • Model selection
  • Debug mode configuration
  • Base URL setup (if needed)

Session Management

Maintain conversation context across multiple requests:
```bash
# Start a session
swiftopenai agent "My project is a React app" --session-id project-123

# Continue the session
swiftopenai agent "What framework did I mention?" --session-id project-123
```

MCP Tool Integration

Connect to external services via Model Context Protocol:
```bash
# With GitHub MCP
swiftopenai agent "List my repos" \
  --mcp-servers github \
  --allowed-tools "mcp__github__*"

# With filesystem MCP
swiftopenai agent "Read package.json and explain dependencies" \
  --mcp-servers filesystem \
  --allowed-tools "mcp__filesystem__*"

# Multiple MCP servers
swiftopenai agent "Complex task" \
  --mcp-servers github,filesystem,postgres \
  --allowed-tools "mcp__*"
```

Output Formats

Control how results are presented:
```bash
# Plain text (default)
swiftopenai agent "Calculate 5 + 3" --output-format plain

# Structured JSON
swiftopenai agent "List 3 colors" --output-format json

# Streaming JSON events (Claude SDK style)
swiftopenai agent "Analyze data" --output-format stream-json
```

Troubleshooting

Common Issues

**Issue: "swiftopenai: command not found"**

Solution: Run the `check_install_cli.sh` script, which installs the CLI automatically.

**Issue: Authentication errors**

Solution: Verify that the correct API key is set for the provider:

```bash
# Check current config
swiftopenai config list

# Set the appropriate API key
swiftopenai config set api-key <your-key>

# Or use an environment variable
export XAI_API_KEY=xai-...  # for Grok
```

**Issue: Model not available**

Solution: Verify that the model name matches the provider's available models. Check `references/providers.md` for correct model names, or run:

```bash
swiftopenai models
```

**Issue: Rate limiting or quota errors**

Solution: These are provider-specific limits. Consider:
  • Using a different model tier
  • Switching to a different provider temporarily
  • Checking your API usage dashboard

Debug Mode

Enable debug mode to see detailed HTTP information:
```bash
swiftopenai config set debug true
```
This shows:
  • HTTP status codes and headers
  • API request details
  • Response metadata

Resources

This skill includes bundled resources to support LLM routing:

scripts/

  • `check_install_cli.sh` - Ensures SwiftOpenAI-CLI is installed and up-to-date
  • `configure_provider.sh` - Configures the CLI for a specific provider

references/

  • `providers.md` - Comprehensive reference on all supported providers, models, configurations, and capabilities

Best Practices

  1. Always check installation first - Run `check_install_cli.sh` before routing requests
  2. Configure the provider explicitly - Use `configure_provider.sh` to ensure correct setup
  3. Verify API keys - Check that the appropriate API key is set for the target provider
  4. Choose the right model - Match the model to the task (coding, reasoning, general chat)
  5. Use sessions for continuity - Leverage `--session-id` for multi-turn conversations
  6. Enable debug mode for troubleshooting - When issues arise, debug mode provides valuable insights
  7. Reference the provider documentation - Consult `references/providers.md` for detailed provider information

Examples

Example 1: Routing to Grok for Real-Time Information

```bash
# User: "Use grok to tell me about recent AI developments"
scripts/check_install_cli.sh
scripts/configure_provider.sh grok grok-4-0709
swiftopenai agent "Tell me about recent AI developments"
```

Example 2: Using DeepSeek for Step-by-Step Reasoning

```bash
# User: "Ask deepseek to explain how to solve this algorithm problem"
scripts/check_install_cli.sh
scripts/configure_provider.sh deepseek deepseek-reasoner
swiftopenai agent "Explain step by step how to implement quicksort"
```

Example 3: Fast Code Generation with Groq

```bash
# User: "Use groq to quickly generate a REST API"
scripts/check_install_cli.sh
scripts/configure_provider.sh groq llama-3.3-70b-versatile
swiftopenai agent "Generate a REST API with authentication in Python"
```

Example 4: Accessing Claude via OpenRouter

```bash
# User: "Use openrouter to access claude and write documentation"
scripts/check_install_cli.sh
scripts/configure_provider.sh openrouter anthropic/claude-3.5-sonnet
swiftopenai agent "Write comprehensive documentation for a todo app API"
```

Example 5: GPT-5 with Custom Parameters

```bash
# User: "Use gpt-5 with high reasoning to solve this complex problem"
scripts/check_install_cli.sh
scripts/configure_provider.sh openai gpt-5
swiftopenai agent "Design a distributed caching system" \
  --model gpt-5 \
  --reasoning high \
  --verbose high
```