tauri-dev
# Tauri 2.0 Development Skill
## Architecture Overview
LocalCowork uses Tauri 2.0 with a three-layer architecture. This skill covers
the Rust backend (middle layer) and its integration with both the React frontend
(top layer) and the MCP servers + inference backend (bottom layer).
```
React Frontend ←──Tauri IPC──→ Rust Backend ←──JSON-RPC/stdio──→ MCP Servers
                                     │
                                     └──OpenAI API──→ Local LLM (Ollama/llama.cpp)
```

## Key References
- docs/architecture-decisions/001-tauri-over-electron.md — why Tauri
- docs/architecture-decisions/003-model-abstraction-layer.md — the OpenAI API contract
- docs/patterns/human-in-the-loop.md — confirmation/undo flow
- docs/patterns/context-window-management.md — 32k token budget
## Rust Backend Modules
### agent_core/conversation.rs — ConversationManager
Manages conversation state, history, and persistence.
```rust
pub struct ConversationManager {
    db: SqlitePool, // Conversation history stored in SQLite
    current_session: Session,
    context_manager: ContextWindowManager,
}

impl ConversationManager {
    /// Create a new conversation session
    pub async fn new_session(&mut self) -> Result<SessionId>;

    /// Add a user message and get the model's response
    pub async fn send_message(&mut self, message: &str) -> Result<MessageStream>;

    /// Get conversation history for the context window
    pub fn get_history(&self, max_tokens: usize) -> Vec<Message>;

    /// Persist a message to SQLite
    async fn persist_message(&self, message: &Message) -> Result<()>;
}
```
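The budget-aware behavior of `get_history` can be sketched with std-only code. The newest-first retention policy and the ~4-characters-per-token estimate below are illustrative assumptions, not the project's actual implementation (which would use a real tokenizer):

```rust
#[derive(Clone, Debug, PartialEq)]
struct Message {
    role: &'static str,
    content: String,
}

/// Rough token estimate: ~4 characters per token. A real implementation
/// would use a tokenizer such as tiktoken-rs.
fn estimate_tokens(text: &str) -> usize {
    (text.chars().count() + 3) / 4
}

/// Return the most recent messages that fit within `max_tokens`,
/// preserving chronological order; older messages are dropped first.
fn get_history(history: &[Message], max_tokens: usize) -> Vec<Message> {
    let mut budget = max_tokens;
    let mut kept: Vec<Message> = Vec::new();
    for msg in history.iter().rev() {
        let cost = estimate_tokens(&msg.content);
        if cost > budget {
            break; // everything older is evicted
        }
        budget -= cost;
        kept.push(msg.clone());
    }
    kept.reverse(); // restore oldest-to-newest order
    kept
}
```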
### agent_core/tool_router.rs — ToolRouter
Routes model tool calls to the appropriate MCP server.
```rust
pub struct ToolRouter {
    mcp_client: MCPClient,
    audit_logger: AuditLogger,
}

impl ToolRouter {
    /// Process a tool call from the model
    pub async fn dispatch(&self, tool_call: ToolCall) -> Result<ToolResult> {
        // 1. Look up the tool in the MCP registry
        // 2. Check if confirmation is required
        // 3. If confirmed (or not required): send JSON-RPC call to server
        // 4. Log to audit trail
        // 5. If undo supported: push to undo stack
        // 6. Return result
    }

    /// Check if a tool requires user confirmation
    fn requires_confirmation(&self, tool_name: &str) -> bool;

    /// Push to undo stack for reversible actions
    async fn push_undo(&self, tool_call: &ToolCall, result: &ToolResult) -> Result<()>;
}
```
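Steps 1 and 2 of `dispatch` amount to a confirmation gate. A minimal std-only sketch, where the tool names and the destructive/safe split are illustrative assumptions rather than the project's real policy:

```rust
use std::collections::HashSet;

#[derive(Debug, PartialEq)]
enum Gate {
    Execute,           // safe tool: run immediately
    NeedsConfirmation, // destructive tool: ask the user first
    Unknown,           // not in the registry: reject
}

/// Decide how to route a tool call before dispatching it.
fn gate_for(tool_name: &str, registry: &HashSet<&str>, destructive: &HashSet<&str>) -> Gate {
    if !registry.contains(tool_name) {
        Gate::Unknown
    } else if destructive.contains(tool_name) {
        Gate::NeedsConfirmation
    } else {
        Gate::Execute
    }
}
```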
### agent_core/context_window.rs — ContextWindowManager
Manages the 32k token budget. See docs/patterns/context-window-management.md.

```rust
pub struct ContextWindowManager {
    max_tokens: usize,    // 32,768
    tokenizer: Tokenizer, // tiktoken-rs
    system_prompt: String,
    tool_definitions: String,
}

impl ContextWindowManager {
    /// Build the full prompt for the model
    pub fn build_prompt(
        &self,
        history: &[Message],
        active_context: Option<&str>,
    ) -> Result<Vec<ChatMessage>>;

    /// Count tokens for a string
    pub fn count_tokens(&self, text: &str) -> usize;

    /// Evict old messages when context is tight
    fn evict_oldest(&mut self, history: &mut Vec<Message>);
}
```
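The arithmetic behind `build_prompt` is simple: the system prompt and tool definitions are fixed overhead, and what remains of the 32,768-token window is available for history (plus whatever is reserved for the response). The helper and the numbers below are illustrative assumptions:

```rust
/// Tokens left for conversation history after fixed overhead is
/// subtracted from the window; saturates at zero rather than underflowing.
fn history_budget(
    max_tokens: usize,
    system_tokens: usize,
    tool_tokens: usize,
    response_reserve: usize,
) -> usize {
    max_tokens.saturating_sub(system_tokens + tool_tokens + response_reserve)
}
```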
### mcp_client/ — MCP Client
Manages MCP server processes and JSON-RPC communication.
```rust
pub struct MCPClient {
    servers: HashMap<String, MCPServerProcess>,
    tool_registry: ToolRegistry,
}

impl MCPClient {
    /// Start all configured MCP servers
    pub async fn start_servers(&mut self, config: &MCPConfig) -> Result<()>;

    /// Get the aggregated tool definitions for the LLM
    pub fn get_tool_definitions(&self) -> Vec<ToolDefinition>;

    /// Send a tool call to the appropriate server
    pub async fn call_tool(&self, name: &str, args: Value) -> Result<Value>;

    /// Gracefully shutdown all servers
    pub async fn shutdown(&mut self) -> Result<()>;
}

struct MCPServerProcess {
    child: Child, // tokio::process::Child
    stdin: ChildStdin,
    stdout: BufReader<ChildStdout>,
    tools: Vec<ToolDefinition>,
}
```
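On the wire, `call_tool` sends JSON-RPC 2.0 over the child's stdin as newline-delimited JSON. A std-only sketch of framing a `tools/call` request; a real client would serialize with serde_json rather than hand-formatting, and the example arguments are assumptions:

```rust
/// Build one newline-delimited JSON-RPC 2.0 `tools/call` request line.
/// `args_json` must already be a valid JSON object string.
fn tools_call_request(id: u64, tool: &str, args_json: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{id},"method":"tools/call","params":{{"name":"{tool}","arguments":{args_json}}}}}"#
    )
}
```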
### inference/ — Inference Client
OpenAI-compatible API client for the local LLM.
```rust
pub struct InferenceClient {
    base_url: String, // e.g., "http://localhost:11434/v1"
    model: String,    // e.g., "qwen2.5:32b-instruct"
    http_client: reqwest::Client,
}

impl InferenceClient {
    /// Send a chat completion request (streaming)
    pub async fn chat_completion(
        &self,
        messages: Vec<ChatMessage>,
        tools: Vec<ToolDefinition>,
    ) -> Result<impl Stream<Item = StreamChunk>>;

    /// Parse tool calls from model response
    fn parse_tool_calls(response: &str) -> Result<Vec<ToolCall>>;
}
```
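Streaming OpenAI-compatible responses arrive as Server-Sent Events: each chunk is a `data: <json>` line and the stream ends with `data: [DONE]`. A std-only sketch of splitting an SSE body into JSON payloads (each payload would then be parsed with serde_json); the example body is an assumption:

```rust
/// Extract the JSON payloads from an SSE body, stopping at `[DONE]`.
fn sse_payloads(body: &str) -> Vec<&str> {
    body.lines()
        .filter_map(|line| line.strip_prefix("data: "))
        .take_while(|payload| *payload != "[DONE]")
        .collect()
}
```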
## Tauri IPC Commands
The frontend communicates with the Rust backend via Tauri commands.
```rust
// src-tauri/src/commands/chat.rs
#[tauri::command]
async fn send_message(
    state: tauri::State<'_, AppState>,
    message: String,
) -> Result<String, String> {
    let mut conv = state.conversation_manager.lock().await;
    let response = conv.send_message(&message).await
        .map_err(|e| e.to_string())?;
    Ok(response)
}

#[tauri::command]
async fn confirm_action(
    state: tauri::State<'_, AppState>,
    action_id: String,
    confirmed: bool,
) -> Result<(), String> {
    let router = state.tool_router.lock().await;
    if confirmed {
        router.execute_confirmed(&action_id).await.map_err(|e| e.to_string())?;
    } else {
        router.reject_action(&action_id).await.map_err(|e| e.to_string())?;
    }
    Ok(())
}

#[tauri::command]
async fn undo_last_action(
    state: tauri::State<'_, AppState>,
) -> Result<String, String> {
    let router = state.tool_router.lock().await;
    router.undo_last().await.map_err(|e| e.to_string())
}
```

Frontend invocation:

```typescript
import { invoke } from '@tauri-apps/api/core';

const response = await invoke<string>('send_message', { message: userInput });
await invoke('confirm_action', { actionId: 'act-001', confirmed: true });
await invoke('undo_last_action');
```
## Tauri Permissions (tauri.conf.json)
Each capability is granted explicitly:
json
{
"app": {
"security": {
"capabilities": [
{
"identifier": "filesystem-access",
"description": "Access user-granted directories",
"permissions": [
"fs:allow-read",
"fs:allow-write",
"fs:scope-$DOCUMENTS",
"fs:scope-$DOWNLOADS"
]
},
{
"identifier": "process-management",
"description": "Manage MCP server child processes",
"permissions": [
"shell:allow-spawn",
"shell:allow-kill"
]
},
{
"identifier": "clipboard-access",
"permissions": ["clipboard-manager:allow-read", "clipboard-manager:allow-write"]
}
]
}
}
}每个能力都需要显式授权:
json
{
"app": {
"security": {
"capabilities": [
{
"identifier": "filesystem-access",
"description": "Access user-granted directories",
"permissions": [
"fs:allow-read",
"fs:allow-write",
"fs:scope-$DOCUMENTS",
"fs:scope-$DOWNLOADS"
]
},
{
"identifier": "process-management",
"description": "Manage MCP server child processes",
"permissions": [
"shell:allow-spawn",
"shell:allow-kill"
]
},
{
"identifier": "clipboard-access",
"permissions": ["clipboard-manager:allow-read", "clipboard-manager:allow-write"]
}
]
}
}
## Coding Standards (Rust)
- Edition 2021
- `cargo clippy -- -D warnings` must pass (zero warnings)
- All public functions have doc comments (`///`)
- Error handling: `thiserror` for custom errors, `anyhow` for application errors
- Async: `tokio` runtime (multi-threaded)
- Max 300 lines per file — extract to submodules when approaching the limit
- Use the `tracing` crate for structured logging (integrates with the shared Logger)
- No `unwrap()` in production code — use the `?` operator or explicit error handling
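The `unwrap()` rule above, illustrated with the standard library only (a real codebase would likely wrap the error in a `thiserror`/`anyhow` type; `ParseIntError` keeps the sketch self-contained):

```rust
use std::num::ParseIntError;

/// Parse a port number, propagating failure to the caller with `?`
/// instead of panicking via `s.trim().parse().unwrap()`.
fn parse_port(s: &str) -> Result<u16, ParseIntError> {
    let port: u16 = s.trim().parse()?;
    Ok(port)
}
```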
## Dependencies (Cargo.toml)
Key crates:
```toml
[dependencies]
tauri = { version = "2", features = ["shell-open"] }
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
sqlx = { version = "0.7", features = ["runtime-tokio", "sqlite"] }
reqwest = { version = "0.12", features = ["json", "stream"] }
tiktoken-rs = "0.5"
thiserror = "1"
anyhow = "1"
tracing = "0.1"
tracing-subscriber = "0.3"
```