# Chainlit

Build production-ready conversational AI applications in Python with a rich UI.
## Installation

```bash
pip install chainlit
```

## Quick Start
```python
import chainlit as cl

@cl.on_message
async def on_message(message: cl.Message):
    await cl.Message(content=f"You said: {message.content}").send()
```

Run with:

```bash
chainlit run app.py -w
```

## Core Concepts
| Concept | Description |
|---|---|
| Messages | Text communication between user and assistant |
| Steps | Visible processing stages (LLM calls, tool use) |
| Elements | Rich UI (images, files, charts, dataframes) |
| Actions | Interactive buttons with callbacks |
| Sessions | Per-user state management |
## Lifecycle Hooks
```python
import chainlit as cl

@cl.on_chat_start
async def start():
    cl.user_session.set("history", [])
    await cl.Message(content="Hello!").send()

@cl.on_message
async def on_message(message: cl.Message):
    await cl.Message(content="Got it!").send()

@cl.on_chat_end
async def end():
    print("Session ended")
```
## Streaming Responses
```python
from openai import AsyncOpenAI
import chainlit as cl

client = AsyncOpenAI()
cl.instrument_openai()

@cl.on_message
async def on_message(message: cl.Message):
    msg = cl.Message(content="")
    await msg.send()
    stream = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": message.content}],
        stream=True,
    )
    async for chunk in stream:
        if token := chunk.choices[0].delta.content:
            await msg.stream_token(token)
    await msg.update()
```
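Stripped of the OpenAI and Chainlit specifics, the streaming loop is just "append each delta to a buffer that the UI re-renders". A framework-free sketch of that pattern, where the hypothetical `fake_stream` stands in for the chunk stream:

```python
import asyncio

async def fake_stream():
    # Stand-in for the OpenAI chunk stream: yields token deltas in order.
    for token in ["Hel", "lo", ", ", "world"]:
        yield token

async def collect() -> str:
    buffer = ""          # plays the role of the initially empty cl.Message
    async for token in fake_stream():
        buffer += token  # msg.stream_token(token) appends like this
    return buffer        # msg.update() then finalizes the full text

print(asyncio.run(collect()))  # → Hello, world
```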
## Steps (Chain of Thought)
```python
@cl.step(type="tool")
async def search(query: str):
    return f"Results for: {query}"

@cl.step(type="llm")
async def generate(context: str):
    return await llm_call(context)  # llm_call: placeholder for your model call

@cl.on_message
async def on_message(message: cl.Message):
    results = await search(message.content)
    answer = await generate(results)
    await cl.Message(content=answer).send()
```
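Conceptually, `@cl.step` wraps a function so its type, name, and output show up as a visible stage in the UI. A toy decorator (synchronous for brevity, and not Chainlit's actual implementation) shows the wrapping pattern:

```python
import functools

trace = []  # stands in for the visible step list in the UI

def step(type: str):
    # Parameterized decorator, mirroring the shape of @cl.step(type="tool").
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            trace.append((type, fn.__name__, result))  # record the stage
            return result
        return wrapper
    return decorator

@step(type="tool")
def search(query: str) -> str:
    return f"Results for: {query}"

search("chainlit")
print(trace)  # → [('tool', 'search', 'Results for: chainlit')]
```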
## User Session
```python
@cl.on_chat_start
async def start():
    cl.user_session.set("counter", 0)

@cl.on_message
async def on_message(message: cl.Message):
    count = cl.user_session.get("counter")
    count += 1
    cl.user_session.set("counter", count)
```
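`cl.user_session` keys state by session behind the scenes, so two users incrementing "their" counter never collide. A framework-free sketch of that idea, using a plain dict in place of Chainlit's storage:

```python
# Hypothetical stand-in for cl.user_session: state is a dict of dicts
# keyed by session id, so each session sees only its own values.
sessions: dict[str, dict] = {}

def session_set(session_id: str, key: str, value):
    sessions.setdefault(session_id, {})[key] = value

def session_get(session_id: str, key: str, default=None):
    return sessions.get(session_id, {}).get(key, default)

session_set("user-a", "counter", 0)
session_set("user-b", "counter", 0)
for _ in range(3):
    session_set("user-a", "counter", session_get("user-a", "counter") + 1)

print(session_get("user-a", "counter"))  # → 3
print(session_get("user-b", "counter"))  # → 0 (unaffected by user-a)
```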
## Ask User for Input
### Text input

```python
response = await cl.AskUserMessage(content="What's your name?").send()
name = response.get("output") if response else "Anonymous"
```

### File upload

```python
files = await cl.AskFileMessage(
    content="Upload a file",
    accept=["text/plain", "application/pdf"]
).send()
```

### Action selection

```python
response = await cl.AskActionMessage(
    content="Choose:",
    actions=[
        cl.Action(name="yes", label="Yes"),
        cl.Action(name="no", label="No"),
    ]
).send()
```

## UI Elements
```python
@cl.on_message
async def on_message(message: cl.Message):
    elements = [
        cl.Text(name="code.py", content="print('hello')", language="python"),
        cl.Image(name="chart", path="./chart.png", display="inline"),
        cl.File(name="report.pdf", path="./report.pdf"),
    ]
    await cl.Message(content="Results:", elements=elements).send()
```

## Actions (Buttons)
```python
@cl.action_callback("approve")
async def on_approve(action: cl.Action):
    await action.remove()
    await cl.Message(content="Approved!").send()

@cl.on_message
async def on_message(message: cl.Message):
    actions = [cl.Action(name="approve", label="Approve")]
    await cl.Message(content="Review:", actions=actions).send()
```
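Under the hood this is a registry pattern: `@cl.action_callback` stores the handler under the action's name, and a button click dispatches to it. A toy version of that pattern (not Chainlit's actual code):

```python
# Registry mapping action names to their click handlers.
callbacks = {}

def action_callback(name: str):
    # Mirrors the shape of @cl.action_callback("approve").
    def register(fn):
        callbacks[name] = fn
        return fn
    return register

@action_callback("approve")
def on_approve():
    return "Approved!"

def click(name: str):
    return callbacks[name]()  # dispatch to the registered handler

print(click("approve"))  # → Approved!
```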
## Reference Documentation
For detailed guidance:
- lifecycle.md - on_chat_start, on_message, on_chat_end hooks
- messages.md - Message class, streaming, chat_context
- steps.md - Step decorator, context manager, nested steps
- elements.md - Text, Image, File, PDF, Audio, Video, Plotly
- actions.md - Action buttons, callbacks, payloads
- ask-user.md - AskUserMessage, AskFileMessage, AskActionMessage
- session.md - User session, reserved keys, state management
- auth.md - Password, OAuth, header authentication
- integrations.md - OpenAI, LangChain, LlamaIndex, Mistral
- patterns.md - RAG, document Q&A, multi-agent, feedback
## Integrations
### OpenAI

```python
cl.instrument_openai()
```

### LangChain

```python
config = RunnableConfig(callbacks=[cl.LangchainCallbackHandler()])
```

### LlamaIndex

```python
callback_manager = CallbackManager([cl.LlamaIndexCallbackHandler()])
```

## Configuration
`.chainlit/config.toml`:

```toml
[project]
name = "My App"

[UI]
cot = "full"  # Show chain of thought: full, hidden, tool_call
```

## Run Commands
### Development (auto-reload)

```bash
chainlit run app.py -w
```

### Production

```bash
chainlit run app.py --host 0.0.0.0 --port 8000
```

### Generate auth secret

```bash
chainlit create-secret
```

## Key Imports
```python
import chainlit as cl
```

### Core

`cl.Message`, `cl.Step`, `cl.Action`

### Elements

`cl.Text`, `cl.Image`, `cl.File`, `cl.Pdf`, `cl.Audio`, `cl.Video`, `cl.Plotly`, `cl.Dataframe`, `cl.TaskList`

### Ask User

`cl.AskUserMessage`, `cl.AskFileMessage`, `cl.AskActionMessage`

### Decorators

`@cl.on_chat_start`, `@cl.on_message`, `@cl.on_chat_end`, `@cl.step`, `@cl.action_callback`, `@cl.password_auth_callback`, `@cl.oauth_callback`