langchain-local-dev-loop
LangChain Local Dev Loop
Overview
Configure a rapid local development workflow for LangChain applications with testing, debugging, and hot reload capabilities.
Prerequisites
- Completed setup: langchain-install-auth
- Python 3.9+ with virtual environment
- pytest and related testing tools
- IDE with Python support (VS Code recommended)
Instructions
Step 1: Set Up Project Structure
my-langchain-app/
├── src/
│ ├── __init__.py
│ ├── chains/
│ │ └── __init__.py
│ ├── agents/
│ │ └── __init__.py
│ └── prompts/
│ └── __init__.py
├── tests/
│ ├── __init__.py
│ ├── conftest.py
│ └── test_chains.py
├── .env
├── .env.example
├── pyproject.toml
└── README.md
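One way to scaffold this layout (a sketch using standard shell commands; the directory name is taken from the tree above):

```shell
mkdir -p my-langchain-app/src/chains my-langchain-app/src/agents \
         my-langchain-app/src/prompts my-langchain-app/tests
touch my-langchain-app/src/__init__.py \
      my-langchain-app/src/chains/__init__.py \
      my-langchain-app/src/agents/__init__.py \
      my-langchain-app/src/prompts/__init__.py \
      my-langchain-app/tests/__init__.py \
      my-langchain-app/tests/conftest.py \
      my-langchain-app/tests/test_chains.py \
      my-langchain-app/.env my-langchain-app/.env.example \
      my-langchain-app/pyproject.toml my-langchain-app/README.md
```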
Step 2: Configure Testing
tests/conftest.py:

```python
import pytest
from unittest.mock import MagicMock

from langchain_core.messages import AIMessage
from langchain_core.runnables import Runnable


@pytest.fixture
def mock_llm():
    """Mock LLM for unit tests without API calls."""
    # spec=Runnable so the pipe (|) operator keeps the mock as a chain step
    # and calls .invoke on it, instead of wrapping it as a plain callable.
    mock = MagicMock(spec=Runnable)
    mock.invoke.return_value = AIMessage(content="Mocked response")
    return mock


@pytest.fixture
def sample_prompt():
    """Sample prompt for testing."""
    from langchain_core.prompts import ChatPromptTemplate

    return ChatPromptTemplate.from_template("Test: {input}")
```
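The pipe (|) operator only calls `.invoke` on objects it recognizes as runnables, which is why a bare MagicMock is easy to get wrong. The relevant `unittest.mock` behavior, sketched with a hypothetical `Runner` class standing in for LangChain's `Runnable`:

```python
from unittest.mock import MagicMock


class Runner:
    """Hypothetical stand-in for langchain_core.runnables.Runnable."""

    def invoke(self, value):
        raise NotImplementedError


# spec copies the class onto the mock, so isinstance checks pass and
# only attributes that exist on Runner are allowed.
mock = MagicMock(spec=Runner)
mock.invoke.return_value = "Mocked response"

print(isinstance(mock, Runner))   # True
print(mock.invoke("anything"))    # Mocked response
```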
Step 3: Create Test File
tests/test_chains.py:

```python
from langchain_core.output_parsers import StrOutputParser


def test_chain_construction(mock_llm, sample_prompt):
    """Test that chain can be constructed."""
    chain = sample_prompt | mock_llm | StrOutputParser()
    assert chain is not None


def test_chain_invoke(mock_llm, sample_prompt):
    """Test chain invocation with mock."""
    chain = sample_prompt | mock_llm | StrOutputParser()
    result = chain.invoke({"input": "test"})
    assert result == "Mocked response"
```
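With pytest-asyncio running in auto mode (configured in Step 4), async tests need no extra marker. A sketch of what an async variant could look like, using stdlib AsyncMock as a stand-in for a chain's `ainvoke` so the example is self-contained:

```python
import asyncio
from unittest.mock import AsyncMock


async def check_async_chain():
    # In a real test file this body would live in `async def test_...` and
    # pytest-asyncio would drive the event loop; here asyncio.run does it
    # so the sketch runs standalone.
    chain = AsyncMock()
    chain.ainvoke.return_value = "Mocked response"
    return await chain.ainvoke({"input": "test"})


result = asyncio.run(check_async_chain())
print(result)   # Mocked response
```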
Step 4: Set Up Development Tools
pyproject.toml:

```toml
[project]
name = "my-langchain-app"
version = "0.1.0"
requires-python = ">=3.9"
dependencies = [
    "langchain>=0.3.0",
    "langchain-openai>=0.2.0",
    "python-dotenv>=1.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "pytest-cov>=4.0.0",
    "ruff>=0.1.0",
    "mypy>=1.0.0",
]

[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["tests"]

[tool.ruff]
line-length = 100
```

Output
- Organized project structure with separation of concerns
- pytest configuration with fixtures for mocking LLMs
- Development dependencies configured
- Ready for rapid iteration
Error Handling
| Error | Cause | Solution |
|---|---|---|
| Import Error | Missing package | Install dependencies, e.g. `pip install -e ".[dev]"` |
| Fixture Not Found | conftest.py issue | Ensure conftest.py is in the tests/ directory |
| Async Test Error | Missing marker | Add `@pytest.mark.asyncio`, or rely on `asyncio_mode = "auto"` |
| Env Var Missing | .env not loaded | Call `load_dotenv()` before creating the model |
Examples
Running Tests
```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=src --cov-report=html

# Run a specific test
pytest tests/test_chains.py::test_chain_invoke -v

# Watch mode (requires pytest-watch)
ptw
```
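pytest-watch (`ptw`) is a separate package not listed in the Step 4 dev dependencies; one option (the version pin is an assumption) is to add it there:

```toml
[project.optional-dependencies]
dev = [
    # ...existing dev dependencies from Step 4...
    "pytest-watch>=4.2.0",
]
```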
Integration Test Example
tests/test_integration.py:

```python
import pytest
from dotenv import load_dotenv

load_dotenv()


@pytest.mark.integration
def test_real_llm_call():
    """Integration test with real LLM (requires API key)."""
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
    response = llm.invoke("Say 'test passed'")
    assert "test" in response.content.lower()
```
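`@pytest.mark.integration` is a custom marker; registering it in pyproject.toml silences PytestUnknownMarkWarning and lets the fast inner loop skip network-bound tests (the marker description text here is illustrative):

```toml
[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["tests"]
markers = [
    "integration: tests that call a real LLM API",
]
```

Run only the offline suite with `pytest -m "not integration"`.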
Resources
Next Steps
Proceed to langchain-sdk-patterns for production-ready code patterns.