langchain-local-dev-loop

LangChain Local Dev Loop

Overview

Configure a rapid local development workflow for LangChain applications with testing, debugging, and hot reload capabilities.
Prerequisites

  • Completed the langchain-install-auth setup
  • Python 3.9+ with virtual environment
  • pytest and related testing tools
  • IDE with Python support (VS Code recommended)
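The virtual-environment prerequisite can be satisfied with a standard setup — a sketch assuming `python3` is on your PATH and a POSIX shell:

```shell
# Create and activate an isolated environment for the project
python3 -m venv .venv
. .venv/bin/activate

# Confirm the interpreter now resolves inside the venv
python -c "import sys; print(sys.prefix.endswith('.venv'))"  # prints: True
```

With the environment active, `pip install -e ".[dev]"` (using the `pyproject.toml` from Step 4 below) installs pytest, ruff, and mypy in one step.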

Instructions

Step 1: Set Up Project Structure

my-langchain-app/
├── src/
│   ├── __init__.py
│   ├── chains/
│   │   └── __init__.py
│   ├── agents/
│   │   └── __init__.py
│   └── prompts/
│       └── __init__.py
├── tests/
│   ├── __init__.py
│   ├── conftest.py
│   └── test_chains.py
├── .env
├── .env.example
├── pyproject.toml
└── README.md
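One way to create this layout from scratch — a sketch using POSIX `mkdir`/`touch`; adjust names to taste:

```shell
# Package directories
mkdir -p my-langchain-app/src/chains my-langchain-app/src/agents \
         my-langchain-app/src/prompts my-langchain-app/tests

# Package markers and test scaffolding
touch my-langchain-app/src/__init__.py \
      my-langchain-app/src/chains/__init__.py \
      my-langchain-app/src/agents/__init__.py \
      my-langchain-app/src/prompts/__init__.py \
      my-langchain-app/tests/__init__.py \
      my-langchain-app/tests/conftest.py \
      my-langchain-app/tests/test_chains.py

# Config and docs at the project root
touch my-langchain-app/.env my-langchain-app/.env.example \
      my-langchain-app/pyproject.toml my-langchain-app/README.md
```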

Step 2: Configure Testing

```python
# tests/conftest.py
import pytest
from unittest.mock import MagicMock

from langchain_core.messages import AIMessage


@pytest.fixture
def mock_llm():
    """Mock LLM for unit tests without API calls."""
    mock = MagicMock()
    mock.invoke.return_value = AIMessage(content="Mocked response")
    return mock


@pytest.fixture
def sample_prompt():
    """Sample prompt for testing."""
    from langchain_core.prompts import ChatPromptTemplate
    return ChatPromptTemplate.from_template("Test: {input}")
```
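The `mock_llm` fixture leans entirely on `unittest.mock.MagicMock`. Stripped of the LangChain types, the idea is just a canned `.invoke()` return value — a stdlib-only sketch, with a plain string standing in for an `AIMessage`:

```python
from unittest.mock import MagicMock

# Any object exposing .invoke() can stand in for an LLM in unit tests
mock = MagicMock()
mock.invoke.return_value = "Mocked response"

# The mock returns the canned value regardless of the prompt...
print(mock.invoke("any prompt at all"))  # prints: Mocked response

# ...and records how it was called, which tests can assert on
mock.invoke.assert_called_once_with("any prompt at all")
```

This is why the unit tests below run fast and offline: no network call ever happens.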

Step 3: Create Test File

```python
# tests/test_chains.py
from langchain_core.output_parsers import StrOutputParser


def test_chain_construction(mock_llm, sample_prompt):
    """Test that the chain can be constructed."""
    chain = sample_prompt | mock_llm | StrOutputParser()
    assert chain is not None


def test_chain_invoke(mock_llm, sample_prompt):
    """Test chain invocation with the mock."""
    chain = sample_prompt | mock_llm | StrOutputParser()
    result = chain.invoke({"input": "test"})
    assert result == "Mocked response"
```
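The `|` composition in these tests comes from LangChain's Runnable protocol (LCEL). A stdlib-only sketch of the mechanics — these `Step` classes are illustrative stand-ins, not the real LangChain types:

```python
class Step:
    """Toy runnable: composes with | and executes via .invoke()."""
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # left | right pipes left's output into right
        return Step(lambda x: other.invoke(self.invoke(x)))

    def invoke(self, x):
        return self.fn(x)


prompt = Step(lambda d: f"Test: {d['input']}")  # format the prompt
llm = Step(lambda _: "Mocked response")         # canned model reply
parser = Step(str)                              # pass-through "parser"

chain = prompt | llm | parser
print(chain.invoke({"input": "test"}))  # prints: Mocked response
```

Each stage's output feeds the next, which is why swapping the real LLM for a mock leaves the rest of the chain untouched.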

Step 4: Set Up Development Tools

```toml
# pyproject.toml
[project]
name = "my-langchain-app"
version = "0.1.0"
requires-python = ">=3.9"
dependencies = [
    "langchain>=0.3.0",
    "langchain-openai>=0.2.0",
    "python-dotenv>=1.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "pytest-cov>=4.0.0",
    "ruff>=0.1.0",
    "mypy>=1.0.0",
]

[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["tests"]

[tool.ruff]
line-length = 100
```

Output

  • Organized project structure with separation of concerns
  • pytest configuration with fixtures for mocking LLMs
  • Development dependencies configured
  • Ready for rapid iteration

Error Handling

| Error | Cause | Solution |
|---|---|---|
| Import Error | Missing package | Install with `pip install -e ".[dev]"` |
| Fixture Not Found | conftest.py issue | Ensure conftest.py is in the tests/ directory |
| Async Test Error | Missing marker | Add the `@pytest.mark.asyncio` decorator |
| Env Var Missing | .env not loaded | Use `python-dotenv` and call `load_dotenv()` |
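For the last row: `load_dotenv()` simply reads `KEY=VALUE` lines from a `.env` file into the process environment. A stdlib stand-in showing the behavior — in the real workflow you would call python-dotenv's `load_dotenv()`, and `MY_APP_SAMPLE_KEY` is a placeholder name:

```python
import os
import tempfile


def load_env_file(path):
    """Minimal sketch of load_dotenv(): KEY=VALUE lines, # comments skipped."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                # setdefault: variables already in the environment win over .env
                os.environ.setdefault(key.strip(), value.strip())


with tempfile.TemporaryDirectory() as tmp:
    env_path = os.path.join(tmp, ".env")
    with open(env_path, "w") as f:
        f.write("# sample .env\nMY_APP_SAMPLE_KEY=sk-placeholder\n")
    load_env_file(env_path)

print(os.environ["MY_APP_SAMPLE_KEY"])  # prints: sk-placeholder
```

If a key still comes up missing, check that the `.env` file sits where the process is started from, since relative paths resolve against the working directory.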

Examples

Running Tests

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=src --cov-report=html

# Run a specific test
pytest tests/test_chains.py::test_chain_invoke -v

# Watch mode (requires pytest-watch)
ptw
```

Integration Test Example

```python
# tests/test_integration.py
import pytest
from dotenv import load_dotenv

load_dotenv()


@pytest.mark.integration
def test_real_llm_call():
    """Integration test with a real LLM (requires an API key)."""
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)
    response = llm.invoke("Say 'test passed'")
    assert "test" in response.content.lower()
```
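Note that the custom `integration` marker is not registered anywhere yet, and recent pytest versions warn about unknown markers. One option — a sketch extending the `[tool.pytest.ini_options]` table from Step 4 — is to declare it explicitly:

```toml
[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["tests"]
markers = [
    "integration: tests that call real external APIs",
]
```

Day-to-day, `pytest -m "not integration"` keeps the local loop fast and offline; run the full suite (with a valid API key in `.env`) before merging.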

Resources

Next Steps

Proceed to langchain-sdk-patterns for production-ready code patterns.