python-testing


Python Testing


Quick reference for Python testing with pytest, coverage, fixtures, and best practices.

When This Skill Applies


  • Writing unit tests and integration tests
  • Test-driven development (TDD)
  • Test fixtures and parametrization
  • Coverage analysis
  • Mocking and patching
  • Async testing
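A minimal test module covering the unit-testing case above. `slugify` is a hypothetical function under test, implemented inline so the sketch runs on its own:

```python
# tests/test_slugify.py -- minimal sketch; slugify is a hypothetical
# function under test, inlined so the example is self-contained.
def slugify(text: str) -> str:
    return text.strip().lower().replace(" ", "-")

def test_strips_and_lowercases():
    assert slugify("  Hello World  ") == "hello-world"

def test_leaves_existing_slugs_alone():
    assert slugify("already-a-slug") == "already-a-slug"
```

pytest discovers `test_*` functions automatically, and plain `assert` statements are all it needs for readable failure output.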

Quick Reference


Running Tests


```bash
# Basic test run
uv run pytest

# Verbose output
uv run pytest -v

# Show print statements
uv run pytest -s

# Stop at first failure
uv run pytest -x

# Run specific test
uv run pytest tests/test_module.py::test_function

# Run by keyword
uv run pytest -k "test_user"
```

Test Coverage


```bash
# Run with coverage
uv run pytest --cov

# HTML report
uv run pytest --cov --cov-report=html

# Show missing lines
uv run pytest --cov --cov-report=term-missing

# Coverage for specific module
uv run pytest --cov=mymodule tests/
```

Fixtures


```python
import pytest

@pytest.fixture
def sample_data():
    return {"key": "value"}

@pytest.fixture(scope="module")
def db_connection():
    conn = create_connection()
    yield conn
    conn.close()

def test_with_fixture(sample_data):
    assert sample_data["key"] == "value"
```

Parametrize Tests


```python
import pytest

@pytest.mark.parametrize("text,expected", [
    ("hello", "HELLO"),
    ("world", "WORLD"),
    ("test", "TEST"),
])
def test_uppercase(text: str, expected: str):
    assert text.upper() == expected

@pytest.mark.parametrize("value,is_valid", [
    (1, True),
    (0, False),
    (-1, False),
])
def test_validation(value, is_valid):
    assert validate(value) == is_valid
```

Markers


```python
import sys

import pytest

@pytest.mark.slow
def test_slow_operation():
    # Long-running test
    pass

@pytest.mark.skip(reason="Not implemented yet")
def test_future_feature():
    pass

@pytest.mark.skipif(sys.platform == "win32", reason="Unix only")
def test_unix_specific():
    pass

@pytest.mark.xfail
def test_known_issue():
    # Expected to fail
    pass
```

Run only marked tests

```bash
uv run pytest -m slow
uv run pytest -m "not slow"
```

Async Testing


```python
import pytest

@pytest.mark.asyncio
async def test_async_function():
    result = await async_operation()
    assert result == expected_value

@pytest.fixture
async def async_client():
    client = AsyncClient()
    await client.connect()
    yield client
    await client.disconnect()
```
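The `@pytest.mark.asyncio` marker and async fixtures require the pytest-asyncio plugin. Without a plugin, a plain test function can drive the coroutine itself; a stdlib-only sketch (`async_double` is illustrative):

```python
# Stdlib-only alternative: run a coroutine from a regular (non-async) test,
# useful when the pytest-asyncio plugin is not available.
import asyncio

async def async_double(x: int) -> int:
    await asyncio.sleep(0)  # stand-in for real awaited I/O
    return x * 2

def test_async_without_plugin():
    assert asyncio.run(async_double(21)) == 42
```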

Mocking


```python
from unittest.mock import Mock, patch, MagicMock

def test_with_mock():
    mock_obj = Mock()
    mock_obj.method.return_value = "mocked"
    assert mock_obj.method() == "mocked"

@patch('module.external_api')
def test_with_patch(mock_api):
    mock_api.return_value = {"status": "success"}
    result = call_external_api()
    assert result["status"] == "success"
```
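Mocks also record how they were called; a small sketch of call tracking and `side_effect` (all names here are illustrative):

```python
from unittest.mock import Mock

def test_mock_call_tracking():
    # side_effect as a list makes successive calls return successive values.
    fetch = Mock(side_effect=[1, 2])
    assert fetch("a") == 1
    assert fetch("b") == 2
    fetch.assert_called_with("b")  # checks the most recent call's arguments
    assert fetch.call_count == 2
```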

pytest-mock (cleaner)

```python
def test_with_mocker(mocker):
    mock = mocker.patch('module.function')
    mock.return_value = 42
    assert function() == 42
```

The `mocker` fixture comes from the pytest-mock plugin and undoes its patches automatically at the end of each test.

Test Organization


project/
├── src/
│   └── myproject/
│       ├── __init__.py
│       └── module.py
└── tests/
    ├── __init__.py
    ├── conftest.py          # Shared fixtures
    ├── test_module.py
    └── integration/
        └── test_api.py

conftest.py

```python
# tests/conftest.py
import pytest

@pytest.fixture(scope="session")
def app_config():
    return {"debug": True, "testing": True}

@pytest.fixture(autouse=True)
def reset_db():
    setup_database()
    yield
    teardown_database()
```

pyproject.toml Configuration


```toml
[tool.pytest.ini_options]
testpaths = ["tests"]
python_files = ["test_*.py", "*_test.py"]
python_classes = ["Test*"]
python_functions = ["test_*"]
addopts = [
    "-v",
    "--strict-markers",
    "--cov=src",
    "--cov-report=term-missing",
]
markers = [
    "slow: marks tests as slow",
    "integration: marks tests as integration tests",
]

[tool.coverage.run]
source = ["src"]
omit = ["*/tests/*", "*/test_*.py"]

[tool.coverage.report]
exclude_lines = [
    "pragma: no cover",
    "def __repr__",
    "raise AssertionError",
    "raise NotImplementedError",
    "if __name__ == .__main__.:",
]
```

Common Testing Patterns


Test Exceptions


```python
import pytest

def test_raises_exception():
    with pytest.raises(ValueError):
        function_that_raises()

def test_raises_with_message():
    with pytest.raises(ValueError, match="Invalid input"):
        function_that_raises()
```
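`pytest.raises` can also capture the exception object for further assertions. A sketch, assuming pytest is installed; `parse_age` is a hypothetical function, implemented inline:

```python
import pytest

def parse_age(value: str) -> int:
    # Hypothetical function under test, inlined so the sketch is self-contained.
    n = int(value)
    if n < 0:
        raise ValueError(f"Invalid input: {n} is negative")
    return n

def test_raises_and_inspects():
    with pytest.raises(ValueError, match="Invalid input") as excinfo:
        parse_age("-1")
    assert "negative" in str(excinfo.value)
```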

Test Warnings


```python
import pytest

def test_deprecation_warning():
    with pytest.warns(DeprecationWarning):
        deprecated_function()
```

Temporary Files


```python
def test_with_tmp_path(tmp_path):
    file_path = tmp_path / "test.txt"
    file_path.write_text("content")
    assert file_path.read_text() == "content"
```
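The same pattern works without pytest's `tmp_path`, using only the stdlib:

```python
# Stdlib alternative to tmp_path: TemporaryDirectory removes the
# directory and its contents when the with-block exits.
import tempfile
from pathlib import Path

def test_with_tempdir():
    with tempfile.TemporaryDirectory() as tmp:
        file_path = Path(tmp) / "test.txt"
        file_path.write_text("content")
        assert file_path.read_text() == "content"
```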

TDD Workflow


```bash
# 1. RED: write a failing test
uv run pytest tests/test_new_feature.py   # FAILED

# 2. GREEN: implement minimal code
uv run pytest tests/test_new_feature.py   # PASSED

# 3. REFACTOR: improve the code
uv run pytest                             # All tests pass
```
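A worked RED-to-GREEN round of the cycle above (`add` is a hypothetical function; in practice the test lives in its own file and is run once before the implementation exists):

```python
# Hypothetical TDD round: the test is written first and fails (RED)
# until the minimal implementation below it is added (GREEN).
def test_add():
    assert add(2, 3) == 5

# Minimal implementation written only after seeing the test fail:
def add(a: int, b: int) -> int:
    return a + b
```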

See Also


  • uv-project-management
    - Adding pytest to projects
  • python-code-quality
    - Combining tests with linting
  • python-development
    - Core Python development patterns
