Found 63 Skills
Professional Pydantic v2.12 development for data validation, serialization, and type-safe models. Use when working with Pydantic for (1) creating or modifying BaseModel classes, (2) implementing validators and serializers, (3) configuring model behavior, (4) handling JSON schema generation, (5) working with settings management, (6) debugging validation errors, (7) integrating with ORMs or APIs, or (8) any production-grade Python data validation tasks. Includes complete API reference, concept guides, examples, and migration patterns.
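A minimal sketch of the kind of model this skill covers, assuming Pydantic v2 is installed (the `User` model and its fields are hypothetical):

```python
from pydantic import BaseModel, Field, field_validator

class User(BaseModel):
    id: int
    name: str = Field(min_length=1, max_length=50)
    email: str

    @field_validator("email")
    @classmethod
    def email_must_contain_at(cls, v: str) -> str:
        # Reject strings without an "@"; real projects may prefer EmailStr.
        if "@" not in v:
            raise ValueError("email must contain '@'")
        return v.lower()

# In lax mode, "1" is coerced to the int 1 during validation.
user = User.model_validate({"id": "1", "name": "Ada", "email": "ADA@example.com"})
print(user.model_dump_json())  # {"id":1,"name":"Ada","email":"ada@example.com"}
```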
Implement masked text input controls in WinForms applications. Use this skill whenever the user needs to create input fields with format masks (phone numbers, IP addresses, dates, currency), validate formatted input, restrict data entry to specific patterns, or configure how user input behaves with mask constraints.
Assess data quality with checks for missing values, duplicates, type issues, and inconsistencies. Use for data validation, ETL pipelines, or dataset documentation.
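A small pandas sketch of the checks this skill automates (column names are hypothetical; assumes pandas is installed):

```python
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Summarise common data-quality issues in a DataFrame."""
    return {
        "rows": len(df),
        "missing_per_column": df.isna().sum().to_dict(),
        "duplicate_rows": int(df.duplicated().sum()),
        "dtypes": df.dtypes.astype(str).to_dict(),
    }

df = pd.DataFrame({"id": [1, 1, 2], "amount": [10.0, 10.0, None]})
print(quality_report(df))
```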
Migrates databases between providers (Postgres, MySQL, Supabase, PlanetScale, MongoDB). Reads the source schema, generates migration scripts, and handles data type mapping, foreign keys, indexes, triggers, and stored procedures. Validates the migration with row counts and checksums. Generates migration-plan.md with a step-by-step execution guide, rollback procedures, and estimated downtime.
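To illustrate the row-count validation step, here is a minimal sketch using stdlib sqlite3 (a real migration would run the same counts against source and target via the appropriate Postgres/MySQL driver and diff them; the table and schema are hypothetical):

```python
import sqlite3

def row_counts(conn: sqlite3.Connection, tables: list[str]) -> dict[str, int]:
    counts = {}
    for table in tables:
        # Table names cannot be bound as SQL parameters, so they must
        # come from a trusted schema listing, never from user input.
        cur = conn.execute(f'SELECT COUNT(*) FROM "{table}"')
        counts[table] = cur.fetchone()[0]
    return counts

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE users (id INTEGER PRIMARY KEY)")
source.executemany("INSERT INTO users (id) VALUES (?)", [(1,), (2,)])
print(row_counts(source, ["users"]))  # {'users': 2}
```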
Validate, format, and convert between JSON, YAML, and TOML. Parse and query structured data files. No API key required.
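A sketch of the conversion round-trip, assuming Python 3.11+ for the stdlib `tomllib` parser and the third-party PyYAML package for YAML:

```python
import json
import tomllib  # stdlib since Python 3.11 (parse-only)

import yaml  # PyYAML, a third-party dependency

toml_text = '[server]\nhost = "localhost"\nport = 8080\n'
data = tomllib.loads(toml_text)    # TOML -> dict
print(json.dumps(data, indent=2))  # dict -> JSON
print(yaml.safe_dump(data))        # dict -> YAML

# Lossless for plain scalars and mappings.
assert yaml.safe_load(yaml.safe_dump(data)) == data
```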
Zero-context verification that every number, comparison, and scope claim in the paper matches raw result files. Uses a fresh cross-model reviewer with NO prior context to prevent confirmation bias. Use when the user says "审查论文数据" ("review the paper's data"), "check paper claims", "verify numbers", "论文数字核对" ("cross-check the paper's numbers"), or before submission to ensure paper-to-evidence fidelity.
Review football data code and visualisations for correctness. Use after building a chart, data pipeline, or analysis. Dispatches specialised reviewers for data correctness, chart conventions, visual inspection, and interactive edge cases.
Data validation using Great Expectations. Covers expectation suites, checkpoints, and data docs for pipeline monitoring.
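A sketch of a single expectation check. Note the Great Expectations API has changed substantially across releases; this uses the older `ge.from_pandas` entry point, which newer GX 1.x versions replace with a data-context/fluent-datasource workflow (the DataFrame and columns are hypothetical):

```python
import great_expectations as ge  # legacy API; GX 1.x differs
import pandas as pd

df = pd.DataFrame({"user_id": [1, 2, 3], "email": ["a@x.io", "b@x.io", None]})

# Wrap the DataFrame so expectation methods are available on it.
dataset = ge.from_pandas(df)

# The exact return type varies by version; recent legacy releases
# expose a .success flag on the validation result.
result = dataset.expect_column_values_to_not_be_null("email")
print(result.success)  # False: one email is missing
```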
Standards and best practices for writing LookML tests to ensure data integrity, accuracy, and logic validation.
Design ETL workflows with data validation using tools like Pandas, Dask, or PySpark. Use when building robust data processing systems in Python.
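A minimal pandas sketch of the extract/transform/validate shape such a workflow takes (the data source is stubbed out and the column names are hypothetical):

```python
import pandas as pd

def extract() -> pd.DataFrame:
    # Stand-in for reading from an API, file, or database.
    return pd.DataFrame({"order_id": [1, 2, 2], "total": [9.99, None, 5.00]})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    return df.drop_duplicates("order_id").dropna(subset=["total"])

def validate(df: pd.DataFrame) -> pd.DataFrame:
    problems = []
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    if df["total"].isna().any():
        problems.append("missing totals")
    if problems:
        raise ValueError("validation failed: " + "; ".join(problems))
    return df

# Validate after cleaning so the pipeline fails loudly on anything left over.
clean = validate(transform(extract()))
print(clean)
```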
Activated when the user wants to create a data model, validate data, serialize JSON, create Pydantic models, add validators, define settings, or create request/response schemas. Covers Pydantic v2 BaseModel, Field, validators, data validation, JSON schema generation, serialization, deserialization, and settings management.
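For the settings-management side, a minimal sketch using the separate pydantic-settings package that Pydantic v2 splits this into (the `AppSettings` fields and `APP_` prefix are hypothetical):

```python
import os

from pydantic_settings import BaseSettings, SettingsConfigDict

class AppSettings(BaseSettings):
    # Reads APP_DATABASE_URL and APP_DEBUG from the environment.
    model_config = SettingsConfigDict(env_prefix="APP_")

    database_url: str = "sqlite:///local.db"
    debug: bool = False

os.environ["APP_DEBUG"] = "true"
settings = AppSettings()
print(settings.debug)  # True, coerced from the environment string
```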
Use when defining schemas, topic tags, and lineage metadata for enriched signals.