Found 945 Skills
Invoke Alibaba Cloud Apsara Data Agent for Analytics via CLI to perform natural-language-driven data analysis on enterprise databases. Data Agent for Analytics is an intelligent data analysis agent developed by the Alibaba Cloud Database team for enterprise users. Given a natural-language description, it automatically handles requirement analysis, data understanding, analysis insights, and report generation. This tool supports: discovering data resources (instances/databases/tables) managed in DMS, initiating query or deep-analysis sessions, tracking progress in real time, and retrieving analysis conclusions and generated reports. Use this Skill when users need to query databases, analyze data trends, generate data reports, ask questions in natural language, or mention "Data Agent", "data analysis", "database query", "SQL analysis", or "data insights".
Read any data file (CSV, JSON, Parquet, Avro, Excel, spatial, SQLite) or remote URL (S3, HTTPS). Use when user references a data file, asks "what's in this file", or wants to preview/profile a dataset. Not for source code.
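As a minimal sketch of the "preview a dataset" behavior this skill describes, the snippet below peeks at the header and first rows of CSV content using only the Python standard library (the `preview_csv` helper and the sample data are illustrative assumptions, not part of the skill itself):

```python
import csv
import io

def preview_csv(text, n=3):
    """Return the header row and the first n data rows of CSV content."""
    rows = list(csv.reader(io.StringIO(text)))
    return rows[0], rows[1:1 + n]

sample = "name,score\nada,90\ngrace,85\nalan,88\nlinus,70\n"
header, rows = preview_csv(sample)
print(header)  # ['name', 'score']
print(rows)    # [['ada', '90'], ['grace', '85'], ['alan', '88']]
```

A real implementation would dispatch on file extension (Parquet, Avro, Excel, SQLite, etc.) and stream remote URLs rather than reading whole files into memory.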
Your AI agent's crypto brain. One skill, 83+ commands across 14 data domains — real-time prices, wallets, social intelligence, DeFi, on-chain SQL, prediction markets, and more. Natural language in, structured data out. Install once, access everything. Use whenever the user needs crypto data, asks about prices/wallets/tokens/DeFi, wants to investigate on-chain activity, or is building something that consumes crypto data — even if they don't say "surf" explicitly.
Go implementation guide for PMA-managed service and CLI projects. Covers project layout (cmd/internal), strict linting with golangci-lint v2, database access (sqlc + pgx or GORM), HTTP patterns (stdlib + Chi or Gin), layered config with koanf, structured logging with slog, OpenTelemetry observability, and CI quality gates.
Run and interact with KarpathyTalk, an open markdown-based developer social network with GitHub auth, SQLite, and an LLM-friendly JSON/markdown API.
Index and search documents. Similar to ZeroClaw's SQLite hybrid search and OpenClaw's memory system. Supports full-text search, keyword extraction, and document categorization.
Treasure Data platform help — enterprise AI-native CDP with profile unification, audience segmentation, journey orchestration, 400+ connectors, AI Marketing Cloud suites. Use when setting up Treasure Data CDP, unifying customer profiles across channels, building audience segments in Treasure Data, activating audiences to ad platforms or CRMs, configuring Treasure Data connectors or integrations, writing SQL queries in Treasure Data, troubleshooting Treasure Data workflows or jobs, or working with the TD API. Do NOT use for general CRM data quality without Treasure Data context — use /sales-data-hygiene instead.
Web application security expert. OWASP Top 10, XSS, SQLi, CSRF, SSRF, authentication bypass, IDOR. Use for web app security testing.
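To make the SQLi item concrete: the classic defense is parameterized queries, which bind untrusted input as a literal value instead of splicing it into the SQL string. A minimal sketch with Python's stdlib `sqlite3` (the table and payload are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

user_input = "alice' OR '1'='1"  # classic SQL injection payload

# Vulnerable pattern: string concatenation lets the payload rewrite the query:
#   f"SELECT * FROM users WHERE name = '{user_input}'"  -> matches every row

# Safe pattern: the ? placeholder binds the payload as a plain string value
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the payload matches no user
```

The same placeholder discipline (via an ORM or driver-level binding) addresses SQLi across the OWASP Top 10 stack.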
Analyze lakehouse data interactively using Fabric Livy sessions and PySpark/Spark SQL for advanced analytics, DataFrames, cross-lakehouse joins, Delta time-travel, and unstructured/JSON data. Use when the user explicitly asks for PySpark, Spark DataFrames, Livy sessions, or Python-based analysis — NOT for simple SQL queries. Triggers: "PySpark", "Spark SQL", "analyze with PySpark", "Spark DataFrame", "Livy session", "lakehouse with Python", "PySpark analysis", "PySpark data quality", "Delta time-travel with Spark".
Query VictoriaMetrics metrics via curl. Use when running PromQL/MetricsQL queries, discovering metrics/labels, checking alerts and rules, inspecting TSDB status, exporting raw data, checking metric usage statistics, or debugging relabeling/downsampling/retention configs. Triggers on: metric queries, PromQL, MetricsQL, label discovery, series exploration, cardinality checks, alert status, recording rules, active/top queries, export data, metric statistics, relabel debug, downsampling debug, retention debug, flags.
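VictoriaMetrics exposes a Prometheus-compatible HTTP API, so an instant query is just a GET against `/api/v1/query`. A small sketch that builds the URL curl would hit, assuming a single-node instance on its default port 8428 (the helper name and the example PromQL expression are illustrative):

```python
from urllib.parse import urlencode

def instant_query_url(base, promql, time=None):
    """Build a /api/v1/query URL for an instant PromQL/MetricsQL query."""
    params = {"query": promql}
    if time is not None:
        params["time"] = time  # optional evaluation timestamp
    return f"{base}/api/v1/query?{urlencode(params)}"

# Assumed local single-node VictoriaMetrics on the default port.
url = instant_query_url("http://localhost:8428", "rate(http_requests_total[5m])")
print(url)
# http://localhost:8428/api/v1/query?query=rate%28http_requests_total%5B5m%5D%29
```

From the shell, the equivalent is `curl -s 'http://localhost:8428/api/v1/query' --data-urlencode 'query=...'`, which is the pattern this skill wraps.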
CRITICAL RULE: You MUST use this skill whenever the task involves any machine learning or data analysis. Use this skill if the user's prompt or requirements mention any of the following: clustering, classification, regression, time series forecasting, statistical testing, model comparison, ML, or data analysis. SQL/BigQuery ML handoff: if the user requires a SQL solution, use this skill to dictate the analysis steps (e.g., markdown analysis cells, visualization logic), but defer to `bigquery` for all SQL syntax.
Reviews, audits, and modifies CockroachDB cluster settings. Self-Hosted has full control over all settings and start flags. Advanced/BYOC can modify most SQL-level settings but infrastructure settings are managed by CRL. Standard has limited settings access — session variables are the primary tuning mechanism. Basic has minimal settings — use session variables and Cloud Console. Use when auditing configuration, tuning performance, or troubleshooting settings-related issues.