Handles Taubyte login and profile setup. Runs tau login --new for first-time setup (including browser-based GitHub auth) when no account exists, and tau login for existing profiles. Stops immediately when browser login is required. Uses non-interactive login when the user supplies a GitHub username; otherwise asks for it explicitly. Must run before any cloud, project, or resource operations.
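The two commands named above are the skill's whole surface; a minimal sketch of when each applies (no flags beyond those in the description are assumed):

```bash
# First-time setup: no Taubyte account or profile exists yet.
# Opens a browser for GitHub authentication, so the user must be
# present to complete that step.
tau login --new

# Subsequent sessions: reuse an existing profile.
tau login
```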
Master Taubyte workflow skill. Enforces a strict operation order, with Dream-by-default routing, scope routing, context logging, and verification.
Auto-maintains a context log file inside the current project, recording the active cloud, scope, selected resources, and latest actions, to reduce AI hallucinations.
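The description does not pin down the log's location or format; as one illustration, it could be an append-only file in the project root. The filename .tau-context.log and every field below are assumptions, not the skill's actual layout:

```bash
# Hypothetical context log entry; filename and fields are illustrative only.
cat >> .tau-context.log <<'EOF'
[2025-01-01T12:00:00Z]
cloud: example-cloud.test        # placeholder cloud name
scope: production                # placeholder scope
selected: frontend (function)    # placeholder resource
action: deployed function frontend
EOF
```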
Returns XNO to the operator or original sender. Identifies source addresses, confirms before sending, and handles ambiguity safely.
Turn a spec or requirements into a bite-sized implementation plan. File-structure-first (deep modules), with an embedded grill pass to stress-test the plan; no forced sub-skill chains. Use when you have a spec or clear requirements and are ready to plan execution.
Execute a written implementation plan task-by-task with inline verification. Stop and ask on blockers. No forced sub-skill chains. Use when a plan exists and you're ready to implement.
Automated data quality and transformation capabilities for Dataform/dbt/BigQuery pipelines. Processes data sourced from BigQuery or Cloud Storage (GCS), applying best practices for ingestion, movement, schema mapping, and comprehensive cleaning.
Use these skills when you need to troubleshoot slow performance, analyze query execution plans, identify resource-heavy processes, and monitor system-level metrics with PromQL.
Finds and inspects data assets within Google Cloud. Relevant when any of the following conditions are true:
1. The user request involves finding, exploring, or inspecting data assets in Google Cloud, such as BigQuery datasets, tables, or views; BigLake catalogs or tables; Spanner instances, databases, or tables; etc.
2. You need to retrieve the schema, metadata, or governance policies for a GCP data asset.
3. You have a keyword or topic (e.g., "sales data") but lack the specific table or resource ID.
4. You are attempting to find data using `bq ls`, as this skill offers a superior approach.
Do not use when assets are outside Google Cloud.
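For orientation, these are the plain CLI probes the skill claims to improve on; the project, dataset, table, and instance names below are placeholders:

```bash
# BigQuery: list datasets, then tables, then inspect one table's schema.
bq ls --project_id=my-project
bq ls my-project:my_dataset
bq show --schema --format=prettyjson my-project:my_dataset.my_table

# Spanner: list instances, then the databases within one.
gcloud spanner instances list --project=my-project
gcloud spanner databases list --instance=my-instance --project=my-project
```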
**STOP AND VERIFY**: Before running any command or tool that results in irreversible data loss, you MUST obtain explicit user consent. When in doubt, ask. It is better to wait for confirmation than to accidentally delete production data or critical project assets. Use this for:
- SQL: DROP TABLE/VIEW/SCHEMA/DATABASE, TRUNCATE, or broad DELETE (missing WHERE or using 1=1).
- Cloud Storage: gsutil rm or gcloud storage rm targeting production data or critical buckets.
- Infrastructure: gcloud projects delete, deletion of Spanner/BigQuery/Dataproc resources, deletion of secrets, or KMS key destruction.
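One way to honor this rule in practice is to gate the destructive command behind an explicit typed confirmation; the target below is a placeholder, and the guard pattern, not the specific command, is the point:

```bash
#!/usr/bin/env bash
# Require the user to type DELETE before an irreversible removal runs.
set -euo pipefail

TARGET="gs://my-critical-bucket/prod-data"   # placeholder target

read -r -p "About to permanently delete ${TARGET}. Type DELETE to proceed: " answer
if [[ "${answer}" != "DELETE" ]]; then
  echo "Aborted: no confirmation given." >&2
  exit 1
fi

# Reached only after explicit consent.
gcloud storage rm --recursive "${TARGET}"
```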
Use these skills when you need to troubleshoot performance bottlenecks, analyze query execution plans, identify resource-heavy processes, and monitor system-level metrics with PromQL.
DataWorks Operations Center assistant for task and workflow operations, plus alert rule creation and management. Covers troubleshooting, failure recovery, baseline assurance, and monitoring and alerting. Supports periodic, manual, and triggered tasks/workflows (excludes real-time/streaming tasks). Uses the aliyun CLI to call the dataworks-public OpenAPI (version 2024-05-18).
Trigger keywords: query task, task instance, instance log, workflow, workflow instance, alert rule, operations center, task failure, instance status, upstream/downstream dependency, rerun, monitoring alert, custom monitoring, operation log, baseline assurance, failure recovery, DataWorks operations.
Do NOT trigger for: data source management, compute resources, resource groups, data development, MaxCompute table management, ECS/RDS/OSS operations, workspace member management, data quality, data lineage, data preview.
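The general shape of such a call is sketched below; the operation name ListTaskInstances and its parameters are assumptions inferred from the trigger keywords, not verified against the 2024-05-18 reference:

```bash
# Sketch of an aliyun CLI call against the dataworks-public product.
# Operation name and parameters are assumptions; consult the
# 2024-05-18 API reference for the real ones.
aliyun dataworks-public ListTaskInstances \
  --version 2024-05-18 \
  --ProjectId 12345 \
  --PageSize 20
```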