Prevents generic AI-generated designs by guiding typography, color, motion, and background choices. Use when creating frontend designs, landing pages, dashboards, or any UI/UX work. Helps avoid the "AI slop" aesthetic.
Senior UX/UI Design Director for mobile app design auditing across all platforms (iOS, Android, cross-platform). Triggers on "UI建议" (UI suggestions), "审美讲义" (aesthetics lecture notes), "交互建议" (interaction suggestions), design audit, UI review, UX audit, or premium design feedback. Provides three-tier proposals (Safe/Balanced/Avant-Garde), aesthetic formulas, motion physics, and brand consistency checks. Specializes in "Young Luxury" mobile experiences grounded in Apple HIG and Material Design 3.
Expert patterns for AnimationTree including StateMachine transitions, BlendSpace2D for directional movement, BlendTree for layered animations, root motion, transition conditions, advance expressions, and state machine sub-states. Use for complex character animation systems with movement blending and state management. Trigger keywords: AnimationTree, AnimationNodeStateMachine, BlendSpace2D, BlendSpace1D, BlendTree, transition_request, blend_position, advance_expression, AnimationNodeAdd2, AnimationNodeBlend2, root_motion.
Design consultation: understands your product, researches the landscape, proposes a complete design system (aesthetic, typography, color, layout, spacing, motion), and generates font+color preview pages. Creates DESIGN.md as your project's design source of truth. For existing sites, use /plan-design-review to infer the system instead. Use when asked to "design system", "brand guidelines", or "create DESIGN.md". Proactively suggest when starting a new project's UI with no existing design system or DESIGN.md.
Build UIs using IBM's Carbon Design System. Use when the user requests Carbon-styled interfaces, IBM-style dashboards, enterprise UIs following Carbon conventions, or explicitly mentions Carbon, IBM design, or @carbon/react. Covers component usage, design tokens (color, typography, spacing, motion), theming (White, Gray 10, Gray 90, Gray 100), grid layout, and accessibility. Supports both artifact/HTML output (CDN-based) and full React project output (@carbon/react). Triggers include "Carbon", "IBM design system", "enterprise dashboard", "@carbon/react", "carbon components", or requests for IBM-style professional interfaces.
Framer CMS management via the Server API — list, create, read, update, and delete CMS collections and items, upload images, publish previews, deploy to production, and manage project assets, all without opening Framer. Use when the user asks to manage Framer CMS content, publish a Framer site, push articles to Framer, update CMS items, upload images to Framer, create collections, sync content, or automate any Framer workflow. Trigger on: "framer", "framer cms", "framer publish", "framer deploy", "framer collection", "framer article", "push to framer", "upload to framer", "framer api", "framer server api", "cms item", "cms collection", "publish site", "deploy site", "framer preview", "framer image", "framer content". Do NOT trigger for: Framer design/layout work, Framer Motion animation library, building Framer plugins.
Clinic-architecture-aligned iOS animation craft guidelines for SwiftUI (iOS 26 / Swift 6.2) covering motion tokens, spring physics, gesture continuity, spatial transitions, micro-interactions, and accessibility. Enforces @Equatable on animated views and keeps animation state aligned with Domain/Data feature boundaries. Use when writing, reviewing, or refactoring SwiftUI animation code under the clinic modular MVVM-C architecture.
Use Seedance 2.0 (Higgsfield) to convert comic book panels, comic pages, webcomics, and illustration storyboards into animated videos. Apply this when users want to animate comics, bring illustrations to life, convert comics to videos, animate storyboards, or create motion from static sequential art. Trigger words: comic to video, comic animation, panel animation, storyboard to video, webcomic animation, comic book motion, sequential art, graphic novel animation, or any illustration-to-animation request. Also use it when users say "make this drawing move" or "animate this page".
Use this skill when analyzing existing video files using FFmpeg and AI vision, extracting frames for design system generation, detecting scene boundaries, analyzing animation timing, extracting color palettes, or understanding audio-visual sync. Triggers on video analysis, frame extraction, scene detection, ffprobe, motion analysis, and AI vision analysis of video content.
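To illustrate the kind of FFmpeg invocations a skill like this drives, here is a minimal sketch that only builds the command lines for metadata probing, frame extraction, and scene detection. The flags are standard FFmpeg/ffprobe options; the function names and defaults are illustrative assumptions, not the skill's actual interface.

```python
# Sketch: construct standard ffprobe/ffmpeg commands for the analyses
# named above. Helper names and default values are hypothetical.

def probe_cmd(video: str) -> list[str]:
    """ffprobe command that dumps container and stream metadata as JSON."""
    return ["ffprobe", "-v", "quiet", "-print_format", "json",
            "-show_format", "-show_streams", video]

def frame_extract_cmd(video: str, out_pattern: str, fps: float = 1.0) -> list[str]:
    """ffmpeg command that samples frames at `fps` frames per second."""
    return ["ffmpeg", "-i", video, "-vf", f"fps={fps}", out_pattern]

def scene_detect_cmd(video: str, out_pattern: str,
                     threshold: float = 0.4) -> list[str]:
    """ffmpeg command that keeps only frames whose scene-change score
    exceeds `threshold`, i.e. frames that start a new scene."""
    return ["ffmpeg", "-i", video,
            "-vf", f"select='gt(scene,{threshold})'",
            "-vsync", "vfr", out_pattern]
```

Each list can be handed to `subprocess.run(...)`; extracted frames would then go to an AI vision model for palette or timing analysis.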
Design and redesign EdgeSpark frontends with distinctive, production-grade visual direction instead of generic AI-looking UI. Use when building or polishing landing pages, marketing sites, dashboards, auth flows, portfolios, product surfaces, or reusable frontend sections in EdgeSpark, especially when the task mentions design quality, aesthetics, typography, color, layout, motion, art direction, or making the UI feel more premium and original.
Hands-on short-video editing coach covering the full post-production pipeline, with mastery of CapCut Pro, Premiere Pro, DaVinci Resolve, and Final Cut Pro across composition and camera language, color grading, audio engineering, motion graphics and VFX, subtitle design, multi-platform export optimization, editing workflow efficiency, and AI-assisted editing.
Design, build, debug, and optimise high-polish animated graphics in React Native or Expo using @shopify/react-native-skia, Reanimated, and Gesture Handler. Use when the user wants canvas-driven UI, shaders, paths, rich text, image filters, sprite fields, Skottie, video frames, snapshots, web CanvasKit setup, or performance tuning for custom motion-heavy elements such as loaders, hero art, cards, charts, progress indicators, particle systems, or gesture-driven surfaces. Also use when the user asks for fluid, glow, glass, blob, parallax, 60fps/120fps, or GPU-friendly animated effects in React Native, even if they do not explicitly say "Skia". Do not use for ordinary form/layout work with standard views.