Found 5 Skills
Educational GPT implementation in ~300 lines. Reproduces GPT-2 (124M) on OpenWebText. Clean, hackable code for learning transformers. By Andrej Karpathy. Perfect for understanding GPT architecture from scratch. Train on Shakespeare (CPU) or OpenWebText (multi-GPU).
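The core idea this repo teaches is causal self-attention. Below is a minimal illustrative sketch in PyTorch, reduced from the general transformer recipe; it is not Karpathy's actual code, and all names (CausalSelfAttention, qkv, block_size) are assumptions for illustration only.

```python
# Minimal causal self-attention sketch (illustrative, not nanoGPT's code).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, n_embd: int, n_head: int, block_size: int):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.qkv = nn.Linear(n_embd, 3 * n_embd)   # joint Q, K, V projection
        self.proj = nn.Linear(n_embd, n_embd)
        # causal mask: token t may only attend to tokens <= t
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=2)
        # reshape to (B, n_head, T, head_dim) for per-head attention
        q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
        att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = (att @ v).transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)

# Usage: one attention block over a batch of 4 sequences of length 8
block = CausalSelfAttention(n_embd=64, n_head=4, block_size=128)
print(block(torch.randn(4, 8, 64)).shape)  # torch.Size([4, 8, 64])
```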
A module that teaches the essence of LLMs (probabilistic next-token prediction), the structural causes of hallucination, and the meaning of temperature.
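Temperature's meaning is easiest to see in code: it rescales the logits before softmax, so low values sharpen the distribution toward the argmax and high values flatten it. A minimal sketch follows; the function and variable names are generic illustrations, not taken from this module.

```python
# Temperature scaling in next-token sampling (illustrative sketch).
import numpy as np

def sample_next_token(logits: np.ndarray, temperature: float = 1.0) -> int:
    """Sample a token id from logits; lower temperature sharpens the distribution."""
    scaled = logits / max(temperature, 1e-8)   # T -> 0 approaches greedy argmax
    probs = np.exp(scaled - scaled.max())      # numerically stable softmax
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

# Example: the same logits sampled near-greedily (T=0.2) vs diversely (T=1.5)
logits = np.array([2.0, 1.0, 0.1])
print(sample_next_token(logits, temperature=0.2))
print(sample_next_token(logits, temperature=1.5))
```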
Simulate a third-year senior high school (Gaokao-year) science tutor in China, using progressive teaching methods to tutor problems in math, physics, chemistry, biology, and other sciences. Used when students ask science questions, request explanations, or say "I don't understand" or "teach me". Suitable for Gaokao preparation, problem-solving guidance, and concept understanding.
Generate pedagogically-aligned slide decks from educational content using NotebookLM. Use when creating chapter slide presentations with proficiency-calibrated prompts. NOT for static slides or non-educational presentations.
This skill generates comprehensive chapter content for intelligent textbooks after the book-chapter-generator skill has created the chapter structure. Use this skill when a chapter index.md file exists with title, summary, and concept list, and detailed educational content needs to be generated at the appropriate reading level with rich non-text elements including diagrams, infographics, and MicroSims. (project, gitignored)
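The precondition this skill describes (a chapter index.md with title, summary, and concept list) can be checked mechanically. The sketch below is hypothetical: the directory layout, frontmatter format, and field names are assumptions, not documented behavior of the skill.

```python
# Hypothetical readiness check for a chapter produced by book-chapter-generator.
# File layout and frontmatter keys are assumptions for illustration only.
from pathlib import Path
import re

REQUIRED_FIELDS = ("title", "summary", "concepts")  # assumed frontmatter keys

def chapter_is_ready(chapter_dir: str) -> bool:
    index = Path(chapter_dir) / "index.md"
    if not index.is_file():
        return False
    text = index.read_text(encoding="utf-8")
    m = re.match(r"^---\n(.*?)\n---", text, re.DOTALL)  # YAML frontmatter block
    if not m:
        return False
    frontmatter = m.group(1)
    return all(re.search(rf"^{field}\s*:", frontmatter, re.MULTILINE)
               for field in REQUIRED_FIELDS)

print(chapter_is_ready("docs/chapters/01-introduction"))  # hypothetical path
```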