knowledge-distillation

Community

Compress large language models by distilling knowledge from a teacher model into a smaller student model. Use it when deploying smaller models that retain most of the teacher's performance, transferring GPT-4 capabilities to open-source models, or reducing inference costs. Covers temperature scaling, soft targets, reverse KLD, logit distillation, and MiniLLM training strategies.
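As a rough illustration of two of the losses named above, the sketch below shows temperature-scaled soft-target distillation (forward KLD over logits) and the reverse KLD objective associated with MiniLLM. This is a minimal PyTorch sketch, not this skill's actual API; the function names and default temperatures are illustrative assumptions.

```python
# Minimal sketch of logit-distillation losses; names here are illustrative,
# not part of the knowledge-distillation skill's interface.
import torch
import torch.nn.functional as F

def forward_kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Standard soft-target distillation: KL(teacher || student) over
    temperature-softened distributions (Hinton-style soft targets)."""
    # A higher temperature flattens the teacher's distribution, exposing
    # information in its non-argmax probabilities.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # F.kl_div expects log-probs as input and probs as target; the T^2
    # factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

def reverse_kd_loss(student_logits, teacher_logits, temperature=1.0):
    """Reverse KLD, KL(student || teacher): mode-seeking, so the student
    concentrates mass on the teacher's high-probability regions rather
    than spreading it over the teacher's whole distribution."""
    student_probs = F.softmax(student_logits / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_log_probs = F.log_softmax(teacher_logits / temperature, dim=-1)
    # KL(p_s || p_t) = sum_v p_s(v) * (log p_s(v) - log p_t(v))
    return (student_probs * (student_log_probs - teacher_log_probs)).sum(-1).mean()
```

In a typical training step, the distillation term is mixed with the ordinary cross-entropy loss on ground-truth labels, with a weighting coefficient tuned per task.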

Install

skillpm install knowledge-distillation

Format score

95/100

Spec

v1.0

Installs

0

Published

April 1, 2026