
nanogpt

Community

Educational GPT implementation by Andrej Karpathy in ~300 lines of PyTorch. Reproduces GPT-2 (124M) on OpenWebText. Clean, hackable code for understanding the transformer architecture from scratch. Train a small character-level model on Shakespeare (runs on CPU) or the full GPT-2 reproduction on OpenWebText (multi-GPU).
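The architectural core the repo teaches is causal self-attention. As a rough illustration only (nanoGPT's actual code is PyTorch with multi-head attention; this is a hypothetical single-head NumPy sketch), the mechanism looks like:

```python
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head causal self-attention over a (T, C) sequence.

    Each position attends only to itself and earlier positions,
    which is what lets a GPT be trained as a next-token predictor.
    """
    T, C = x.shape
    q, k, v = x @ w_q, x @ w_k, x @ w_v            # project to queries/keys/values
    att = (q @ k.T) / np.sqrt(C)                   # scaled dot-product scores
    future = np.triu(np.ones((T, T), dtype=bool), k=1)
    att[future] = -np.inf                          # causal mask: no peeking ahead
    att = np.exp(att - att.max(axis=-1, keepdims=True))
    att /= att.sum(axis=-1, keepdims=True)         # row-wise softmax
    return att @ v                                 # weighted sum of values

rng = np.random.default_rng(0)
T, C = 4, 8
x = rng.normal(size=(T, C))
w_q, w_k, w_v = (rng.normal(size=(C, C)) for _ in range(3))
out = causal_self_attention(x, w_q, w_k, w_v)
```

Because of the mask, perturbing a later token leaves the outputs at earlier positions unchanged, which you can verify directly on this sketch.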

Install

skillpm install nanogpt

Format score

100/100

Spec

v1.0

Installs

0

Published

April 1, 2026