long-context

Community

Extend the context windows of transformer models using RoPE, YaRN, ALiBi, and position interpolation. Use when processing long documents (32k to 128k+ tokens), extending pre-trained models beyond their original context limits, or implementing efficient positional encodings. Covers rotary embeddings, attention biases, interpolation methods, and extrapolation strategies for LLMs.
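
As a concrete illustration of two of the techniques named above, here is a minimal NumPy sketch of rotary position embeddings (RoPE) combined with linear position interpolation, where positions are rescaled by trained_length / extended_length so an extended context maps back into the position range the model was trained on. All names here (`rope_angles`, `apply_rope`, the 4k-to-16k figures) are illustrative assumptions, not part of this skill's API or any specific library.

```python
# Hypothetical sketch of RoPE with linear position interpolation.
import numpy as np

def rope_angles(positions, head_dim, base=10000.0, scale=1.0):
    """Per-position rotation angles. scale < 1 implements linear
    position interpolation: positions are compressed so an extended
    context falls inside the trained position range."""
    inv_freq = 1.0 / (base ** (np.arange(0, head_dim, 2) / head_dim))
    return np.outer(positions * scale, inv_freq)  # (seq, head_dim / 2)

def apply_rope(x, angles):
    """Rotate consecutive feature pairs of x (seq, head_dim) by angles."""
    x1, x2 = x[:, 0::2], x[:, 1::2]
    cos, sin = np.cos(angles), np.sin(angles)
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# Example: a model trained on 4096 positions, extended to 16384 by
# interpolating positions with scale = 4096 / 16384 = 0.25.
seq_len, head_dim = 16384, 64
q = np.random.randn(seq_len, head_dim).astype(np.float32)
angles = rope_angles(np.arange(seq_len), head_dim, scale=4096 / 16384)
q_rot = apply_rope(q, angles)
print(q_rot.shape)  # (16384, 64)
```

YaRN refines this idea by scaling different frequency bands differently, while ALiBi skips rotation entirely and adds a linear distance penalty to attention scores; the description above lists all of these as covered by the skill.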

Install

skillpm install long-context

Format score: 85/100
Spec: v1.0
Installs: 0
Published: April 1, 2026