
huggingface-tokenizers

Community

Fast tokenizers optimized for research and production. The Rust-based implementation tokenizes a gigabyte of text in under 20 seconds. Supports the BPE, WordPiece, and Unigram algorithms; trains custom vocabularies, tracks token-to-text alignments, and handles padding and truncation. Integrates seamlessly with transformers. Use it when you need high-performance tokenization or custom tokenizer training.

Install

skillpm install huggingface-tokenizers
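The training and alignment features described above can be sketched with the `tokenizers` Python package (installed separately via `pip install tokenizers`). A minimal sketch: train a small BPE vocabulary from an in-memory corpus, then encode a string and inspect the token-to-text offsets. The tiny corpus and vocabulary size here are illustrative assumptions, not defaults.

```python
from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Build a BPE tokenizer with an unknown-token fallback.
tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()

# Train a small custom vocabulary from an iterator of raw strings.
# (Corpus and vocab_size are illustrative; real corpora are files or datasets.)
corpus = [
    "fast tokenizers are fast",
    "tokenizers train custom vocabularies",
    "padding and truncation are handled too",
]
trainer = trainers.BpeTrainer(vocab_size=200, special_tokens=["[UNK]", "[PAD]"])
tokenizer.train_from_iterator(corpus, trainer)

# Encode and inspect alignments: each token carries its character span.
enc = tokenizer.encode("fast tokenizers")
print(enc.tokens)   # learned subword tokens
print(enc.offsets)  # (start, end) character offsets back into the input
```

Because encodings keep character offsets, downstream tasks such as NER can map predictions back to exact spans in the original text.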

Format score

85/100

Spec

v1.0

Installs

0

Published

April 1, 2026