[YANKED] Python v0.12.0
The breaking change was causing more issues upstream in transformers than anticipated:
https://github.com/huggingface/transformers/pull/16537#issuecomment-1085682657
The decision was to roll back that breaking change and figure out a different way to make this modification later.
Bump minor version because of a breaking change.
[#938] Breaking change. The Decoder trait is modified to be composable. This is only breaking if you are using decoders on their own; tokenizers itself should be error free.
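The composability introduced here can be pictured as follows. This is a plain-Python sketch of the idea, not the tokenizers API: the Decoder protocol, the decode_chain name, and the StripPrefix/Sequence classes are all illustrative assumptions.

```python
from typing import List, Protocol


class Decoder(Protocol):
    # Each step receives the token pieces and returns transformed pieces,
    # so decoders can be chained rather than each producing a final string.
    def decode_chain(self, tokens: List[str]) -> List[str]: ...


class StripPrefix:
    """Drop a subword-continuation prefix such as '##' (hypothetical example step)."""
    def __init__(self, prefix: str = "##"):
        self.prefix = prefix

    def decode_chain(self, tokens: List[str]) -> List[str]:
        return [t[len(self.prefix):] if t.startswith(self.prefix) else " " + t
                for t in tokens]


class Sequence:
    """Run several decoders one after another: the composability in question."""
    def __init__(self, decoders: List[Decoder]):
        self.decoders = decoders

    def decode(self, tokens: List[str]) -> str:
        for d in self.decoders:
            tokens = d.decode_chain(tokens)
        return "".join(tokens).strip()


decoder = Sequence([StripPrefix("##")])
print(decoder.decode(["hug", "##ging", "face"]))  # -> hugging face
```

Code that called a standalone decoder and expected a final string in one step is what breaks; chained decoders pass intermediate pieces along instead.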
[#939] Making the regex in ByteLevel pre_tokenizer optional (necessary for BigScience)
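The change makes the pre-splitting regex skippable before the byte mapping. A rough plain-Python illustration, not the library's implementation; the function name and the simplified split pattern (a stand-in for GPT-2's real splitting regex) are assumptions:

```python
import re

def byte_level_pretokenize(text: str, use_regex: bool = True):
    # With use_regex=True, split the text into word-like chunks first;
    # with use_regex=False, treat the whole text as a single chunk
    # (the behavior BigScience needed). The pattern below is a
    # simplified stand-in for GPT-2's actual splitting regex.
    chunks = re.findall(r"\s*\S+", text) if use_regex else [text]
    # Map each chunk to its raw UTF-8 bytes (the "byte level" part).
    return [list(chunk.encode("utf-8")) for chunk in chunks]

print(byte_level_pretokenize("hello world", use_regex=True))   # two chunks
print(byte_level_pretokenize("hello world", use_regex=False))  # one chunk
```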
[#952] Fixed the vocabulary size of UnigramTrainer output (to respect added tokens)
[#954] Fixed not being able to save vocabularies with holes in them (ConvBert). Warnings are now emitted instead of panicking.
[#962] Fix tests for python 3.10
[#961] Added link for Ruby port of tokenizers