We released a new open source byte-pair tokenizer that is faster and more flexible than popular alternatives.
Read the full post, *So many tokens, so little time: Introducing a faster, more flexible byte-pair tokenizer*, on [The GitHub Blog](https://github.blog/ai-and-ml/llms/so-many-tokens-so-little-time-introducing-a-faster-more-flexible-byte-pair-tokenizer/).
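
To make the announcement concrete, here is a minimal sketch of what a byte-pair encoder does: it starts from raw bytes and repeatedly applies the highest-priority merge rule from a learned vocabulary until no rule applies. The greedy quadratic loop and the toy merge table below are illustrative assumptions only, not the released library's API or algorithm; the linked post is precisely about doing this step much faster.

```rust
use std::collections::HashMap;

/// Token ids: raw bytes occupy 0..=255, merged tokens get higher ids.
type TokenId = u32;

/// Naive byte-pair encoding: repeatedly merge the adjacent token pair with
/// the lowest merge rank until no mergeable pair remains. This is the
/// textbook approach, shown only to illustrate the idea.
fn encode(text: &[u8], ranks: &HashMap<(TokenId, TokenId), (u32, TokenId)>) -> Vec<TokenId> {
    // Start with one token per input byte.
    let mut tokens: Vec<TokenId> = text.iter().map(|&b| b as TokenId).collect();

    loop {
        // Find the adjacent pair with the best (lowest) merge rank.
        let best = tokens
            .windows(2)
            .enumerate()
            .filter_map(|(i, w)| ranks.get(&(w[0], w[1])).map(|&(rank, id)| (rank, i, id)))
            .min();

        match best {
            Some((_rank, i, id)) => {
                // Replace the pair at position i with its merged token.
                tokens[i] = id;
                tokens.remove(i + 1);
            }
            None => return tokens,
        }
    }
}

fn main() {
    // Hypothetical merge table: (left, right) -> (rank, merged token id).
    // Real vocabularies contain tens of thousands of such rules learned from data.
    let mut ranks = HashMap::new();
    ranks.insert((b'a' as TokenId, b'b' as TokenId), (0, 256)); // "ab"       -> 256
    ranks.insert((256, b'c' as TokenId), (1, 257));             // "ab" + "c" -> 257

    let tokens = encode(b"abcab", &ranks);
    println!("{:?}", tokens); // prints [257, 256]
}
```

The sketch rescans every pair on each merge, so it degrades badly on long inputs; a production tokenizer has to avoid exactly that kind of rework, which is where the speed claims in the post come from.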