Commit
Fix handling of added/expanded model tokens. For transformers, tokenizer.vocab_size excludes all tokens added via token expansion; the correct usage here is len(tokenizer). (#83)
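The distinction can be demonstrated with a short sketch, assuming the Hugging Face transformers library is installed and a checkpoint (here `bert-base-uncased`, chosen only for illustration) is available:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
base = tokenizer.vocab_size  # size of the base vocabulary only

# Expand the vocabulary with two new special-purpose tokens.
tokenizer.add_tokens(["<new_tok_1>", "<new_tok_2>"])

# vocab_size still reports the base vocabulary, ignoring added tokens,
# while len(tokenizer) reflects the full expanded vocabulary.
assert tokenizer.vocab_size == base
assert len(tokenizer) == base + 2
```

Any code that sizes an embedding matrix or iterates over token ids should therefore use len(tokenizer), or added tokens will fall outside the assumed range.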