v1.1.3
- Fix a bug in an unused function (the function was never called, so tokenizer results were unaffected)
- Support very large inputs (previous versions were not guaranteed to produce correct results for inputs longer than 100,000 characters, although in practice they almost always did)