
Support MiniMaxAI/MiniMax-Text-01 ({% generation %} blocks) #28

Closed
ochafik opened this issue Jan 22, 2025 · 0 comments · Fixed by #29


ochafik commented Jan 22, 2025

Reported by @fairydreaming here (thanks again!)

It's a custom Jinja extension introduced by HuggingFace in huggingface/transformers#30650.

It looks like it's only useful during training (to help mask generated tokens out of the loss), so it should be fine to treat it as a no-op for inference purposes.
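A minimal sketch of the no-op treatment, using a custom jinja2 `Extension` (the class and names below are illustrative, not necessarily how #29 implements it):

```python
from jinja2 import Environment, TemplateSyntaxError, nodes
from jinja2.ext import Extension


class GenerationExtension(Extension):
    """Parses {% generation %} ... {% endgeneration %} and emits the
    body unchanged, i.e. treats the tag as a no-op wrapper."""

    tags = {"generation"}

    def parse(self, parser):
        lineno = next(parser.stream).lineno  # consume the 'generation' token
        body = parser.parse_statements(
            ("name:endgeneration",), drop_needle=True
        )
        # Wrap the body in a Scope node so it renders as-is.
        return nodes.Scope(body, lineno=lineno)


template = "A{% generation %}B{% endgeneration %}C"

# Stock jinja2 rejects the unknown tag...
try:
    Environment().from_string(template)
except TemplateSyntaxError as e:
    print("stock jinja2:", e)

# ...but with the no-op extension the body renders through.
env = Environment(extensions=[GenerationExtension])
print(env.from_string(template).render())  # ABC
```

During training, the real extension records which output spans fall inside the block so their tokens can be masked; at inference time only the rendered text matters, so passing the body through unchanged is enough.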

But since it breaks jinja2, it may be time to switch the Python expectations logic over to the transformers library, which may also simplify the tests. (Tracking in #30.)
