Issues: huggingface/lighteval
- #546 [BUG] tiktoken is not optional dependency (bug), opened Feb 8, 2025 by hynky1999
- #545 couldn't find it in the cached files and it looks like Elron/bleurt-tiny-512, how to set the model path?, opened Feb 8, 2025 by bannima
- #544 More flexibility in parameters for OpenAI / LiteLLM (feature request), opened Feb 7, 2025 by satpalsr
- #531 [FT] Faster generation with TransformersModel by using less padding (feature request), opened Feb 3, 2025 by rolshoven
- #525 lighteval with llama3.2 [RuntimeError: No executable batch size found, reached zero.], opened Jan 29, 2025 by Nevermetyou65
- #523 [BUG] ImportError: cannot import name 'ExprExtractionConfig' from 'lighteval.metrics.dynamic_metrics' (bug), opened Jan 28, 2025 by hlzhang109
- #496 [FT] Enable lazy model initialization (feature request), opened Jan 11, 2025 by JoelNiklaus
- #489 [FT] Custom model to TransformersModel (feature request), opened Jan 7, 2025 by Giuseppe5
- #487 [BUG] By default pip install lighteval is installing the cpu only torch version, its killing dependencies. (bug), opened Jan 6, 2025 by kzos
- #482 [FT] Add and test multinode runs back (feature request), opened Jan 2, 2025 by clefourrier
- #478 [FT] Enhancing CorpusLevelTranslationMetric with Asian Language Support (feature request), opened Dec 27, 2024 by ryan-minato
- #474 [FT] JudgeLLM should support litellm backend (feature request), opened Dec 22, 2024 by JoelNiklaus
- #462 [BUG] Issue with LightevalTaskConfig.stop_sequence Attribute When Unset (bug), opened Dec 19, 2024 by ryan-minato
- #460 [BUG] Issue with CACHE_DIR Default Value in Accelerate Pipeline (bug), opened Dec 19, 2024 by ryan-minato
- #458 [FT] remove openai endpoint and only use litellm (feature request), opened Dec 18, 2024 by NathanHB
- #439 [FT] Align parameter names in config files and config classes (feature request), opened Dec 12, 2024 by albertvillanova
- #436 [FT] Fail faster when passing unsupported metrics to InferenceEndpointModel (feature request), opened Dec 11, 2024 by albertvillanova
- #430 [FT] Enable the evaluation of any function (feature request), opened Dec 10, 2024 by JoelNiklaus