RankLLM: Merge LiT5 Models into RankLLM #35
I've completed the repro for rank_llm and LiT5-Distill. Thanks @ronakice @manveertamber
I'm working on this. A potential solution for the integration (thanks @ronakice): put the FiD model code in a rank_fid.py and the reranker wrapper in a lit5_reranker.py.
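As a rough illustration of the split proposed above, here is a minimal sketch of what the lit5_reranker.py side could look like. All class and method names here are hypothetical, not the actual RankLLM API; the model call into rank_fid.py is stubbed out, and only the permutation-parsing step (turning output like `[2] > [1] > [3]` into a reordering) is shown concretely.

```python
# Hypothetical sketch: a lit5_reranker.py-style wrapper for listwise reranking.
# Names (Candidate, Request, LiT5Reranker) are illustrative assumptions; a real
# implementation would call the FiD model defined in rank_fid.py.

import re
from dataclasses import dataclass, field
from typing import List


@dataclass
class Candidate:
    docid: str
    text: str


@dataclass
class Request:
    query: str
    candidates: List[Candidate] = field(default_factory=list)


class LiT5Reranker:
    """Skeleton reranker: the model (not shown) would generate a ranking
    string such as '[2] > [1] > [3]' over the candidate list."""

    def parse_permutation(self, output: str, n: int) -> List[int]:
        # Extract bracketed 1-based indices, drop duplicates and
        # out-of-range values, then append any indices the model omitted.
        seen: List[int] = []
        for m in re.findall(r"\[(\d+)\]", output):
            i = int(m) - 1
            if 0 <= i < n and i not in seen:
                seen.append(i)
        seen += [i for i in range(n) if i not in seen]
        return seen

    def rerank(self, request: Request, model_output: str) -> Request:
        order = self.parse_permutation(model_output, len(request.candidates))
        reranked = [request.candidates[i] for i in order]
        return Request(query=request.query, candidates=reranked)
```

The parsing step is defensive on purpose: generated rankings can repeat or drop indices, so the wrapper deduplicates and backfills rather than trusting the raw output.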
Perhaps also look at https://github.com/soyoung97/ListT5/blob/main/test.py, a contemporary and very similar work. Not sure, but maybe @soyoung97 can also verify their implementation! :)
I've pushed a draft version of the LiT5-Distill integration into RankLLM at 17Melissa/rank_llm@eba8cc3. The data.py and model.py scripts are set up, but I'm still figuring out how to integrate them. Any insights or suggestions would be appreciated!
This is great stuff, Melissa. We've been refactoring RankLLM, so it would be great if we can align things with that. I'll give more comments tomorrow :)
Made a pull request, castorini/rank_llm#116, with what has been accomplished so far and potential next steps.
The LiT5 model suite should be incorporated into RankLLM to centralize our models.
Relevant Repos:
To work on this task, start with the repro for RankZephyr here: https://github.com/castorini/rank_llm?tab=readme-ov-file#run-end-to-end-test
Then, attempt to repro LiT5 (just LiT5-Distill.sh): https://github.com/castorini/LiT5. You may need to adjust the batch sizes to fit within GPU memory limits; more details can be found in the script.
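For background on what these repro steps exercise: listwise rerankers in the RankLLM family can only rank a limited number of passages per prompt, so they typically process the candidate list in sliding windows, moving back-to-front so strong candidates can bubble up to the top. Below is a minimal sketch of that windowing logic under common default-style parameters (window 20, stride 10); the function names are illustrative, not RankLLM's actual API.

```python
# Hypothetical sketch of sliding-window listwise reranking. The rank_window
# callable stands in for a model call (e.g., LiT5 or RankZephyr) that returns
# a permutation of the indices of the window it was given.

from typing import Callable, List


def sliding_window_rerank(
    docs: List[str],
    rank_window: Callable[[List[str]], List[int]],
    window_size: int = 20,
    stride: int = 10,
) -> List[str]:
    """Rerank back-to-front so top candidates can move toward the head."""
    docs = list(docs)
    end = len(docs)
    while end > 0:
        start = max(0, end - window_size)
        window = docs[start:end]
        order = rank_window(window)          # permutation of range(len(window))
        docs[start:end] = [window[i] for i in order]
        if start == 0:
            break
        end -= stride                        # slide the window toward the front
    return docs
```

Because consecutive windows overlap by `window_size - stride` positions, a relevant document ranked last in the initial list can be promoted across several windows until it reaches the front.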