
Use for local model #1

Open
Chasapas opened this issue Feb 19, 2024 · 1 comment
Chasapas commented Feb 19, 2024

Hey. Can I use this to convert a local fine-tuned model (LoRA)?

brittlewis12 (Owner) commented

Hey @Chasapas! Right now this script doesn't support LoRA conversion, but llama.cpp itself does have a script for converting LoRAs! Take a look here:
https://github.com/ggerganov/llama.cpp/blob/master/convert-lora-to-ggml.py

A little more detail on how to use it here:
ggerganov/llama.cpp#5360

If you're interested, I could explore adding this as an option, but I haven't trained any LoRAs myself so I haven't had an occasion to look into this further so far!
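For reference, invoking that llama.cpp conversion script generally looks like the sketch below. The exact arguments depend on the llama.cpp version you have checked out, and all paths here are hypothetical placeholders, so treat this as an outline rather than a definitive recipe:

```shell
# Grab llama.cpp, which ships the LoRA conversion script
# (sketch; adjust for your checkout and Python environment)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# Convert the LoRA adapter directory produced by fine-tuning
# (the one containing adapter_config.json and the adapter weights)
# into a GGML-format adapter file. /path/to/my-lora-adapter is a placeholder.
python convert-lora-to-ggml.py /path/to/my-lora-adapter
```

The resulting GGML adapter file can then be applied on top of a compatible base model at inference time via llama.cpp's `--lora` flag.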
