Asking to pad but the tokenizer does not have a padding token #29
christiandarkin started this conversation in General
-
Add these lines here: https://github.com/zetavg/LLaMA-LoRA-Tuner/blob/98dba8d/llama_lora/lib/finetune.py#L246

    if tokenizer.pad_token is None:
        print("Tokenizer has no pad_token set, setting it to 1.")
        tokenizer.pad_token_id = 1
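For reference, here is a runnable sketch of that guard. It uses a stand-in tokenizer class (so it runs without downloading a model), but the attribute names (pad_token, pad_token_id, eos_token, eos_token_id) match the Hugging Face transformers tokenizer API. It also shows a common alternative to hard-coding id 1: reusing the EOS token as the pad token.

```python
# Stand-in for a Hugging Face tokenizer that ships without a pad token.
# Attribute names mirror the real PreTrainedTokenizer API; the token
# values here are illustrative (LLaMA-style "</s>" EOS).
class StubTokenizer:
    def __init__(self):
        self.pad_token = None
        self.pad_token_id = None
        self.eos_token = "</s>"
        self.eos_token_id = 2

def ensure_pad_token(tokenizer):
    """Give the tokenizer a pad token if it lacks one.

    Reusing the EOS token is a common choice; hard-coding
    pad_token_id = 1 (as in the snippet above) also works for
    LLaMA-family vocabularies.
    """
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token
        tokenizer.pad_token_id = tokenizer.eos_token_id
    return tokenizer

tok = ensure_pad_token(StubTokenizer())
print(tok.pad_token, tok.pad_token_id)  # </s> 2
```

With a real model you would apply the same guard right after `AutoTokenizer.from_pretrained(...)`, before padding or batching anything.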
-
I keep getting the error "Asking to pad but the tokenizer does not have a padding token", but there doesn't seem to be an option for it in the GUI. Any idea how to fix this? (I'm trying to train with "TheBloke/Wizard-Vicuna-7B-Uncensored-HF".)
Thanks.