Repro code:
model5 = keras_hub.models.CausalLM.from_preset("hf://tiiuae/falcon-7b-instruct", dtype="bfloat16")

Result:
ValueError: KerasHub has no converter for huggingface/transformers models with model type 'falcon'
Thanks for reporting the issue. You can initialize the falcon-7b-instruct model using the transformers AutoTokenizer and AutoModelForCausalLM classes:
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_name = "tiiuae/falcon-7b-instruct"

# Load the tokenizer, then the model weights in bfloat16,
# letting accelerate place layers across available devices.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
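Once loaded this way, the model can be exercised with a short generation call. A minimal, self-contained sketch (the prompt text and generation settings below are illustrative, not from this thread):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_name = "tiiuae/falcon-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map="auto"
)

# Tokenize an illustrative prompt and move it to the model's device.
inputs = tokenizer("Write a haiku about falcons.", return_tensors="pt").to(model.device)

# Decode up to 50 new tokens and print the continuation.
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that generation with the 7B model requires a GPU (or substantial CPU RAM) and downloads several gigabytes of weights on first use.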
And to load a model from the Falcon family that is available in KerasHub (for example falcon_refinedweb_1b_en), you can use:

model5 = keras_hub.models.CausalLM.from_preset("hf://keras/falcon_refinedweb_1b_en", dtype="bfloat16")
Repro code:
model5 = keras_hub.models.CausalLM.from_preset("hf://tiiuae/falcon-7b-instruct", dtype="bfloat16")
Result:
ValueError: KerasHub has no converter for huggingface/transformers models with model type 'falcon'
Now that the Falcon model family exists in KerasHub, this should work.