An error occurred: The checkpoint you are trying to load has model type qwen2_vl but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
#114 · Open · kicks66 opened this issue on Sep 23, 2024 · 1 comment
I'm getting the following error when using the vLLM template:

An error occurred: The checkpoint you are trying to load has model type qwen2_vl but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

I believe it's because the latest version of transformers is required:
pip install git+https://github.com/huggingface/transformers accelerate
Is it possible to install this on top of the template's existing environment?
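Since the error message points at an outdated Transformers, a quick sanity check before reinstalling is to compare the installed version against the release that introduced `qwen2_vl` support. The sketch below assumes that release was transformers 4.45.0 (an assumption based on the Qwen2-VL announcement timing; check the Transformers release notes for your checkpoint) and uses only the standard library so it runs in any environment:

```python
# Sketch: check whether an installed transformers version is new enough
# for the qwen2_vl architecture. MIN_VERSION is an assumption (4.45.0);
# verify against the Transformers release notes.
MIN_VERSION = (4, 45, 0)

def parse_version(v: str) -> tuple:
    """Parse a dotted version like '4.44.2' into a tuple of ints.

    Non-numeric characters inside a segment (e.g. the 'dev0' in
    '4.45.0.dev0') are stripped; a fully non-numeric segment stops parsing.
    """
    parts = []
    for piece in v.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)

def supports_qwen2_vl(installed: str) -> bool:
    """Return True if `installed` meets the assumed minimum version."""
    return parse_version(installed) >= MIN_VERSION

if __name__ == "__main__":
    for v in ("4.44.2", "4.45.0", "4.45.0.dev0"):
        print(v, supports_qwen2_vl(v))
```

In practice you would feed it `importlib.metadata.version("transformers")`; if the check fails, reinstalling from source as shown in the thread (`pip install git+https://github.com/huggingface/transformers`) should pull in the new architecture registration.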