Can FlagEmbedding's dependency on transformers be upgraded to newer versions? #1266
I've also encountered a bug caused by a conflict between FlagEmbedding and the installed transformers version:
File /data/user/usr/anaconda3/envs/py11/lib/python3.11/site-packages/FlagEmbedding/inference/reranker/decoder_only/models/gemma_model.py:56
54 from .gemma_config import CostWiseGemmaConfig
55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
59 if is_flash_attn_2_available():
ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2'

The error is raised with transformers==4.47.0.
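If pinning or upgrading is not immediately possible, one way to make the failure explicit is to probe for the removed symbols before importing FlagEmbedding. This is only a sketch based on the import at line 56 above; the suggested fallback pin is an assumption, not a documented requirement:

# Defensive probe, a sketch only: check for the symbols that gemma_model.py
# line 56 needs before importing FlagEmbedding, so the failure is a clear
# message instead of a deep ImportError.
try:
    from transformers.models.gemma2.modeling_gemma2 import (  # noqa: F401
        GEMMA2_ATTENTION_CLASSES,
        Gemma2FlashAttention2,
        Gemma2SdpaAttention,
    )
except ImportError as exc:
    raise RuntimeError(
        "This transformers release no longer exports the Gemma2 attention "
        "symbols FlagEmbedding imports; pin an older transformers release "
        "(an older 4.4x release is an assumption, not a documented bound) "
        "or upgrade FlagEmbedding once a fixed release is out."
    ) from exc

from FlagEmbedding import BGEM3FlagModel  # safe: the probe above passed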
@545999961 I have the same problem when just importing the embedding model: from FlagEmbedding import BGEM3FlagModel
File d:\conda\win\envs\prod\Lib\site-packages\FlagEmbedding\inference\reranker\decoder_only\models\gemma_model.py:56
54 from .gemma_config import CostWiseGemmaConfig
55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
59 if is_flash_attn_2_available():

ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2' (d:\conda\win\envs\prod\Lib\site-packages\transformers\models\gemma2\modeling_gemma2.py)
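For what it's worth, the minimal reproduction appears to be the import itself; even embedding-only usage fails, presumably because the package's top-level imports pull in the Gemma reranker module:

# Minimal reproduction (assumes FlagEmbedding <= 1.3.3 with transformers >= 4.47,
# per the reports above); no model weights are needed to trigger the error.
from FlagEmbedding import BGEM3FlagModel  # raises the ImportError shown above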
Are there any updated plans or timelines for upgrading to the latest version of transformers?
Adding to the urgency around this. Would be great to have this merged.
Yes, can we please have some action on this? There is even a pull request for it; you just need to merge it and release it.
Hello, everyone! We've just released version 1.3.4 on PyPI. This update fixes the dependency issue discussed here. Thanks to @Hypothesis-Z for submitting PR #1343!
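After upgrading (e.g. pip install -U FlagEmbedding), a quick sanity check might look like the following; version 1.3.4 is from the comment above, everything else is standard-library introspection:

# Confirm the installed versions and that the import now succeeds.
from importlib.metadata import version

print("FlagEmbedding:", version("FlagEmbedding"))  # expect >= 1.3.4
print("transformers:", version("transformers"))

from FlagEmbedding import BGEM3FlagModel  # should import without the Gemma2 error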
I have other packages that depend on newer versions of transformers, such as sentence-transformers and marker-pdf. Could future releases relax the transformers version requirement?
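To see which installed packages declare a constraint on transformers (and would therefore conflict with a tight cap), a rough audit sketch using only the standard library is:

# Rough audit sketch: print every installed distribution whose declared
# requirements mention transformers. Purely illustrative; it inspects
# whatever happens to be installed in the current environment.
from importlib.metadata import distributions

for dist in distributions():
    for req in dist.requires or []:
        # crude prefix match; may also catch names like transformers-foo
        if req.lower().startswith("transformers"):
            print(dist.metadata["Name"], "->", req)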