
Can FlagEmbedding's dependency on transformers be upgraded to a newer version? #1266

Open
firezym opened this issue Nov 29, 2024 · 7 comments

firezym commented Nov 29, 2024

I have other packages, such as sentence-transformers and marker-pdf, that depend on higher versions of transformers. Could future releases relax the transformers version requirement?

flagembedding 1.3.2 has requirement transformers==4.44.2, but you have transformers 4.46.3.
marker-pdf 0.3.10 has requirement transformers<5.0.0,>=4.45.2, but you have transformers 4.44.2.
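
For anyone triaging this locally, here is a minimal sketch (standard library only; the package names are the ones from the messages above, and pip check reports the same conflicts) that prints the installed versions and the declared pin:

from importlib.metadata import PackageNotFoundError, requires, version

# Print what is actually installed for each package named in the conflict.
for pkg in ("FlagEmbedding", "transformers", "marker-pdf"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "is not installed")

# The declared requirements of FlagEmbedding 1.3.2 include transformers==4.44.2.
print(requires("FlagEmbedding"))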

Nimtaa commented Dec 12, 2024

I’ve also encountered a bug caused by a conflict between transformers==4.44.2 and accelerate==1.2.0. It would be great if you could update the transformers dependency to the latest available version.

@conderls

File /data/user/usr/anaconda3/envs/py11/lib/python3.11/site-packages/FlagEmbedding/inference/reranker/decoder_only/models/gemma_model.py:56
     54 from .gemma_config import CostWiseGemmaConfig
     55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
     57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
     59 if is_flash_attn_2_available():

ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2'

This error is raised with transformers==4.47.0.
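
A minimal reproduction independent of FlagEmbedding, assuming only what this traceback shows (the name imports on the pinned transformers==4.44.2 but is gone by 4.47.0):

import transformers

try:
    # This is the import that gemma_model.py line 56 performs.
    from transformers.models.gemma2.modeling_gemma2 import GEMMA2_ATTENTION_CLASSES
    print("GEMMA2_ATTENTION_CLASSES is available on", transformers.__version__)
except ImportError:
    # On 4.47.0 the symbol is missing, so fail with a clearer message.
    raise SystemExit(
        f"transformers=={transformers.__version__} no longer exports "
        "GEMMA2_ATTENTION_CLASSES; pin transformers==4.44.2 until "
        "FlagEmbedding supports newer releases."
    )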


firezym commented Dec 30, 2024

I have the same problem when simply running the import below (cc @545999961):

from FlagEmbedding import BGEM3FlagModel

File d:\conda\win\envs\prod\Lib\site-packages\FlagEmbedding\inference\reranker\decoder_only\models\gemma_model.py:56
     54 from .gemma_config import CostWiseGemmaConfig
     55 from transformers.models.gemma2.modeling_gemma2 import Gemma2RMSNorm, Gemma2RotaryEmbedding, rotate_half, apply_rotary_pos_emb
---> 56 from transformers.models.gemma2.modeling_gemma2 import Gemma2MLP, repeat_kv, Gemma2Attention, Gemma2FlashAttention2, Gemma2SdpaAttention, GEMMA2_ATTENTION_CLASSES, Gemma2DecoderLayer, GEMMA2_START_DOCSTRING
     57 from transformers.models.gemma2.modeling_gemma2 import GEMMA2_INPUTS_DOCSTRING
     59 if is_flash_attn_2_available():

ImportError: cannot import name 'GEMMA2_ATTENTION_CLASSES' from 'transformers.models.gemma2.modeling_gemma2' (d:\conda\win\envs\prod\Lib\site-packages\transformers\models\gemma2\modeling_gemma2.py)


Nimtaa commented Jan 17, 2025

Are there any updated plans or timelines for upgrading to the latest version of transformers?
There are security updates in the latest release that we really need to align with: huggingface/transformers#34840

@mxchinegod
Contributor

#1356

Adding to the urgency around this. It would be great to have this merged.


jesnie commented Feb 7, 2025

Yes, can we please have some action on this? There is even a pull request for it; it just needs to be merged and released.

@hanhainebula
Collaborator

Hello, everyone! We've just released version 1.3.4 on PyPI. This update fixes the dependency issue discussed here. Thanks to @Hypothesis-Z for submitting PR #1343!
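
For anyone landing here later, a quick way to verify the fix after upgrading (pip install -U FlagEmbedding==1.3.4) is that the import which used to crash now succeeds alongside a newer transformers:

from importlib.metadata import version

# This import raised the ImportError above on earlier FlagEmbedding
# releases with transformers>=4.47.0; on 1.3.4 it should succeed.
from FlagEmbedding import BGEM3FlagModel

print("FlagEmbedding", version("FlagEmbedding"))
print("transformers", version("transformers"))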
