
Transformers 4.35.0 support #4

Open
lun-4 opened this issue Nov 3, 2023 · 1 comment

lun-4 commented Nov 3, 2023

I had the misfortune of following the instructions 5 hours after the release of transformers v4.35. The instructions say to upgrade to the latest release, so I got the following error:

$ python -m llava.serve.controller --host 0.0.0.0 --port 10000
[2023-11-02 21:08:06,589] [INFO] [real_accelerator.py:110:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "[...]/miniconda3/envs/obsidian/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "[...]/miniconda3/envs/obsidian/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "[...]/Obsidian/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "[...]/Obsidian/llava/model/__init__.py", line 3, in <module>
    from .language_model.llava_mpt import LlavaMPTForCausalLM, LlavaMPTConfig
  File "[...]/Obsidian/llava/model/language_model/llava_mpt.py", line 26, in <module>
    from .mpt.modeling_mpt import MPTConfig, MPTForCausalLM, MPTModel
  File "[...]/Obsidian/llava/model/language_model/mpt/modeling_mpt.py", line 19, in <module>
    from .hf_prefixlm_converter import add_bidirectional_mask_if_missing, convert_hf_causal_lm_to_prefix_lm
  File "[...]/Obsidian/llava/model/language_model/mpt/hf_prefixlm_converter.py", line 15, in <module>
    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' ([...]/miniconda3/envs/obsidian/lib/python3.10/site-packages/transformers/models/bloom/modeling_bloom.py)

As an immediate workaround, downgrading to v4.34.0 (pip install --upgrade transformers==4.34.0) works.
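
For anyone who wants a clearer failure than the deep traceback above, here is a minimal sketch of a guarded import; the error message wording is mine, and it deliberately does not claim a drop-in replacement exists, since the 4.35 refactor changed both the location and the signatures of these private helpers:

try:
    # Works on transformers<=4.34.x, where the private helper still exists.
    from transformers.models.bloom.modeling_bloom import _expand_mask as _expand_mask_bloom
except ImportError as exc:
    # transformers>=4.35.0 removed _expand_mask in its attention-mask
    # refactor; surface the pin from this issue instead of the raw failure.
    raise ImportError(
        "transformers>=4.35.0 no longer exposes "
        "transformers.models.bloom.modeling_bloom._expand_mask; "
        "downgrade with `pip install --upgrade transformers==4.34.0`."
    ) from exc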

qnguyen3 (Contributor) commented Nov 3, 2023

Thank you! I will look into it. For now, using 4.34.0 seems to be the best solution.
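
A sketch of a guard the package could carry while the port is in progress, assuming an import-time check and a hard ceiling at 4.35.0 (the version this thread identifies) are acceptable; the message text is an assumption:

import transformers
from packaging import version  # packaging is already a transformers dependency

# Fail fast with the workaround from this thread instead of a deep
# ImportError inside the vendored MPT code; lift the ceiling once
# hf_prefixlm_converter.py stops importing the removed private helpers.
if version.parse(transformers.__version__) >= version.parse("4.35.0"):
    raise RuntimeError(
        "transformers>=4.35.0 removed the private _expand_mask helpers this "
        "code vendors; run `pip install --upgrade transformers==4.34.0`."
    )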
