
issues with LaMed-Phi3 inference #33

Open
lzl199704 opened this issue Nov 12, 2024 · 4 comments

Comments

@lzl199704

[Screenshot 2024-11-11 203559 attached] Cannot use the demo code for inference.

transformers==4.46.2
torch==2.4.0

@baifanxxx
Collaborator

Hi,

Thank you for your attention. Please try a lower version of transformers.

@YongchengYAO

transformers==4.45 works for me
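For reference, a pin matching the versions reported working in this thread could look like the fragment below (the exact 4.45.x patch level is an assumption; only the 4.45 minor version is confirmed above):

```text
# requirements.txt (hypothetical pin based on versions reported in this thread)
transformers==4.45.2
torch==2.4.0
```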

@Coisini-Glenda

This problem occurred during the eval_caption test:

```
RuntimeError: Error(s) in loading state_dict for Phi3ForCausalLM:
	size mismatch for model.embed_tokens.weight: copying a param with shape torch.Size([32015, 3072]) from checkpoint, the shape in current model is torch.Size([32064, 3072]).
	size mismatch for lm_head.weight: copying a param with shape torch.Size([32015, 3072]) from checkpoint, the shape in current model is torch.Size([32064, 3072]).
```
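Both mismatches point at a vocabulary-size disagreement: the checkpoint's embedding table and output head have 32015 rows, while a freshly initialized Phi-3 model allocates 32064. A minimal pure-Python sketch of the strict shape check that `load_state_dict` performs (`can_load` is a hypothetical helper for illustration, not M3D or PyTorch code):

```python
# Why loading fails: the checkpoint's embedding has fewer rows than
# the current model's vocabulary expects.
checkpoint_shape = (32015, 3072)   # shapes from the error message above
model_shape = (32064, 3072)        # Phi-3 default vocab size x hidden size

def can_load(src_shape, dst_shape):
    """Mimic load_state_dict's strict per-tensor shape comparison."""
    return src_shape == dst_shape

print(can_load(checkpoint_shape, model_shape))  # False: 32015 != 32064
print(model_shape[0] - checkpoint_shape[0])     # 49 rows short
```

If the checkpoint really was trained with a 32015-token vocabulary, calling `model.resize_token_embeddings(32015)` before loading the weights is the usual way to reconcile the shapes, though whether that is the intended fix for this checkpoint is an assumption on my part.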

@zhouhoo

zhouhoo commented Dec 23, 2024

> transformers==4.45 works for me

I got an error as follows:

```
Token indices sequence length is longer than the specified maximum sequence length for this model (2963 > 512). Running this sequence through the model will result in indexing errors
  0%|          | 0/64 [00:02<?, ?it/s]
Traceback (most recent call last):
  File "/root/M3D/Bench/eval/eval_vqa.py", line 162, in <module>
    main()
  File "/root/M3D/Bench/eval/eval_vqa.py", line 114, in main
    generation = model.generate(image, input_id, max_new_tokens=args.max_new_tokens,
  File "/root/miniconda3/envs/py310/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/root/miniconda3/envs/py310/lib/python3.10/site-packages/transformers/generation/utils.py", line 1804, in generate
    generation_config, model_kwargs = self._prepare_generation_config(generation_config, **kwargs)
  File "/root/miniconda3/envs/py310/lib/python3.10/site-packages/transformers/generation/utils.py", line 1355, in _prepare_generation_config
    model_kwargs = generation_config.update(**kwargs)
AttributeError: 'Tensor' object has no attribute 'update'
```

Is this a version problem?
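The traceback suggests it may well be: in transformers, `GenerationMixin.generate`'s second positional parameter is `generation_config`, so when a call like `model.generate(image, input_id, ...)` falls through to the base implementation, the input-id tensor gets bound to `generation_config`, and `generation_config.update(**kwargs)` then fails. A pure-Python mimic of that mis-binding (the `FakeTensor` stub and simplified `generate` signature are illustrative, not the real transformers code):

```python
class FakeTensor:
    """Stand-in for a torch.Tensor, which has no .update method."""

def generate(inputs=None, generation_config=None, **kwargs):
    # Mirrors the shape of transformers' _prepare_generation_config step.
    if generation_config is not None:
        generation_config.update(**kwargs)  # AttributeError if a tensor landed here
    return "ok"

image, input_id = FakeTensor(), FakeTensor()

# Two positional tensors: the second one is mis-bound to generation_config.
try:
    generate(image, input_id, max_new_tokens=256)
except AttributeError as e:
    print(e)  # 'FakeTensor' object has no attribute 'update'

# Passing only the intended input positionally avoids the mis-binding.
print(generate(image, max_new_tokens=256))  # ok
```

So pinning transformers to a version the M3D code was written against (4.45 per this thread), or ensuring the model's own multimodal `generate(image, input_id, ...)` override is actually the method being called, should resolve it.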
