RuntimeError: Error(s) in loading state_dict for DistributedDataParallel #7
Could you provide more details about the version of your 'transformers' package? We recommend running SegVol with 'transformers==4.18.0'.
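The version pin above can be checked at runtime with a small sketch. The `installed` value below is a stand-in for illustration; in a real environment you would read `transformers.__version__`.

```python
# Hedged sketch: compare the installed transformers version with the one
# the maintainers recommend (4.18.0). "installed" is a hypothetical value;
# replace it with transformers.__version__ in practice.

REQUIRED = "4.18.0"

def version_tuple(v: str) -> tuple:
    """Parse 'major.minor.patch' into a comparable tuple of integers."""
    return tuple(int(p) for p in v.split(".")[:3])

installed = "4.30.2"  # hypothetical too-new install, for illustration only
if version_tuple(installed) != version_tuple(REQUIRED):
    print(f"transformers {installed} found; SegVol expects {REQUIRED}")
```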
Thank you very much for your reply. I resolved the issue following your suggestion, and I also found that passing strict=False to load_state_dict works around it.
I strongly recommend not using 'strict=False' in load_state_dict to load parameters, as it can leave some parameters randomly initialized.
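A safer workaround than `strict=False` is to drop only the key the current model no longer registers. Newer transformers releases stopped keeping `...embeddings.position_ids` in the state dict, so a checkpoint saved under one version can carry a key the other version does not expect. The helper name and suffix below are illustrative, not part of SegVol; this is a minimal sketch using plain dicts.

```python
# Hedged sketch: remove only the unexpected key, then load with strict=True
# so every other mismatch still raises an error.

def filter_state_dict(state_dict, drop_suffixes=("embeddings.position_ids",)):
    """Return a copy of state_dict without keys ending in any drop suffix."""
    return {k: v for k, v in state_dict.items()
            if not k.endswith(drop_suffixes)}

# Hypothetical usage with PyTorch (names are assumptions):
# ckpt = torch.load("segvol_checkpoint.pth", map_location="cpu")
# model.load_state_dict(filter_state_dict(ckpt["model"]), strict=True)
```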
Have you fixed the bug yet? I'm sorry I can't reproduce it; I don't know whether anyone else has hit a similar situation.🤦
Hello, I'm trying to reproduce the demo and hit the same error. I'm on Windows with cuda==12.2, pytorch==2.0.1, and monai==1.3.1 (0.9.0 failed to install); everything else matches the recommended versions. For now I'm also working around the issue with strict=False.
Please make sure 'transformers==4.18.0' and that what you are loading is
Thank you very much for your great open-source work. I encountered the following problem when training the model with the training procedure, dataset, and weights you provided:
```
RuntimeError: Error(s) in loading state_dict for DistributedDataParallel:
    Unexpected key(s) in state_dict: "module.text_encoder.clip_text_model.text_model.embeddings.position_ids".
```
I sincerely hope to receive your reply.
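Before deciding how to load, it can help to list which checkpoint keys the model does not expect and which model keys the checkpoint is missing, mirroring the categories in PyTorch's error message above. `diff_keys` is an illustrative helper, not part of SegVol.

```python
# Hedged sketch: compute (unexpected, missing) key lists from two key sets,
# the same split PyTorch reports in its load_state_dict error.

def diff_keys(checkpoint_keys, model_keys):
    """Return (unexpected, missing) sorted key lists."""
    ckpt, model = set(checkpoint_keys), set(model_keys)
    return sorted(ckpt - model), sorted(model - ckpt)

# Hypothetical usage (names are assumptions):
# unexpected, missing = diff_keys(ckpt["model"].keys(),
#                                 model.state_dict().keys())
```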