The README has a link to https://github.com/huggingface/pytorch-pretrained-BERT, but this redirects to https://github.com/huggingface/transformers, and I think the former is deprecated. The pytorch-pretrained-bert package still exists on PyPI (link), but I installed transformers instead. Now I'm getting ModuleNotFoundError: No module named 'pytorch_pretrained_bert'. In line 12 of disambiguate/python/getalp/wsd/modules/embeddings/embeddings_bert.py (lines 8 to 14 at d6c0e75), I simply replaced pytorch_pretrained_bert with transformers.
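If it helps anyone else hitting the same ModuleNotFoundError: the rename is only at the package level, and the class names carried over. A minimal sketch (the try/except fallback is my own suggestion, not something from either project's docs):

```python
# Old, deprecated package:
#     from pytorch_pretrained_bert import BertModel
# New package (same class name):
#     from transformers import BertModel

# A tolerant import that prefers the new package but falls back to the
# legacy one if that is all that's installed:
try:
    from transformers import BertModel
except ImportError:
    try:
        from pytorch_pretrained_bert import BertModel  # legacy fallback
    except ImportError:
        BertModel = None  # neither package available in this environment
```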
This gets me a little further, but then I see this (partial traceback):
```
  File "/home/mwg/wsd/disambiguate/python/getalp/wsd/modules/embeddings/embeddings_bert.py", line 69, in forward
    inputs, _ = self.bert_embeddings(inputs, attention_mask=pad_mask, output_all_encoded_layers=False)
  File "/home/mwg/wsd/disambiguate/py36/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
TypeError: forward() got an unexpected keyword argument 'output_all_encoded_layers'
```
I looked at https://github.com/huggingface/transformers#Migrating-from-pytorch-pretrained-bert-to-transformers but did not see anything about output_all_encoded_layers. I then saw huggingface/transformers#3541, changed line 13 above to pass the parameter output_hidden_states=False, and removed the output_all_encoded_layers=False parameter from line 69 (at d6c0e75). After this I was able to get some output. Can you confirm whether these changes are sufficient? If so, I can put together a PR for the fix.
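To make the API change concrete, here is a small stand-alone sketch of the two calling conventions (stub classes I wrote for illustration, not the real libraries; I'm only modelling the signatures and return shapes involved in the TypeError above):

```python
class OldBertStub:
    """Mimics pytorch-pretrained-bert: a per-call flag controls the output."""
    def forward(self, inputs, attention_mask=None, output_all_encoded_layers=True):
        last_layer = ("hidden", inputs)
        encoded = [last_layer] * 12 if output_all_encoded_layers else last_layer
        return encoded, "pooled"

class NewBertStub:
    """Mimics transformers: the flag lives on the model, not on forward()."""
    def __init__(self, output_hidden_states=False):
        self.output_hidden_states = output_hidden_states
    def forward(self, inputs, attention_mask=None):
        outputs = (("hidden", inputs), "pooled")
        if self.output_hidden_states:
            outputs += (["layer0", "layer1"],)  # all hidden states appended
        return outputs

# Old style: the flag is passed on every call.
old = OldBertStub()
hidden_old, _ = old.forward("x", attention_mask=None, output_all_encoded_layers=False)

# New style: the flag is set at construction; forward() no longer accepts it,
# which is exactly what produced the TypeError above. Note also that if
# output_hidden_states were True, the returned tuple would grow an extra
# element, so the two-way unpacking only works with it set to False.
new = NewBertStub(output_hidden_states=False)
hidden_new, _ = new.forward("x", attention_mask=None)

assert hidden_old == hidden_new == ("hidden", "x")
```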
Thank you for pointing that out!
Everything looks good, but I will have to make sure that it doesn't break anything after the change, at least by reproducing the main results of our paper. I will do it when I have time!