Hi,
I am loading AlephBERT with the "transformers" package via AllenNLP, using the following jsonnet definition:
"token_embedders": {
"bert": {
"type": "pretrained_transformer",
"model_name": "onlplab/alephbert-base",
"eval_mode": true,
}
}
The "eval_mode" flag indicates that I am not going to train the model; I am only using it to compute embeddings.
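For readers unfamiliar with the flag: a minimal sketch of what eval_mode=true amounts to under the hood, illustrated on a tiny randomly initialized BERT rather than the real checkpoint (an assumption for self-containedness; the actual AllenNLP embedder code differs in detail):

```python
import torch
from transformers import BertConfig, BertModel

# Tiny stand-in model; the real setup loads "onlplab/alephbert-base"
# from the Hugging Face hub instead.
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertModel(config)

# eval_mode=true keeps the module in evaluation mode while embeddings are
# computed, i.e. dropout and other training-time behaviour are disabled.
model.eval()
print(model.training)  # False
```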
When running, the module that loads the model (transformers.modeling_utils) emits the following message:
Some weights of BertModel were not initialized from the model checkpoint at onlplab/alephbert-base and are newly initialized: ['bert.pooler.dense.weight', 'bert.pooler.dense.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Do these weights affect the computed embeddings? If so, how can I fix it?
Thanks,
Yuval
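For context, the pooler weights named in the warning belong to a layer that per-token embeddings never pass through: the pooler is only applied afterwards to the [CLS] position. A minimal sketch, again using a tiny randomly initialized BERT as a stand-in for the real checkpoint:

```python
import torch
from transformers import BertConfig, BertModel

# Tiny stand-in config (assumption: the real code loads
# "onlplab/alephbert-base" from the hub).
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)

# Dropping the pooler entirely also silences the warning, since the model
# then has no pooler weights left to initialize.
model = BertModel(config, add_pooling_layer=False)
model.eval()

input_ids = torch.tensor([[1, 2, 3, 4]])
with torch.no_grad():
    out = model(input_ids)

# last_hidden_state holds the per-token embeddings; the randomly
# initialized pooler plays no part in producing them.
print(tuple(out.last_hidden_state.shape))  # (1, 4, 32)
```

In other words, embeddings taken from last_hidden_state are unaffected by the warning; it matters only if you rely on pooler_output.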