When I run RoBERTa/main.py to train the model and use RoBERTa for question embedding, the values in `roberta_last_hidden_states` are all NaN after running the following code. Why is this?
roberta_last_hidden_states = self.roberta_model(question_tokenized, attention_mask=attention_mask)[0]
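For context, a minimal sketch of the kind of NaN check that can localize this: it tests whether a hidden-state tensor contains any NaNs, which is a common first step before investigating causes such as a too-high learning rate, fp16 overflow, or corrupted checkpoint weights. The helper name `check_nans` and the tensor shapes are illustrative, not from the original project.

```python
import torch


def check_nans(t: torch.Tensor, name: str = "tensor") -> bool:
    """Return True if the tensor contains any NaN values, printing a note when it does."""
    has_nan = bool(torch.isnan(t).any().item())
    if has_nan:
        print(f"{name}: NaN detected")
    return has_nan


# Simulated example: a (batch, seq_len, hidden) tensor like RoBERTa's last hidden state,
# with one value deliberately corrupted to NaN.
hidden = torch.randn(2, 5, 768)
hidden[0, 0, 0] = float("nan")
print(check_nans(hidden, "roberta_last_hidden_states"))  # True
```

Calling this right after the `self.roberta_model(...)` line, and also on the model's input embeddings, can show whether the NaNs originate inside the encoder or were already present in its inputs.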