Hello,
Thank you for sharing your code.
Did you fine-tune the BERT-like model during training?
If I understood correctly, you don't use the tags when using the BERT embeddings — did you try using both?
Also, two minor changes are needed for char_lstm.py to work:
On line 42, replace n_embed with n_word_embed, and return None together with embed so the training code doesn't need to change.
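If helpful, here is a minimal sketch of what I mean — the class and method names are assumptions for illustration, only n_embed/n_word_embed and the return value come from the actual file:

```python
# Sketch of the suggested fix (surrounding code is assumed, not copied
# from the repo's char_lstm.py).

class CharLSTM:
    def __init__(self, n_word_embed=100):
        # Before the fix this referenced `n_embed`, which is undefined
        # here; renaming it to `n_word_embed` matches the constructor arg.
        self.n_word_embed = n_word_embed

    def forward(self, chars):
        # Placeholder for the real character-LSTM embedding computation.
        embed = [0.0] * self.n_word_embed
        # Returning (None, embed) instead of just `embed` keeps the tuple
        # shape the training loop expects, so no caller changes are needed.
        return None, embed

model = CharLSTM(n_word_embed=4)
tags, embed = model.forward("word")
```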
Thank you in advance.
I ran into the same issue while trying to train.
I applied the fix at line 42 of char_lstm.py, changing the variable from n_embed to n_word_embed, but I didn't understand what "return None with embed" means. Can anyone elaborate a little more on it?