This is a course project for a natural language processing class. It implements RNN-based encoder-decoder models with attention (including a self-attention encoder) to translate Chinese and Vietnamese to English.
ch2en_gru.ipynb: GRU encoder with a GRU-based Luong-attention/no-attention decoder, for Chinese-to-English translation.
ch2en_lstm.ipynb: LSTM encoder with an LSTM-based Luong-attention/no-attention decoder, for Chinese-to-English translation.
vi2en.ipynb: LSTM/GRU encoder with an LSTM-/GRU-based Luong-attention/no-attention decoder, for Vietnamese-to-English translation.
vi2en_self.ipynb: self-attention encoder with a GRU-based decoder, for Vietnamese-to-English translation.
ch2en_self.ipynb: self-attention encoder with a GRU-based decoder, for Chinese-to-English translation.
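The Luong-attention decoders listed above all share the same scoring-and-context step. As a framework-agnostic sketch of the "general" score variant for a single sentence (the notebooks' actual tensors are batched; all names here are illustrative, not taken from the notebooks):

```python
import numpy as np

def luong_general_attention(dec_hidden, enc_outputs, W, mask=None):
    """Luong 'general' attention: score(h_t, h_s) = h_t . (W h_s).

    dec_hidden:  (hidden,)         current decoder state
    enc_outputs: (src_len, hidden) encoder states for one source sentence
    W:           (hidden, hidden)  learned weight matrix
    mask:        (src_len,) bool   True for real tokens, False for padding
    """
    scores = enc_outputs @ W @ dec_hidden          # (src_len,) alignment scores
    if mask is not None:
        scores = np.where(mask, scores, -np.inf)   # padded positions get zero weight
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over source positions
    context = weights @ enc_outputs                # (hidden,) weighted sum of states
    return context, weights
```

The context vector is then concatenated with the decoder state before the output projection; the no-attention variants simply skip this step.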
✅ Add dataloader
✅ Trainable unknown-word representation
✅ Mask
✅ Minibatch
✅ BLEU score
✅ Save model
✅ Self-attention
✅ Multiple layers in encoder and decoder
✅ Without Attention
✅ LSTM
✅ Beam Search
✅ Character-level Chinese
✅ Phrase-level Chinese
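Of the features checked off above, beam search is the one that is easiest to summarize independently of the model. A minimal framework-free sketch, where `step_fn`, the token names, and the default beam width are illustrative assumptions rather than the notebooks' actual code:

```python
import math

def beam_search(step_fn, start_token, eos_token, beam_width=3, max_len=10):
    """Generic beam search over token sequences.

    step_fn(prefix) -> {token: prob} gives next-token probabilities
    for a prefix; it stands in for a decoder forward step.
    """
    beams = [([start_token], 0.0)]        # (sequence, cumulative log-prob)
    completed = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == eos_token:      # finished hypotheses stop growing
                completed.append((seq, score))
                continue
            for tok, p in step_fn(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        if not candidates:
            break
        # keep only the beam_width highest-scoring hypotheses
        beams = sorted(candidates, key=lambda x: x[1], reverse=True)[:beam_width]
    completed.extend(b for b in beams if b[0][-1] == eos_token)
    best = max(completed or beams, key=lambda x: x[1])
    return best[0]
```

With beam_width=1 this reduces to greedy decoding; in the notebooks the probabilities would come from the decoder's softmax at each step.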
- Convert .ipynb notebooks to .py scripts
- Transformers