
NLP course-based project: focus on translation of Chinese to English and Vietnamese to English.


gzmkobe/Neural-Translation-Machine


This is a natural language processing course-based project. The goal is to design RNN-based encoder-decoder models with attention mechanisms (Luong attention and self-attention) to translate Chinese and Vietnamese into English.

 ch2en_gru.ipynb: GRU encoder with a GRU-based Luong-attention/no-attention decoder for Chinese-to-English translation (a minimal sketch of this setup follows the list).
 ch2en_lstm.ipynb: LSTM encoder with an LSTM-based Luong-attention/no-attention decoder for Chinese-to-English translation.
 vi2en.ipynb: LSTM/GRU encoder with an LSTM-/GRU-based Luong-attention/no-attention decoder for Vietnamese-to-English translation.
 vi2en_self.ipynb: Self-attention encoder with a GRU-based decoder for Vietnamese-to-English translation.
 ch2en_self.ipynb: Self-attention encoder with a GRU-based decoder for Chinese-to-English translation.
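
The GRU-encoder / Luong-attention-decoder setup used in these notebooks can be summarized with a short sketch. This is a minimal illustration assuming PyTorch and Luong's "general" (bilinear) score; the class names, dimensions, and padding index below are illustrative and are not the notebooks' actual code.

```python
# Minimal sketch of a GRU encoder with a Luong ("general") attention decoder,
# assuming PyTorch. Names and hyperparameters are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                       # src: (batch, src_len)
        out, h = self.gru(self.emb(src))          # out: (batch, src_len, hid)
        return out, h                             # h: (1, batch, hid)

class LuongAttention(nn.Module):
    """score(h_t, h_s) = h_t^T W h_s  (Luong's 'general' form)."""
    def __init__(self, hid_dim):
        super().__init__()
        self.W = nn.Linear(hid_dim, hid_dim, bias=False)

    def forward(self, dec_h, enc_out, src_mask):
        # dec_h: (batch, hid); enc_out: (batch, src_len, hid)
        # src_mask: (batch, src_len) bool, True for real (non-padding) tokens
        scores = torch.bmm(self.W(enc_out), dec_h.unsqueeze(2)).squeeze(2)
        scores = scores.masked_fill(~src_mask, float('-inf'))  # ignore padding
        weights = F.softmax(scores, dim=1)                      # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), enc_out).squeeze(1)
        return context, weights

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.attn = LuongAttention(hid_dim)
        self.out = nn.Linear(hid_dim * 2, vocab_size)

    def step(self, prev_tok, h, enc_out, src_mask):
        # prev_tok: (batch,) ids of the previous target tokens
        emb = self.emb(prev_tok).unsqueeze(1)         # (batch, 1, emb)
        dec_out, h = self.gru(emb, h)                 # dec_out: (batch, 1, hid)
        context, _ = self.attn(dec_out.squeeze(1), enc_out, src_mask)
        logits = self.out(torch.cat([dec_out.squeeze(1), context], dim=1))
        return logits, h                              # logits: (batch, vocab)
```

The no-attention variants simply drop the context vector and predict from the decoder GRU output alone; the LSTM notebooks swap nn.GRU for nn.LSTM.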

Completed

✅ Add dataloader

✅ Train unknown-word (UNK) representation

✅ Mask

✅ Minibatch

✅ BLEU score

✅ Save model

✅ Self-attention

✅ Multiple layers in encoder and decoder

✅ Without Attention

✅ LSTM

✅ Beam search (a minimal decoding sketch follows this checklist)

✅ Character-level Chinese

✅ Phrase-level Chinese
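
Beam search, checked off above, keeps the k highest-scoring partial translations at each decoding step instead of committing to the single greedy best token. Below is a minimal sketch under the assumption of a decoder step interface like the one sketched earlier; the function name, token ids, and length normalization are illustrative choices, not the notebooks' actual implementation.

```python
# Minimal beam-search decoding sketch, assuming PyTorch and a Decoder.step()
# like the one sketched earlier. All names and ids below are illustrative.
import torch
import torch.nn.functional as F

def beam_search(decoder, enc_out, src_mask, h0, sos_id=1, eos_id=2,
                beam_size=5, max_len=50):
    """Decode one sentence (batch size 1) with beam search."""
    # Each hypothesis: (total_log_prob, token_list, decoder_hidden_state)
    beams = [(0.0, [sos_id], h0)]
    for _ in range(max_len):
        candidates = []
        for logp, toks, h in beams:
            if toks[-1] == eos_id:                 # finished: carry over unchanged
                candidates.append((logp, toks, h))
                continue
            prev = torch.tensor([toks[-1]], device=enc_out.device)
            logits, h_new = decoder.step(prev, h, enc_out, src_mask)
            log_probs = F.log_softmax(logits, dim=-1).squeeze(0)   # (vocab,)
            top_lp, top_ids = log_probs.topk(beam_size)
            for lp, tok in zip(top_lp.tolist(), top_ids.tolist()):
                candidates.append((logp + lp, toks + [tok], h_new))
        # Keep only the beam_size best hypotheses by total log-probability.
        candidates.sort(key=lambda c: c[0], reverse=True)
        beams = candidates[:beam_size]
        if all(toks[-1] == eos_id for _, toks, _ in beams):
            break
    # Return the best hypothesis, length-normalized (a simple common choice).
    best = max(beams, key=lambda c: c[0] / len(c[1]))
    return best[1]
```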

Future Work

  • Convert .ipynb notebooks to .py scripts

  • Transformers
