A character- and word-level natural language generation (NLG) tutorial in Python, using several types of neural networks (LSTM, deep LSTM, bidirectional LSTM).
The code examples include the following:
1. Character- and word-level NLG models
2. Early stopping during training to prevent overfitting
3. Saving of the best-performing models during training
4. Per-epoch metrics, such as accuracy and loss, written to a CSV file for later analysis
5. An image of the neural network model
6. A summary of the complete test run (e.g., number of epochs, total run time, details about the model)
7. Separate code that loads the best-performing model and generates text. This saves time, since training a good NLG model can take many hours. You may want to run the best model several times to get different, and often amusing, results.
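One reason repeated runs of the same trained model produce different text is that generation typically samples the next character from the model's predicted probability distribution rather than always taking the most likely one. A minimal sketch of temperature-based sampling (the function name and temperature value are illustrative assumptions, not taken from this tutorial):

```python
import numpy as np

def sample_next_index(probs, temperature=1.0):
    """Sample an index from a probability distribution, reshaped by temperature.

    Lower temperatures make the choice greedier; higher temperatures add
    variety, which is why repeated generation runs differ from each other.
    """
    probs = np.asarray(probs, dtype=np.float64)
    # Rescale log-probabilities by temperature, then renormalize via softmax
    logits = np.log(probs + 1e-12) / temperature
    exp = np.exp(logits - np.max(logits))
    probs = exp / np.sum(exp)
    return int(np.random.choice(len(probs), p=probs))

# Example: a toy distribution over 4 characters
idx = sample_next_index([0.1, 0.6, 0.2, 0.1], temperature=0.8)
```

During generation, this sampled index is mapped back to a character and appended to the seed text before predicting the next step.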
Note that items 4, 5, and 6 above (metric logging, the model image, and run information) can also be viewed in TensorBoard.
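Early stopping, best-model checkpointing, and CSV metric logging are typically wired up as Keras callbacks passed to `model.fit`. A minimal sketch (the file names, `patience` value, and log directory are illustrative assumptions, not taken from this tutorial):

```python
from tensorflow.keras.callbacks import (
    EarlyStopping, ModelCheckpoint, CSVLogger, TensorBoard,
)

callbacks = [
    # Stop training when validation loss stops improving
    EarlyStopping(monitor="val_loss", patience=5, restore_best_weights=True),
    # Keep only the best-performing model seen so far
    ModelCheckpoint("best_model.h5", monitor="val_loss", save_best_only=True),
    # Write per-epoch metrics (loss, accuracy, ...) to a CSV file
    CSVLogger("training_log.csv", append=True),
    # Optionally log the same information for viewing in TensorBoard
    TensorBoard(log_dir="./logs"),
]

# model.fit(x, y, epochs=100, validation_split=0.2, callbacks=callbacks)
```

The saved `best_model.h5` is what the separate generation script would later reload, so a long training run never has to be repeated just to generate more text.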
Below are a few sample outputs from the simple character-based NLG code: