This is a poem generator trained with Gated Recurrent Units (GRU) on a corpus of poems including those in:
- Poems Every Child Should Know, by Mary E. Burt
- Poems, by Thomas Hall Shastid
- Poems of Progress and New Thought Pastels, by Ella Wheeler Wilcox
- Poems Teachers Ask For
- Poems Teachers Ask For, Book Two
- The Pied Piper of Hamelin, and Other Poems, by Robert Browning
Most of the code comes from TensorFlow's text generation tutorial.
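As a rough illustration of the character-level setup used in that tutorial, the corpus is mapped to integer ids and cut into (input, target) pairs where the target is the input shifted one character forward. This is only a sketch; `corpus` is a placeholder string, and the real pipeline feeds these pairs to the GRU model.

```python
# A minimal sketch of character-level data preparation, as in the
# TensorFlow text generation tutorial. `corpus` is a placeholder.
corpus = "The rose is red, the violet's blue,\nThe honey's sweet, and so are you."

# Build the character vocabulary and the lookup tables in both directions.
vocab = sorted(set(corpus))
char2id = {c: i for i, c in enumerate(vocab)}
id2char = {i: c for c, i in char2id.items()}

# Encode the text, then slice it into fixed-length training examples:
# the target sequence is the input sequence shifted one step right.
ids = [char2id[c] for c in corpus]
seq_len = 16
examples = [
    (ids[i : i + seq_len], ids[i + 1 : i + seq_len + 1])
    for i in range(len(ids) - seq_len)
]

# At every position the model is trained to predict the next character.
x, y = examples[0]
```

The GRU itself then consumes batches of these pairs; sampling from the trained model one character at a time yields new text.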
Please refer to the notebook for an overview of the results.
To do:
- Clean the corpus
- Find more data
- Save a model
- Try LSTM
- Train a model based on sequences of words instead of characters
- Create a rhyme generator
- Train a model backward: from the last word to the first
The idea is to generate the rhyme words first, then write each line from its last word back to its first. This ensures that the generated poems rhyme.
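The backward scheme above can be sketched in a few lines. Everything here is illustrative: `naive_rhyme` is a crude suffix-matching stand-in for a real rhyme generator, and the "model output" is a hard-coded placeholder rather than a trained network.

```python
# A minimal sketch of training/generating right-to-left (all names are
# illustrative; no trained model is involved).

def reverse_words(line):
    # Training lines are reversed word by word, so a model trained on
    # them learns to predict text from the last word toward the first.
    return " ".join(reversed(line.split()))

def naive_rhyme(word, candidates, suffix_len=2):
    # Hypothetical stand-in for a real rhyme generator: pick a word
    # sharing its last few letters (a crude approximation of rhyming).
    word = word.lower().strip(".,;!?")
    for c in candidates:
        if c != word and c.endswith(word[-suffix_len:]):
            return c
    return word

line = "the rose is red"
rhyme = naive_rhyme("red", ["blue", "bed", "sweet"])  # -> "bed"
# The backward model would be prompted with the rhyme word and asked to
# complete the line right-to-left; un-reversing its output gives:
generated_backward = "bed my in stay I"  # placeholder model output
print(reverse_words(generated_backward))  # -> "I stay in my bed"
```

Training the same character or word model on word-reversed lines is the only change needed; generation then starts from the rhyme word instead of ending on it.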
- Generate other kinds of text (haiku, magazines, articles, books...)