A Julia implementation of NLP models based on the Google Transformer architecture, built with Flux.jl.
To use the models, see the example folder.
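The core operation of the Transformer the package implements is scaled dot-product attention, softmax(QKᵀ/√dₖ)V. A minimal pure-Julia sketch of that computation (illustrative only, not the package's internal implementation; the `attention` name and matrix layout are assumptions):

```julia
using LinearAlgebra

# Scaled dot-product attention: softmax(Q*K' / sqrt(dk)) * V.
# Rows of Q are queries; rows of K/V are keys/values.
# Illustrative sketch only, not Transformers.jl internals.
function attention(Q::AbstractMatrix, K::AbstractMatrix, V::AbstractMatrix)
    dk = size(K, 2)                      # key dimension
    scores = Q * K' ./ sqrt(dk)          # (n_queries, n_keys) similarity scores
    # row-wise softmax, shifted by the row max for numerical stability
    weights = exp.(scores .- maximum(scores; dims=2))
    weights ./= sum(weights; dims=2)
    return weights * V                   # convex combination of value rows
end

Q = rand(4, 8); K = rand(6, 8); V = rand(6, 8)
A = attention(Q, K, V)                   # 4×8 output, one row per query
```

Each output row is a convex combination of the rows of `V`, weighted by how strongly the corresponding query matches each key.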
In the Julia REPL:
]add Transformers
# Currently the dataset needs HTTP#master to download WMT
]add HTTP#master
To use the GPU, install and build:
]add CuArrays
]build
julia> using CuArrays
julia> using Transformers
# run the model below
.
.
.
Coming soon!
The code currently needs refactoring, tests, and documentation.
- write docs
- write test
- refactor code
- better embedding functions [50%]
  - gather function forward
  - gather function backward (might be better)
  - OneHotArray
  - more util functions
- easy gpu data
- remove Vocabulary
- lazy CuArrays loading
- using HTTP to handle dataset download (need HTTP.jl update)
- optimize performance
- text related util functions
- better dataset API
- more datasets
- openai gpt model [75%]
  - model implementation
  - loading pretrain
  - model example
  - more util functions
- openai gpt-2 model
- google bert model
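The "gather function" items above refer to embedding lookup: selecting columns of an embedding matrix by token index on the forward pass, and scatter-adding gradients back on the backward pass. A pure-Julia sketch of both directions (an illustration of the idea under assumed names, not the package's actual implementation):

```julia
# Forward: gather columns of the embedding matrix `w` (dim × vocab)
# at integer indices, producing a dim × length(inds) matrix.
# Illustrative sketch only, not Transformers.jl internals.
gather(w::AbstractMatrix, inds::AbstractVector{<:Integer}) = w[:, inds]

# Backward: scatter-add the output gradient into a zero matrix shaped
# like `w`. Repeated indices accumulate, which is why a dedicated
# backward can beat differentiating naive indexing.
function ∇gather(Δ::AbstractMatrix, w::AbstractMatrix,
                 inds::AbstractVector{<:Integer})
    ∇w = zero(w)
    for (j, i) in enumerate(inds)
        @views ∇w[:, i] .+= Δ[:, j]   # column i receives gradient column j
    end
    return ∇w
end

w = [1.0 2.0 3.0; 4.0 5.0 6.0]   # dim = 2, vocab = 3
e = gather(w, [1, 3, 1])          # columns 1, 3, 1 of w
```

The same lookup can be expressed as multiplication by a one-hot matrix, which is what a OneHotArray type makes cheap: it stores only the indices while still behaving like a matrix.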