Transformers.jl

Julia implementation of NLP models based on Google's Transformer architecture, built with Flux.jl. For model usage, see the example folder.
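As background, the core operation these models are built on, scaled dot-product attention, can be sketched in plain Julia. This sketch is independent of the package's own API; `softmax_cols` and `attention` are illustrative names, not exports of Transformers.jl:

```julia
using LinearAlgebra

# Column-wise softmax: each column becomes a probability distribution.
function softmax_cols(x)
    e = exp.(x .- maximum(x, dims=1))
    e ./ sum(e, dims=1)
end

# Q, K, V are (d_k × seq_len) matrices, one column per token.
function attention(Q, K, V)
    d_k = size(K, 1)
    scores = (K' * Q) ./ sqrt(d_k)  # (seq_len × seq_len) attention logits
    V * softmax_cols(scores)        # weighted sum of value vectors
end

Q = randn(8, 4); K = randn(8, 4); V = randn(8, 4)
size(attention(Q, K, V))  # (8, 4): same shape as the inputs
```

Each output column is a convex combination of the value columns, weighted by how strongly the corresponding query attends to each key.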

Table of Contents

  1. Transformers.jl
  2. Installation
  3. Implemented Models
  4. Usage
  5. Issue
  6. Roadmap

Installation

In the Julia REPL:

```julia
]add Transformers

# Currently the dataset code needs HTTP#master to download WMT
]add HTTP#master
```

For GPU support, install and build:

```julia
]add CuArrays
]build
```

Then load CuArrays before Transformers:

```julia
julia> using CuArrays

julia> using Transformers

# run the model below
```

Implemented Models

Usage

Coming soon!

Issue

The code currently needs refactoring, tests, and documentation.

Roadmap

  • write docs
  • write tests
  • refactor code
  • [50%] better embedding functions
    • gather function forward
    • gather function backward (might be improved)
    • OneHotArray
    • more util functions
    • easy GPU data handling
    • remove Vocabulary
  • lazy CuArrays loading
  • use HTTP to handle dataset downloads (needs HTTP.jl update)
  • text-related util functions
  • optimize performance
  • better dataset API
  • more datasets
  • [75%] OpenAI GPT model
    • model implementation
    • loading pretrained weights
    • model example
    • more util functions
  • OpenAI GPT-2 model
  • Google BERT model
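The "gather function" item above refers to embedding lookup: selecting rows or columns of an embedding matrix by token index. A minimal sketch in plain Julia (the name `gather` here is illustrative, not the package's final API):

```julia
# Pick one embedding column per token index; the forward pass of an
# embedding layer is exactly this indexed gather.
function gather(emb::AbstractMatrix, indices::AbstractVector{<:Integer})
    emb[:, indices]
end

emb = [1.0 2.0 3.0; 4.0 5.0 6.0]  # 2-dim embeddings for a 3-word vocabulary
gather(emb, [3, 1, 1])            # → [3.0 1.0 1.0; 6.0 4.0 4.0]
```

The backward pass then scatter-adds gradients back into the indexed columns, which is why a dedicated gradient rule (rather than generic indexing) can be faster, especially on the GPU.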
