diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..abc8e5a
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,14 @@
+.mypy_cache
+__pycache__
+*.pyc
+.idea/
+
+results/
+kb/
+kb_embed*/
+
+slurm*
+*/model*.pt
+*/*.log
+*/*.sh
+!scripts/*.sh
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..c17a41f
--- /dev/null
+++ b/README.md
@@ -0,0 +1,209 @@
+# Latent Relation Language Models
+
+This repository contains the official PyTorch implementation of Latent Relation Language Models
+([arXiv](https://arxiv.org/abs/1908.07690)):
+
+> Hiroaki Hayashi\*, Zecong Hu\*, Chenyan Xiong, Graham Neubig
+> _Latent Relation Language Models_
+> The 34th AAAI Conference on Artificial Intelligence (AAAI 2020)
+
+![lrlm](docs/lrlm-fig.png)
+
+
+## Requirements
+
+- Python 3.6+
+- PyTorch 0.4+
+- Other packages:
+ - IPython
+ - tqdm
+ - tensorboardX
+ - [fastText@bc12859](https://github.com/facebookresearch/fastText/tree/bc1285939f1c216bd358425c3685a049dd8f56c0)
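+
+The fastText dependency is pinned to the specific commit linked above; the snippet below is a minimal sketch for
+installing its Python bindings from that commit (it assumes the project uses the Python bindings and that they build
+via `pip install .` at that commit):
+
+```bash
+# Clone fastText and check out the pinned commit.
+git clone https://github.com/facebookresearch/fastText.git
+cd fastText
+git checkout bc1285939f1c216bd358425c3685a049dd8f56c0
+
+# Build and install the Python bindings into the current environment.
+pip install .
+```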
+
+
+## Usage
+
+1. Clone this repository and install dependencies:
+ ```bash
+ git clone https://github.com/neulab/lrlm.git
+ cd lrlm
+ pip install -r requirements.txt
+ ```
+   To avoid incorrect or conflicting package versions, we recommend installing the dependencies inside a virtual
+   environment, for example as sketched below.
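+
+   A minimal sketch using Python's built-in `venv` module (conda or virtualenv work just as well; the environment
+   name `lrlm-env` is only an example):
+   ```bash
+   # Create and activate an isolated environment (the name is arbitrary),
+   # then run the `pip install` command above inside it.
+   python3 -m venv lrlm-env
+   source lrlm-env/bin/activate
+   pip install -r requirements.txt
+   ```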
+
+2. Download the required files, including the datasets, unknown word probabilities, and (for inference only)
+   pre-trained model weights:
+ ```bash
+   # WikiText-S/F datasets
+ wget https://github.com/neulab/lrlm/releases/download/v1.0/wikitext.tar.bz2
+
+ # Wikifacts dataset
+ wget https://github.com/neulab/lrlm/releases/download/v1.0/wikfacts.tar.bz2
+
+ # Transformer-XL on WikiText-F model weights
+ wget https://github.com/neulab/lrlm/releases/download/v1.0/t-xl_wt-f_model17.pt
+
+ # Transformer-XL on WikiText-S model weights
+ wget https://github.com/neulab/lrlm/releases/download/v1.0/t-xl_wt-s_model17.pt
+ ```
+   Note that all of the resources listed above are available under [releases](https://github.com/neulab/lrlm/releases).
+
+   Please contact us if you need pretrained models for other configurations.
+
+   The fastText model weights can be downloaded from [Google Drive](https://drive.google.com/file/d/1zBBMnhYEMWXAS0QK3Wg2q_fENcKLAXTE/view?usp=sharing).
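+
+   The datasets ship as bzip2-compressed tarballs and need to be unpacked after downloading. A minimal sketch,
+   assuming a `data/` directory (the actual location expected by the training scripts may differ):
+   ```bash
+   # Unpack a downloaded dataset archive; repeat for the other archives.
+   mkdir -p data
+   tar -xjf wikitext.tar.bz2 -C data
+   ```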
+
+3. To train a new model, run one of the scripts in the `scripts/` directory, e.g.
+   `scripts/train_lrlm_transformer_wikitext_short.sh`. This creates an experiment directory named
+   `lrlm-transformer-wikitext-short` under the working directory, containing the following files:
+
+   - `model<k>.pt`: The model checkpoint after the k-th epoch. A checkpoint is only saved when the validation results
+     for that epoch improve over those of previous epochs.
+ - `.txt`: The training log file.
+ - `