
Commit

Update documentation
Maximilian Weichart committed Sep 11, 2024
1 parent ec1be7c commit feb0e55
Showing 7 changed files with 19 additions and 11 deletions.
Binary file added docs/_static/components/board.png
Binary file modified docs/_static/components/holder.png
Binary file modified docs/_static/components/queue.png
Binary file modified docs/_static/components/tetromino.png
2 changes: 1 addition & 1 deletion docs/environments/tetris.md
@@ -1,6 +1,6 @@
# Tetris

-![Tetris](../_static/components/holder.png)
+![Tetris](../_static/components/board.png)


| Description | Details |
12 changes: 9 additions & 3 deletions docs/introduction/installation.md
@@ -1,11 +1,17 @@
# Installation

-At the moment, you can install the environment by cloning the repository and running `poetry install`.
+You can install the Tetris-Gymnasium package using pip:

-> In the near future, this library will be distributed via PyPI.
+```bash
+pip install tetris-gymnasium
+```
+
+Alternatively, you can install the environment by cloning the repository and using Poetry:

-```{code-block} bash
+```bash
git clone https://github.com/Max-We/Tetris-Gymnasium.git
cd Tetris-Gymnasium
poetry install
```
+
+This method is useful if you want to contribute to the project or need the latest development version.
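
After installing with either method, a quick smoke test is to create the environment from Python. The following is a minimal sketch, not part of the committed docs; the environment ID `tetris_gymnasium/Tetris` and the import path `tetris_gymnasium.envs.tetris` are assumptions, so check the package documentation for the exact identifiers.

```python
# Minimal post-installation smoke test (env ID and import path are assumed).
import gymnasium as gym
from tetris_gymnasium.envs.tetris import Tetris  # assumed module path; importing makes the env available

env = gym.make("tetris_gymnasium/Tetris")  # assumed environment ID
observation, info = env.reset(seed=42)
print(observation)
env.close()
```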
16 changes: 9 additions & 7 deletions docs/introduction/quickstart.md
@@ -34,14 +34,16 @@ For example, the `play_interactive.py` script allows you to play the Tetris environment

## Training

-In order to do Reinforcement Learning, you need to train an agent. To give some real-world examples, the code in the files `train_lin.py` and `train_cnn.py` in the `examples` directory show how to train a DQN agent on the Tetris environment.
+To do Reinforcement Learning, you need to train an agent. The `examples` directory contains a script demonstrating how to train a DQN agent on the Tetris environment using a convolutional neural network (CNN) model.

-To run the training, you can use the following command:
+To run the training, use the following command:

-```{code-block} bash
-poetry run python examples/train_lin.py # uses a linear model
-# or
-poetry run python examples/train_cnn.py # uses convolutions
+```bash
+poetry run python examples/train_cnn.py
```

-You can refer to the [CleanRL documentation](https://docs.cleanrl.dev/rl-algorithms/dqn/) for more information on the training script. Note: If you have tracking enabled, you will be prompted to login to Weights & Biases during the first run. This behavior can be adjusted in the script.
+This script trains a DQN agent with a CNN architecture.
+
+You can refer to the [CleanRL documentation](https://docs.cleanrl.dev/rl-algorithms/dqn/) for more information on the training script.
+
+Note: If you have tracking enabled, you will be prompted to login to Weights & Biases during the first run. This behavior can be adjusted in the script or by passing the parameter `--track False`.
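
For readers who want to see the environment loop that the DQN script builds on, here is a rough sketch of a single episode with a random policy. It is not the agent from `train_cnn.py`, and, as above, the environment ID and import path are assumptions.

```python
# Sketch: one episode with random actions, using the standard Gymnasium API
# that the DQN training script builds on. Env ID and import path are assumed.
import gymnasium as gym
from tetris_gymnasium.envs.tetris import Tetris  # assumed module path

env = gym.make("tetris_gymnasium/Tetris")  # assumed environment ID
obs, info = env.reset(seed=42)
terminated = truncated = False
total_reward = 0.0

while not (terminated or truncated):
    action = env.action_space.sample()  # random policy stands in for the trained DQN agent
    obs, reward, terminated, truncated, info = env.step(action)
    total_reward += reward

print(f"Episode finished with return {total_reward}")
env.close()
```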
