Update README
kuangliu committed Nov 24, 2020
1 parent 5e3f990 commit 7ad1b9c
Showing 2 changed files with 12 additions and 9 deletions.
17 changes: 10 additions & 7 deletions README.md
@@ -6,6 +6,15 @@ I'm playing with [PyTorch](http://pytorch.org/) on the CIFAR10 dataset.
- Python 3.6+
- PyTorch 1.0+

## Training
```
# Start training with:
CUDA_VISIBLE_DEVICES=0 python main.py
# You can resume training manually with:
CUDA_VISIBLE_DEVICES=0 python main.py --resume --lr=0.01
```
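
Here `--resume` restarts training from a saved checkpoint rather than from scratch. A minimal sketch of what that flag typically does, assuming a hypothetical checkpoint dict stored at `./checkpoint/ckpt.pth` with keys `net`, `acc`, and `epoch` (names not confirmed by this diff):

```
import torch
import torch.nn as nn

net = nn.Linear(10, 10)  # stand-in model for illustration

# Hypothetical checkpoint layout; the actual keys in main.py may differ.
checkpoint = torch.load('./checkpoint/ckpt.pth')
net.load_state_dict(checkpoint['net'])   # restore model weights
best_acc = checkpoint['acc']             # best test accuracy recorded so far
start_epoch = checkpoint['epoch'] + 1    # continue from the next epoch
```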

## Accuracy
| Model | Acc. |
| ----------------- | ----------- |
@@ -22,11 +31,5 @@ I'm playing with [PyTorch](http://pytorch.org/) on the CIFAR10 dataset.
| [DenseNet121](https://arxiv.org/abs/1608.06993) | 95.04% |
| [PreActResNet18](https://arxiv.org/abs/1603.05027) | 95.11% |
| [DPN92](https://arxiv.org/abs/1707.01629) | 95.16% |
| [DLA](https://arxiv.org/abs/1707.06484) | 95.47% |

## Learning rate adjustment
I manually change the `lr` during training (a built-in `MultiStepLR` equivalent is sketched after this section):
- `0.1` for epoch `[0,150)`
- `0.01` for epoch `[150,250)`
- `0.001` for epoch `[250,350)`

Resume the training with `python main.py --resume --lr=0.01`
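
The same step schedule can be expressed with PyTorch's built-in `MultiStepLR` instead of hand-editing `lr`; a minimal sketch under that assumption, not the repository's code:

```
import torch.nn as nn
import torch.optim as optim

net = nn.Linear(10, 10)  # stand-in model for illustration
optimizer = optim.SGD(net.parameters(), lr=0.1,
                      momentum=0.9, weight_decay=5e-4)
# Multiply lr by 0.1 at epochs 150 and 250: 0.1 -> 0.01 -> 0.001
scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=[150, 250], gamma=0.1)

for epoch in range(350):
    # train(epoch); test(epoch) would run here
    scheduler.step()
```
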
4 changes: 2 additions & 2 deletions main.py
@@ -86,7 +86,7 @@
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=args.lr,
momentum=0.9, weight_decay=5e-4)
-scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)
+scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=200)


# Training
@@ -148,7 +148,7 @@ def test(epoch):
best_acc = acc


-for epoch in range(start_epoch, start_epoch+100):
+for epoch in range(start_epoch, start_epoch+200):
train(epoch)
test(epoch)
scheduler.step()
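
Raising `T_max` from 100 to 200 keeps the cosine period aligned with the lengthened 200-epoch loop: with `T_max=100` the schedule would anneal to zero at epoch 100 and climb back toward 0.1 afterwards, while `T_max=200` decays smoothly over the whole run. A small self-contained check of that schedule, using a stand-in model rather than the repository's network:

```
import torch.nn as nn
import torch.optim as optim

net = nn.Linear(10, 10)  # stand-in model for illustration
optimizer = optim.SGD(net.parameters(), lr=0.1,
                      momentum=0.9, weight_decay=5e-4)
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=200)

for epoch in range(200):
    optimizer.step()  # placeholder for one epoch of training
    scheduler.step()
    if epoch % 50 == 49:
        # lr = 0.05 * (1 + cos(pi * (epoch + 1) / 200)) with eta_min = 0
        print(epoch, optimizer.param_groups[0]['lr'])
```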
