diff --git a/README.md b/README.md
index 01a7c98..1312373 100644
--- a/README.md
+++ b/README.md
@@ -7,6 +7,8 @@ This implementation is based on [ResNeXt-DenseNet](https://github.com/D-X-Y/ResN
 - PyTorch 0.3.1
 - TorchVision 0.3.0
 
+The trained models, together with their log files, can be found on [Google Drive](https://drive.google.com/drive/folders/1lPhInbd7v3HjK9uOPW_VNjGWWm7kyS8e?usp=sharing).
+
 ## Training ImageNet
 
 ### Usage of Pruning Training:
@@ -35,14 +37,25 @@ Run resnet(100 epochs):
 python original_train.py -a resnet50 --save_dir ./snapshots/resnet50-baseline /path/to/Imagenet2012 --workers 36
 ```
 
+### Inference with the pruned model (zeroed filters kept)
+```bash
+sh scripts/inference_resnet.sh
+```
+
+### Inference with the pruned model (zeroed filters removed)
+```bash
+sh scripts/infer_pruned.sh
+```
+The pruned models can be downloaded from the Releases page.
+
 ### Scripts to reproduce the results in our paper
 To train the ImageNet model with / without pruning, see the directory `scripts` (we use 8 GPUs for training).
 
-### Inference the pruned model
+## Training CIFAR-10
 ```bash
-sh scripts/inference_resnet.sh
+sh scripts/cifar10_resnet.sh
 ```
-The trained models with log files can be found in [Google Drive](https://drive.google.com/drive/folders/1lPhInbd7v3HjK9uOPW_VNjGWWm7kyS8e?usp=sharing)
+Take care with the hyper-parameter [`layer_end`](https://github.com/he-y/soft-filter-pruning/blob/master/scripts/cifar10_resnet.sh#L4-L9), which differs for ResNets of different depths.
 
 ## Notes
 
@@ -57,6 +70,9 @@ We follow the [Facebook process of ImageNet](https://github.com/facebook/fb.resn
 Two subfolders ("train" and "val") are included in the "/path/to/ImageNet2012".
 The correspding code is [here](https://github.com/he-y/soft-filter-pruning/blob/master/pruning_train.py#L129-L130).
 
+#### FLOPs Calculation
+Refer to [this file](https://github.com/he-y/soft-filter-pruning/blob/master/utils/cifar_resnet_flop.py).
+
 ## Citation
 ```
 @inproceedings{he2018soft,
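
A note on the two inference modes added above: `scripts/inference_resnet.sh` evaluates the soft-pruned model with its zeroed filters still in place, while `scripts/infer_pruned.sh` runs the compact model after those filters are removed. The sketch below is not the repository's `infer_pruned` code; assuming a recent PyTorch, it only illustrates how all-zero filters can be located by their L1 norm and stripped from a single `Conv2d` (in a full network, the following layer's input channels must be shrunk to match).

```python
import torch
import torch.nn as nn

def kept_filter_indices(conv, eps=1e-8):
    # L1 norm of each output filter; soft-pruned filters are all-zero
    norms = conv.weight.data.abs().view(conv.out_channels, -1).sum(dim=1)
    return torch.nonzero(norms > eps).view(-1)

def shrink_conv(conv, keep):
    # Build a smaller Conv2d holding only the kept output filters
    new_conv = nn.Conv2d(conv.in_channels, keep.numel(),
                         kernel_size=conv.kernel_size, stride=conv.stride,
                         padding=conv.padding, bias=conv.bias is not None)
    new_conv.weight.data.copy_(conv.weight.data[keep])
    if conv.bias is not None:
        new_conv.bias.data.copy_(conv.bias.data[keep])
    return new_conv

# Usage: small = shrink_conv(layer, kept_filter_indices(layer))
```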
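
On the `FLOPs Calculation` pointer: the linked `utils/cifar_resnet_flop.py` tallies the cost layer by layer. As a generic reminder of the standard convolution count it relies on (not the repository's code; `conv2d_flops` is a name of our choosing, and one multiply-accumulate is counted as one FLOP):

```python
def conv2d_flops(out_h, out_w, in_c, out_c, k_h, k_w, groups=1):
    """Multiply-accumulates of one Conv2d forward pass, per input image."""
    return out_h * out_w * out_c * (in_c // groups) * k_h * k_w

# Example: a 3x3 conv mapping 3 -> 16 channels at 32x32 output resolution
print(conv2d_flops(32, 32, 3, 16, 3, 3))  # 442368
```

Pruning filters shrinks `out_c` of the current layer and `in_c` of the next, which is how the pruning rate translates into a FLOPs reduction.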