diff --git a/README.md b/README.md
index 0355db4..1cb4a12 100644
--- a/README.md
+++ b/README.md
@@ -6,20 +6,35 @@ This repository contains the code for DenseNet introduced in the following paper
 
 [Gao Huang](http://www.cs.cornell.edu/~gaohuang/)\*, [Zhuang Liu](https://liuzhuang13.github.io/)\*, [Laurens van der Maaten](https://lvdmaaten.github.io/) and [Kilian Weinberger](https://www.cs.cornell.edu/~kilian/) (\* Authors contributed equally).
+and its journal version
 
-**Now with much more memory efficient implementation!** Please check the [technical report](https://arxiv.org/pdf/1707.06990.pdf) and [code](https://github.com/liuzhuang13/DenseNet/tree/master/models) for more infomation.
+[Convolutional Networks with Dense Connectivity](http://www.gaohuang.net/papers/DenseNet_Journal.pdf) (TPAMI 2019)
+
+[Gao Huang](http://www.cs.cornell.edu/~gaohuang/), [Zhuang Liu](https://liuzhuang13.github.io/), [Geoff Pleiss](https://geoffpleiss.com/), [Laurens van der Maaten](https://lvdmaaten.github.io/) and [Kilian Weinberger](https://www.cs.cornell.edu/~kilian/).
+
+
+**Now with memory-efficient implementation!** Please check the [technical report](https://arxiv.org/pdf/1707.06990.pdf) and [code](https://github.com/liuzhuang13/DenseNet/tree/master/models) for more information.
 
 The code is built on [fb.resnet.torch](https://github.com/facebook/fb.resnet.torch).
 ### Citation
 If you find DenseNet useful in your research, please consider citing:
 
+     @article{huang2019convolutional,
+       title={Convolutional Networks with Dense Connectivity},
+       author={Huang, Gao and Liu, Zhuang and Pleiss, Geoff and Van Der Maaten, Laurens and Weinberger, Kilian},
+       journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
+       year={2019}
+     }
+
      @inproceedings{huang2017densely,
-       title={Densely connected convolutional networks},
+       title={Densely Connected Convolutional Networks},
        author={Huang, Gao and Liu, Zhuang and van der Maaten, Laurens and Weinberger, Kilian Q },
        booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
        year={2017}
      }
+
+
 ## Other Implementations
@@ -114,6 +129,8 @@ DenseNet-BC (L=190, k=40)|25.6M |- |**3.46** | -|**17.18**
 
 ## Results on ImageNet and Pretrained Models
 
 ### Torch
+Note: the pre-trained models in Torch are deprecated and no longer maintained. Please use PyTorch's pre-trained [DenseNet models](https://pytorch.org/vision/stable/models.html) instead.
+
 #### Models in the original paper
 The Torch models are trained under the same setting as in [fb.resnet.torch](https://github.com/facebook/fb.resnet.torch). The error rates shown are 224x224 1-crop test errors.
@@ -130,10 +147,10 @@ More accurate models trained with the memory efficient implementation in the [te
 
 | Network | Top-1 error | Torch Model |
 | ------------- | ----------- | ------------ |
-| DenseNet-264 (k=32) | 22.1 | [Download (256MB)](https://drive.google.com/file/d/0By1NwtA2JPGzdVRqOEotMUZrbTA/view?usp=sharing)
-| DenseNet-232 (k=48) | 21.2 | [Download (426MB)](https://drive.google.com/open?id=0By1NwtA2JPGzdkRDaWQ5M3VHTDg)
-| DenseNet-cosine-264 (k=32) | 21.6 | [Download (256MB)](https://drive.google.com/file/d/0By1NwtA2JPGzRDhxWGo2a3pOTjA/view?usp=sharing)
-| DenseNet-cosine-264 (k=48) | 20.4 | [Download (557MB)](https://drive.google.com/file/d/0By1NwtA2JPGzcnFDSE1HQVh4c0k/view?usp=sharing)
+| DenseNet-264 (k=32) | 22.1 | [Download (256MB)](https://drive.google.com/file/d/0By1NwtA2JPGzdVRqOEotMUZrbTA/view?usp=sharing&resourcekey=0-5D_u52k6wCy6doaLwdBfqw)
+| DenseNet-232 (k=48) | 21.2 | [Download (426MB)](https://drive.google.com/file/d/0By1NwtA2JPGzdkRDaWQ5M3VHTDg/view?usp=sharing&resourcekey=0-yl4XWJ2J6GZaF6RPj43QHQ)
+| DenseNet-cosine-264 (k=32) | 21.6 | [Download (256MB)](https://drive.google.com/file/d/0By1NwtA2JPGzRDhxWGo2a3pOTjA/view?usp=sharing&resourcekey=0-AOIBvppNz9cdDcuKvH7_aQ)
+| DenseNet-cosine-264 (k=48) | 20.4 | [Download (557MB)](https://drive.google.com/file/d/0By1NwtA2JPGzcnFDSE1HQVh4c0k/view?usp=sharing&resourcekey=0-uAyd9bsTas2twCzAZ1DUpA)
 
 ### Caffe
@@ -173,11 +190,15 @@ Thus, for practical use, we suggest picking one model from those Wide-DenseNet-B
 
 ## Updates
-**08/23/2017:**
+**12/10/2019:**
+
+1. Journal version (accepted by IEEE TPAMI) released.
+
+08/23/2017:
 
 1. Add supporting code, so one can simply *git clone* and run.
 
-**06/06/2017:**
+06/06/2017:
 
 1. Support **ultra memory efficient** training of DenseNet with *customized densely connected layer*.
@@ -205,6 +226,6 @@ Thus, for practical use, we suggest picking one model from those Wide-DenseNet-B
 
 ## Contact
 liuzhuangthu at gmail.com
-gh349 at cornell.edu
+gaohuang at tsinghua.edu.cn
 
 Any discussions, suggestions and questions are welcome!