Awesome Pruning

A curated list of neural network pruning and related resources. Inspired by awesome-deep-vision, awesome-adversarial-machine-learning, awesome-deep-learning-papers and Awesome-NAS.

Please feel free to submit a pull request or open an issue to add papers.

Table of Contents

- Type of Pruning
- 2019
- 2018
- 2017
- 2016
- 2015
- Related Repo

Type of Pruning

| Type        | F              | W              | Other       |
|:-----------:|:--------------:|:--------------:|:-----------:|
| Explanation | Filter pruning | Weight pruning | Other types |
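
To make the F/W distinction concrete, below is a minimal sketch using PyTorch's `torch.nn.utils.prune` utilities (the choice of PyTorch and the 30%/50% pruning ratios are illustrative assumptions, not taken from any listed paper): unstructured L1 pruning removes individual weights (type W), while structured Ln pruning along the output-channel dimension removes whole filters (type F).

```python
# Minimal sketch contrasting weight (W) and filter (F) pruning with
# torch.nn.utils.prune. Illustrative only; not any listed paper's method.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

conv_w = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3)
conv_f = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3)

# W (weight pruning): zero the 30% of individual weights with the smallest
# absolute value, regardless of where they sit in the tensor.
prune.l1_unstructured(conv_w, name="weight", amount=0.3)

# F (filter pruning): zero 50% of entire filters (slices along dim=0, the
# output-channel dimension), ranked by their L2 norm.
prune.ln_structured(conv_f, name="weight", amount=0.5, n=2, dim=0)

# Resulting sparsity: scattered zeros for W, whole zeroed channels for F.
print((conv_w.weight == 0).float().mean())                   # ~0.30
print((conv_f.weight == 0).float().mean())                   # ~0.50
print((conv_f.weight.abs().sum(dim=(1, 2, 3)) == 0).sum())   # 16 pruned filters
```

In both cases the pruned entries are held at zero by a mask (e.g. `conv_f.weight_mask`), and the network is typically fine-tuned afterwards to recover accuracy.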

2019

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration | CVPR (Oral) | F | github |
| Towards Optimal Structured CNN Pruning via Generative Adversarial Learning | CVPR | F | github |
| Centripetal SGD for Pruning Very Deep Convolutional Networks with Complicated Structure | CVPR | F | github |
| On Implicit Filter Level Sparsity in Convolutional Neural Networks, Extension1, Extension2 | CVPR | F | github |
| Structured Pruning of Neural Networks with Budget-Aware Regularization | CVPR | F | - |
| Importance Estimation for Neural Network Pruning | CVPR | F | github |
| OICSR: Out-In-Channel Sparsity Regularization for Compact Deep Neural Networks | CVPR | F | - |
| Partial Order Pruning: for Best Speed/Accuracy Trade-off in Neural Architecture Search | CVPR | Other | github |
| Variational Convolutional Neural Network Pruning | CVPR | - | - |
| The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks | ICLR (Best) | W | github |
| Rethinking the Value of Network Pruning | ICLR | F | github |
| Dynamic Channel Pruning: Feature Boosting and Suppression | ICLR | F | github |
| SNIP: Single-shot Network Pruning based on Connection Sensitivity | ICLR | F | github |
| Dynamic Sparse Graph for Efficient Deep Learning | ICLR | F | github |
| Collaborative Channel Pruning for Deep Networks | ICML | F | - |
| Approximated Oracle Filter Pruning for Destructive CNN Width Optimization | ICML | F | github |
| EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis | ICML | W | github |

2018

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers | ICLR | F | github |
| To prune, or not to prune: exploring the efficacy of pruning for model compression | ICLR | W | - |
| Discrimination-aware Channel Pruning for Deep Neural Networks | NIPS | F | github |
| Frequency-Domain Dynamic Pruning for Convolutional Neural Networks | NIPS | W | - |
| AMC: AutoML for Model Compression and Acceleration on Mobile Devices | ECCV | F | github |
| Data-Driven Sparse Structure Selection for Deep Neural Networks | ECCV | F | github |
| Coreset-Based Neural Network Compression | ECCV | F | github |
| Constraint-Aware Deep Neural Network Compression | ECCV | W | github |
| A Systematic DNN Weight Pruning Framework using Alternating Direction Method of Multipliers | ECCV | W | github |
| PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning | CVPR | F | github |
| NISP: Pruning Networks using Neuron Importance Score Propagation | CVPR | F | - |
| CLIP-Q: Deep Network Compression Learning by In-Parallel Pruning-Quantization | CVPR | W | - |
| “Learning-Compression” Algorithms for Neural Net Pruning | CVPR | W | - |
| Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks | IJCAI | F | github |

2017

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Pruning Filters for Efficient ConvNets | ICLR | F | github |
| Pruning Convolutional Neural Networks for Resource Efficient Inference | ICLR | F | github |
| Net-Trim: Convex Pruning of Deep Neural Networks with Performance Guarantee | NIPS | W | github |
| Learning to Prune Deep Neural Networks via Layer-wise Optimal Brain Surgeon | NIPS | W | github |
| Runtime Neural Pruning | NIPS | F | - |
| Designing Energy-Efficient Convolutional Neural Networks using Energy-Aware Pruning | CVPR | F | - |
| ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression | ICCV | F | github |
| Channel Pruning for Accelerating Very Deep Neural Networks | ICCV | F | github |
| Learning Efficient Convolutional Networks Through Network Slimming | ICCV | F | github |

2016

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding | ICLR (Best) | W | github |
| Dynamic Network Surgery for Efficient DNNs | NIPS | W | github |

2015

| Title | Venue | Type | Code |
|:------|:-----:|:----:|:----:|
| Learning both Weights and Connections for Efficient Neural Networks | NIPS | W | github |

Related Repo

Awesome-model-compression-and-acceleration

EfficientDNNs

Embedded-Neural-Network

awesome-AutoML-and-Lightweight-Models

Model-Compression-Papers

knowledge-distillation-papers

Network-Speed-and-Compression
