Sensitivity pruner: Filter-Level compression algorithm for deep neural networks
Recommendations
Filter Pruning via Similarity Clustering for Deep Convolutional Neural Networks
Neural Information Processing

Abstract: Network pruning is a technique to obtain a smaller, lightweight model by removing redundant structure from pre-trained models. However, existing methods are mainly based on the importance of filters in the whole network. Unlike previous methods, ...
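The similarity-clustering idea in this abstract can be illustrated with a minimal sketch (assumed details of mine, not the paper's actual algorithm; the function name and the choice of agglomerative clustering are illustrative): flatten each filter of a convolution layer, cluster the filters by cosine similarity, and keep one representative per cluster.

```python
# Illustrative sketch of similarity-based filter pruning (assumed
# details, not the paper's exact method): cluster a conv layer's
# filters by cosine similarity and keep one representative per cluster.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def cluster_prune_indices(weights: np.ndarray, n_keep: int) -> np.ndarray:
    """weights: (out_channels, in_channels, kH, kW) conv kernel tensor.
    Returns sorted indices of the filters to keep."""
    flat = weights.reshape(weights.shape[0], -1)
    # Unit-normalize rows so Euclidean distance tracks cosine distance.
    flat = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    labels = AgglomerativeClustering(n_clusters=n_keep).fit_predict(flat)
    keep = []
    for c in range(n_keep):
        members = np.where(labels == c)[0]
        centroid = flat[members].mean(axis=0)
        # Representative = cluster member closest to the centroid.
        dists = np.linalg.norm(flat[members] - centroid, axis=1)
        keep.append(members[np.argmin(dists)])
    return np.sort(np.array(keep))
```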
HRel: Filter pruning based on High Relevance between activation maps and class labels
Abstract: This paper proposes an Information Bottleneck theory-based filter pruning method that uses a statistical measure called Mutual Information (MI). The MI between filters and class labels, also called Relevance, is computed using the ...
Highlights:
- Mutual Information between filters' activation maps and ground-truth labels is utilized.
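A hedged sketch of the relevance computation (my reading of the truncated abstract, with assumed details such as summarizing each activation map by its spatial mean, which is not necessarily HRel's estimator): estimate MI between per-filter activation statistics and class labels, then treat low-MI filters as pruning candidates.

```python
# Sketch of MI-based filter relevance (assumed details, not HRel's
# exact estimator): summarize each filter's activation map by its
# spatial mean, then estimate MI against the class labels.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def filter_relevance(activations: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """activations: (n_images, n_filters, H, W) maps from one layer.
    labels: (n_images,) integer class labels.
    Returns one MI-based relevance score per filter."""
    summaries = activations.mean(axis=(2, 3))      # (n_images, n_filters)
    return mutual_info_classif(summaries, labels)

# Filters with the lowest relevance are candidates for removal, e.g.:
# to_prune = np.argsort(filter_relevance(acts, y))[:n_prune]
```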
Deep Model Compression based on the Training History
Abstract: Deep Convolutional Neural Networks (DCNNs) have shown promising performance in several visual recognition problems, which has motivated researchers to propose popular architectures such as LeNet, AlexNet, VGGNet, ResNet, and many more. These ...
Highlights:
- We propose a novel method for pruning filters from convolution layers based on the training history of a deep neural network.
- We introduce an optimization step (custom regularizer) to reduce the information loss incurred due to filter ...
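One way to read "pruning based on training history" (an assumption-laden illustration, not the authors' method; the helper names are hypothetical and the custom regularizer from the highlights is not covered by the truncated text) is to log each filter's L1 norm after every epoch and prune the filters whose norms stay persistently low.

```python
# Rough sketch of training-history-based filter selection (assumed
# details): record per-filter L1 norms each epoch, then rank filters
# by their mean norm over the whole recorded history.
import torch
import torch.nn as nn

def record_filter_norms(conv: nn.Conv2d, history: list) -> None:
    """Append the current L1 norm of each filter in `conv` to `history`."""
    with torch.no_grad():
        norms = conv.weight.abs().sum(dim=(1, 2, 3))  # one value per filter
    history.append(norms.cpu())

def weakest_filters(history: list, n_prune: int) -> torch.Tensor:
    """Indices of the n_prune filters with the lowest mean norm
    across all recorded epochs."""
    mean_norms = torch.stack(history).mean(dim=0)
    return torch.argsort(mean_norms)[:n_prune]
```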
Information

Publisher: Elsevier Science Inc., United States
Qualifiers: Research-article