Dec 24, 2018 · We introduce dynamic runtime pruning of feature maps and show that 10% of dynamic feature map execution can be removed without loss of accuracy.
Oct 22, 2021 · We present a novel method to dynamically prune feature maps at runtime, reducing bandwidth by up to 11.5% without loss of accuracy for image classification.
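The abstracts above state the result but not the mechanism, so here is a minimal sketch of one plausible reading: rank each image's feature maps by mean activation magnitude at runtime and zero out the weakest ones, so an accelerator could skip their memory traffic. The function name, the magnitude criterion, and the keep_ratio parameter are illustrative assumptions, not the papers' published method.

```python
import torch

def prune_feature_maps(x, keep_ratio=0.9):
    """Zero out the weakest feature maps of each image at runtime (sketch).

    x: activation tensor of shape (N, C, H, W).
    keep_ratio: fraction of channels kept per image; 0.9 mirrors the
    "remove 10% of dynamic feature map execution" claim above.
    """
    c = x.shape[1]
    saliency = x.abs().mean(dim=(2, 3))       # (N, C): per-map magnitude
    k = max(1, int(c * keep_ratio))
    keep = saliency.topk(k, dim=1).indices    # indices of maps to keep
    mask = torch.zeros_like(saliency)
    mask.scatter_(1, keep, 1.0)               # 1 for kept maps, 0 for pruned
    return x * mask[:, :, None, None]         # pruned maps become all-zero
```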
[PDF] Dynamic Runtime Feature Map Pruning - Semantic Scholar
This work analyzes parameter sparsity of six popular convolutional neural networks and introduces dynamic runtime pruning of feature maps, showing that 10% of dynamic feature map execution can be removed without loss of accuracy.
In this paper, we analyze feature map sparsity for several popular convolutional neural networks. When considering run-time behavior, we find a good probability ...
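To make this kind of analysis concrete, here is a small sketch that measures run-time feature-map sparsity (the fraction of exact zeros after each ReLU) with forward hooks. Using torchvision's resnet18 and a random input is an assumption for illustration; the papers analyze several specific networks on real data.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

# Record the fraction of exact zeros in each ReLU output at inference time.
sparsity = {}

def make_hook(name):
    def hook(module, inputs, output):
        sparsity[name] = (output == 0).float().mean().item()
    return hook

model = resnet18(weights=None).eval()
for name, module in model.named_modules():
    if isinstance(module, nn.ReLU):
        module.register_forward_hook(make_hook(name))

with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))

for name, s in sorted(sparsity.items()):
    print(f"{name}: {s:.1%} zeros")
```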
In this paper, we propose a deep reinforcement learning (DRL) based framework to efficiently perform runtime channel pruning on convolutional neural ...
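The DRL framework itself is not spelled out in this snippet; the sketch below only shows the shape of the decision such an agent would make at runtime: observe a pooled summary of the incoming feature maps and emit per-channel keep/drop actions. The ChannelGate class, the linear policy, and the 0.5 threshold are hypothetical placeholders; the actual framework would train the policy with a reward trading accuracy against computation, which is omitted here.

```python
import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    """Per-image channel gating of the kind a DRL pruning policy could learn."""

    def __init__(self, channels):
        super().__init__()
        self.policy = nn.Linear(channels, channels)

    def forward(self, x):                    # x: (N, C, H, W)
        state = x.mean(dim=(2, 3))           # global summary of each channel
        logits = self.policy(state)
        action = (torch.sigmoid(logits) > 0.5).float()  # keep/drop per channel
        return x * action[:, :, None, None]
```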
We propose a dynamic channel-pruning method that dynamically identifies and removes less important filters based on a redundancy analysis of its feature maps.
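As a hedged illustration of what a redundancy analysis of feature maps could look like, the sketch below flags maps that are near-duplicates (high cosine similarity) of an earlier map for a single input; a dynamic pruner could then drop those channels for that input. The function name and the 0.95 threshold are assumptions, not the paper's definition of redundancy.

```python
import torch
import torch.nn.functional as F

def redundant_channels(x, threshold=0.95):
    """Flag feature maps that are near-duplicates of an earlier map (sketch).

    x: activations of one image, shape (C, H, W). A map whose cosine
    similarity to a previously kept map exceeds `threshold` is marked
    redundant and could be dropped for this input.
    """
    c = x.shape[0]
    flat = F.normalize(x.reshape(c, -1), dim=1)   # unit-norm flattened maps
    sim = flat @ flat.t()                         # (C, C) cosine similarities
    drop = torch.zeros(c, dtype=torch.bool)
    for i in range(c):
        if drop[i]:
            continue                              # duplicates of i stay marked
        drop |= (sim[i] > threshold) & (torch.arange(c) > i)
    return drop
```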
These maps are then fed into a global average-pooling block, which averages the values of each feature map producing a single number for each feature map.
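In PyTorch terms (a minimal sketch, not tied to any one of the papers above), global average pooling reduces each H×W feature map of an (N, C, H, W) activation tensor to a single scalar:

```python
import torch
import torch.nn as nn

x = torch.randn(8, 64, 14, 14)    # batch of 8 images, 64 feature maps each
gap = nn.AdaptiveAvgPool2d(1)     # global average pooling over each map
summary = gap(x).flatten(1)       # (8, 64): one number per feature map
```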
High bandwidth requirements are an obstacle for accelerating the training and inference of deep neural networks. Most previous research focuses on reducing ...