An energy-efficient bagged binary neural network accelerator

S Liang, Y Lin, W He, L Zhang, M Wu, X Zhou · 2020 IEEE 3rd International Conference on Electronics Technology …, 2020 · ieeexplore.ieee.org
Recently, binary neural networks (BNNs) have been extensively studied because they address the large memory footprint and power consumption of floating-point convolutional neural networks (CNNs) while maintaining tolerable accuracy. Many BNN hardware accelerators have been proposed and have shown strong results, but substantial further reductions in memory cost and energy consumption are still needed. In this paper, we first propose the bagged binary neural network accelerator (BBNA), a fully pipelined BNN accelerator with a bagging ensemble unit that aggregates several BNN pipelines to achieve better model accuracy. In other words, the proposed architecture enables embedded devices to obtain acceptable accuracy with smaller ensembled BNNs. As a result, compared with other works, our design achieves 1.9x better energy efficiency with better performance, and the ensemble method saves more than 79% and 94% of the memory footprint, respectively, with nearly the same accuracy on the MNIST dataset.
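The bagging idea the abstract describes can be illustrated with a minimal software sketch, assuming the ensemble combines the ±1 outputs of several binarized models by majority vote (the paper's BBNA is a hardware pipeline; the function names here, `binarize`, `bnn_forward`, and `bagged_predict`, are hypothetical and only for illustration):

```python
# Illustrative sketch, not the paper's BBNA hardware: bagging-style
# aggregation of several tiny binarized classifiers by majority vote.
import random

def binarize(x):
    """Map a real value to {-1, +1}, as BNNs do for weights and activations."""
    return 1 if x >= 0 else -1

def bnn_forward(weights, inputs):
    """One binarized neuron: sign of the dot product of +/-1 weights and inputs."""
    acc = sum(w * x for w, x in zip(weights, inputs))
    return binarize(acc)

def bagged_predict(ensemble, inputs):
    """Aggregate the +/-1 votes of several BNNs; ties resolve to +1."""
    votes = sum(bnn_forward(w, inputs) for w in ensemble)
    return binarize(votes)

random.seed(0)
# Three independently trained (here: random, for illustration) binary weight vectors.
ensemble = [[random.choice((-1, 1)) for _ in range(8)] for _ in range(3)]
x = [binarize(random.gauss(0, 1)) for _ in range(8)]
print(bagged_predict(ensemble, x))  # a single +/-1 class vote
```

Because each member model stores only 1-bit weights, adding a voting unit on top of a few small BNN pipelines costs far less memory than widening a single network, which is the trade-off the abstract's memory-footprint figures refer to.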