ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions

abstract

  • Convolutional neural networks (CNNs) have shown great capability in solving various artificial intelligence tasks. However, their increasing model size has raised challenges in employing them in resource-limited applications. In this work, we propose to compress deep models by using channel-wise convolutions, which replace dense connections among feature maps with sparse ones in CNNs. Based on this novel operation, we build light-weight CNNs known as ChannelNets. ChannelNets use three instances of channel-wise convolutions, namely group channel-wise convolutions, depth-wise separable channel-wise convolutions, and the convolutional classification layer. Compared to prior CNNs designed for mobile devices, ChannelNets achieve a significant reduction in the number of parameters and computational cost without loss in accuracy. Notably, our work represents an attempt to compress the fully-connected classification layer, which usually accounts for about 25 percent of the total parameters in compact CNNs. Along this new direction, we investigate the behavior of our proposed convolutional classification layer and conduct a detailed analysis. Based on this in-depth analysis, we further propose convolutional classification layers without weight sharing. This new classification layer achieves a good trade-off between fully-connected classification layers and the convolutional classification layer. Experimental results on the ImageNet dataset demonstrate that ChannelNets achieve consistently better performance compared to prior methods.
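
  To make the two core ideas concrete, below is a minimal PyTorch sketch, not the authors' released implementation: a channel-wise convolution realized as a small 1D filter slid along the channel axis (shared across spatial positions), and a convolutional classification layer that replaces global average pooling plus the fully-connected classifier with a single 3D convolution. Class names, kernel sizes, and the 1024-channel / 1000-class setting are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ChannelWiseConv(nn.Module):
    """Sketch of a channel-wise convolution: a k-weight 1D filter slid
    along the channel dimension, replacing the dense C_in * C_out
    connections of a 1x1 convolution with a sparse pattern."""
    def __init__(self, kernel_size=3):
        super().__init__()
        # A 3D conv with a (k, 1, 1) kernel mixes information across
        # channels only, using k weights in total.
        self.conv = nn.Conv3d(1, 1, kernel_size=(kernel_size, 1, 1),
                              padding=(kernel_size // 2, 0, 0), bias=False)

    def forward(self, x):        # x: (N, C, H, W)
        x = x.unsqueeze(1)       # treat channels as a depth axis: (N, 1, C, H, W)
        x = self.conv(x)         # convolve along the channel axis
        return x.squeeze(1)      # back to (N, C, H, W)

class ConvClassificationLayer(nn.Module):
    """Sketch of a convolutional classification layer: one 3D
    convolution spanning the full spatial extent, whose sliding
    positions along the channel axis produce the class scores."""
    def __init__(self, in_channels=1024, num_classes=1000, spatial=7):
        super().__init__()
        # Channel kernel size chosen so that the number of valid
        # sliding positions, in_channels - d + 1, equals num_classes.
        d = in_channels - num_classes + 1
        self.conv = nn.Conv3d(1, 1, kernel_size=(d, spatial, spatial),
                              bias=False)

    def forward(self, x):        # x: (N, C, H, W)
        x = x.unsqueeze(1)       # (N, 1, C, H, W)
        x = self.conv(x)         # (N, 1, num_classes, 1, 1)
        return x.flatten(1)      # (N, num_classes) class scores

# Usage: a channel-wise conv on a 64-channel map uses 3 weights instead
# of the 64 * 64 = 4096 weights of a dense 1x1 convolution.
x = torch.randn(2, 64, 7, 7)
print(ChannelWiseConv(kernel_size=3)(x).shape)          # (2, 64, 7, 7)
feat = torch.randn(2, 1024, 7, 7)
print(ConvClassificationLayer()(feat).shape)            # (2, 1000)
```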

published proceedings

  • IEEE Trans Pattern Anal Mach Intell

author list (cited authors)

  • Gao, H., Wang, Z., Cai, L., & Ji, S.

citation count

  • 34

complete list of authors

  • Gao, Hongyang; Wang, Zhengyang; Cai, Lei; Ji, Shuiwang

publication date

  • August 2021