AUTHOR=Chen Lin, Gong Saijun, Shi Xiaoyu, Shang Mingsheng
TITLE=Dynamical Conventional Neural Network Channel Pruning by Genetic Wavelet Channel Search for Image Classification
JOURNAL=Frontiers in Computational Neuroscience
VOLUME=15
YEAR=2021
URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2021.760554
DOI=10.3389/fncom.2021.760554
ISSN=1662-5188
ABSTRACT=Neural network pruning is critical to alleviating the high computational cost of deep neural networks on resource-limited devices. Conventional network pruning methods compress the network based on hand-crafted rules with a pre-defined pruning ratio, which fail to consider the variety of channels among different layers, thus resulting in a sub-optimal pruned model. To alleviate this issue, this paper proposes a genetic-wavelet-channel-search (GWCS) based pruning framework, where the pruning process is modeled as a multi-stage genetic optimization procedure. Its main ideas are two-fold: (1) We encode all the channels of the pretrained network and divide them into multiple searching spaces according to the different functional convolutional layers, from concrete to abstract. (2) We develop a wavelet-channel-aggregation-based fitness function to explore the most representative and discriminative channels at each layer and prune the network dynamically. In the experiments, the proposed GWCS is evaluated on the CIFAR-10, CIFAR-100, and ImageNet datasets with two popular deep convolutional neural network families (ResNet and VGGNet). The results demonstrate that GWCS outperforms state-of-the-art pruning algorithms in both accuracy and compression rate. Notably, GWCS reduces FLOPs by more than 73.1% when pruning ResNet-32, with a 0.79% accuracy improvement on CIFAR-100.
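
The abstract describes channel pruning as a genetic search over channel subsets, guided by a fitness function that rewards keeping representative channels while shrinking the network. The following is a minimal illustrative sketch of that general idea, not the authors' GWCS algorithm: it evolves a binary keep/prune mask over one layer's channels using hypothetical per-channel importance scores and a toy fitness that trades off retained importance against the number of kept channels (the paper's actual fitness is wavelet-channel-aggregation based and multi-stage).

```python
import random

def fitness(mask, importance, cost=0.5):
    # Toy fitness: total importance of kept channels minus a
    # per-channel cost that rewards a smaller pruned model.
    kept = sum(importance[i] for i, m in enumerate(mask) if m)
    return kept - cost * sum(mask)

def mutate(mask, rate=0.1):
    # Flip each keep/prune bit with a small probability.
    return [1 - m if random.random() < rate else m for m in mask]

def crossover(a, b):
    # Single-point crossover of two channel masks.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def genetic_channel_search(importance, pop_size=20, generations=30, seed=0):
    # Evolve binary masks over the channels of one layer.
    random.seed(seed)
    n = len(importance)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, importance), reverse=True)
        elite = pop[: pop_size // 2]           # keep the fitter half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=lambda m: fitness(m, importance))

# Hypothetical importance scores for six channels of one layer.
importance = [0.9, 0.1, 0.8, 0.05, 0.7, 0.2]
best = genetic_channel_search(importance)
```

In this sketch the searched mask should do at least as well as the unpruned layer (the all-ones mask) under the same fitness; the paper's framework additionally partitions channels into per-layer search spaces and prunes stage by stage.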