L0-ARM: Network Sparsification via Stochastic Binary Optimization

Yang Li and Shihao Ji

Department of Computer Science

Georgia State University


Abstract

We consider network sparsification as an L0-norm regularized binary optimization problem, in which each unit of a neural network (e.g., a weight, neuron, or channel) is attached to a stochastic binary gate whose parameters are optimized jointly with the original network parameters. We investigate the Augment-Reinforce-Merge (ARM) estimator, a recently proposed unbiased gradient estimator, for this binary optimization problem. Compared to the hard concrete gradient estimator of Louizos et al., ARM demonstrates superior performance in pruning network architectures while retaining accuracies nearly identical to those of the baseline methods. Like the hard concrete estimator, ARM enables conditional computation during model training, but it is more effective because the gates are exactly binary. Thanks to the flexibility of ARM, many smooth or non-smooth parametric functions, such as a scaled sigmoid or the hard sigmoid, can parameterize this binary optimization problem while preserving the unbiasedness of the estimator; in contrast, the hard concrete estimator must rely on the hard sigmoid to achieve conditional computation and thus accelerated training. Extensive experiments on multiple public datasets demonstrate state-of-the-art pruning rates at accuracies nearly identical to those of the baseline methods. The resulting algorithm, L0-ARM, also sparsifies the Wide-ResNet models on CIFAR-10 and CIFAR-100, which the hard concrete estimator cannot. We plan to release our code to facilitate research in this area.
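For concreteness, the following is a minimal NumPy sketch (not the authors' released code) of the ARM estimator applied to a single stochastic binary gate z ~ Bernoulli(sigmoid(phi)). The toy loss f and all names are hypothetical placeholders standing in for the network's loss restricted to that gate; the point is that the gradient with respect to phi is estimated without backpropagating through the binary sample.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def f(z):
    # Hypothetical per-gate loss (placeholder for the network's data loss):
    # it is smaller when the gate is closed (z = 0).
    return (z - 0.2) ** 2

def arm_gradient(phi, num_samples=100000, seed=0):
    """Unbiased ARM estimate of d/dphi E_{z ~ Bernoulli(sigmoid(phi))}[f(z)]."""
    u = np.random.default_rng(seed).uniform(size=num_samples)
    # Two antithetic binary evaluations that share the same uniform noise u.
    f_plus = f((u > sigmoid(-phi)).astype(float))
    f_minus = f((u < sigmoid(phi)).astype(float))
    return np.mean((f_plus - f_minus) * (u - 0.5))

phi = 0.3
# Closed-form gradient for this toy f, used only to sanity-check the estimate:
# E[f(z)] = sigmoid(phi) * f(1) + (1 - sigmoid(phi)) * f(0), so
# d/dphi E[f(z)] = sigmoid(phi) * (1 - sigmoid(phi)) * (f(1) - f(0)).
exact = sigmoid(phi) * (1.0 - sigmoid(phi)) * (f(1.0) - f(0.0))
print("ARM estimate:", arm_gradient(phi), "exact:", exact)
```

In this sketch the sigmoid could be replaced by another parametric function of phi (e.g., a scaled or hard sigmoid), which is the flexibility the abstract refers to.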


Demo

Visualization of a subset of the neurons in the convolutional layer and the fully-connected layer of LeNet-5-Caffe sparsified by L0-ARM. For computational efficiency, only neuron-level (rather than weight-level) sparsification is considered; a minimal sketch of neuron-level gating is given below the figures. See the paper for more details.

[Figure: conv-layer neurons]
[Figure: fully-connected-layer neurons]
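To illustrate what neuron-level gating means in code, here is a short NumPy sketch (an illustrative assumption, not the paper's implementation) that attaches one Bernoulli gate per output neuron of a fully-connected layer; all shapes and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical fully-connected layer: 8 inputs, 5 output neurons,
# with one gate logit phi_j per output neuron (not per weight).
W = rng.normal(size=(8, 5))
b = np.zeros(5)
phi = rng.normal(size=5)

x = rng.normal(size=(4, 8))                              # a batch of 4 inputs
z = (rng.uniform(size=5) < sigmoid(phi)).astype(float)   # sample binary gates
h = (x @ W + b) * z                                      # a closed gate removes a whole neuron

# Expected L0 penalty under Bernoulli gates: the sum of gate-open
# probabilities, added to the data loss during training.
expected_l0 = sigmoid(phi).sum()
print("open gates:", z, "expected L0 penalty:", expected_l0)
```

Gating whole neurons (or channels) rather than individual weights is what allows pruned units to be removed from the architecture, yielding actual computational savings.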

Paper

Yang Li and Shihao Ji, "L0-ARM: Network Sparsification via Stochastic Binary Optimization," European Conference on Machine Learning (ECML, acceptance rate 18%), Würzburg, Germany, September 2019. [arXiv]