Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression
Metadata only
Date
2020
Type
Conference Paper
ETH Bibliography
yes
Abstract
In this paper, we analyze two popular network compression techniques, i.e., filter pruning and low-rank decomposition, in a unified sense. By simply changing the way the sparsity regularization is enforced, filter pruning and low-rank decomposition can be derived accordingly. This provides another flexible choice for network compression because the techniques complement each other. For example, in popular network architectures with shortcut connections (e.g., ResNet), filter pruning cannot deal with the last convolutional layer in a ResBlock while low-rank decomposition methods can. In addition, we propose to compress the whole network jointly instead of in a layer-wise manner. Our approach proves its potential as it compares favorably to the state-of-the-art on several benchmarks. Code is available at https://github.com/ofsoundof/group_sparsity.
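To make the abstract's central claim concrete, here is a minimal NumPy sketch of how the same group-sparsity penalty yields either compression scheme depending on how the groups are chosen. The function name `group_sparsity_penalty` and the factor shapes are illustrative assumptions, not the paper's implementation: grouping the rows of a flattened convolution weight zeroes whole filters (pruning), while grouping the paired columns/rows of a factorization removes rank-1 components (decomposition).

```python
import numpy as np

def group_sparsity_penalty(W, axis):
    """Group-lasso penalty: sum of the L2 norms of W's groups along `axis`."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=axis)))

rng = np.random.default_rng(0)

# A 4-D conv weight flattened to 2-D: (out_channels, in_channels * kh * kw).
W = rng.standard_normal((8, 27))

# Grouping by row (one group per output filter): driving a group to zero
# removes that filter entirely -> filter pruning.
filter_penalty = group_sparsity_penalty(W, axis=1)

# Decomposition view (illustrative): factor W ~= A @ B with inner dimension
# `rank`. Grouping column i of A together with row i of B means a zeroed
# group deletes one rank-1 component -> low-rank decomposition.
rank = 8
A = rng.standard_normal((8, rank))
B = rng.standard_normal((rank, 27))
decomp_penalty = group_sparsity_penalty(A, axis=0) + group_sparsity_penalty(B, axis=1)
```

In both cases the regularizer is the same group lasso; only the grouping of the weights changes, which is the "hinge" between the two techniques that the abstract describes.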
Publication status
published
Book title
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Publisher
IEEE
Organisational unit
03514 - Van Gool, Luc / Van Gool, Luc
Notes
Due to the Coronavirus (COVID-19), the conference was conducted virtually.