Brain Networks Laboratory (Choe Lab)

SparseNet: A Sparse DenseNet for Image Classification

Jun 14, 2018

Abstract: Deep neural networks have made remarkable progress on various computer vision tasks. Recent work has shown that the depth, width, and shortcut connections of a network are all vital to its performance. In this paper, we introduce a method to sparsify DenseNet that reduces the connections of an L-layer DenseNet from O(L^2) to O(L), allowing us to simultaneously increase the depth, width, and connectivity of neural networks in a more parameter- and computation-efficient way. Moreover, an attention module is introduced to further boost the network's performance. We call our network SparseNet. We evaluate SparseNet on the CIFAR (CIFAR-10 and CIFAR-100) and SVHN datasets. Experiments show that SparseNet improves over the state of the art on CIFAR-10 and SVHN. Furthermore, while achieving performance comparable to DenseNet on these datasets, SparseNet is 2.6x smaller and 3.7x faster than the original DenseNet.

https://arxiv.org/abs/1804.05340v1
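
The abstract does not spell out the sparsification rule, so the sketch below is only illustrative: a PyTorch-style dense block in which each layer concatenates features from a fixed number of recent predecessors (the `max_inputs` parameter is hypothetical, not from the paper) instead of from all preceding layers. A constant fan-in per layer is one way to get O(L) total connections in an L-layer block; consult the paper itself for SparseNet's actual connection pattern and its attention module.

```python
import torch
import torch.nn as nn


class SparseBlock(nn.Module):
    """DenseNet-style block with sparsified connectivity (illustrative only).

    Instead of concatenating the outputs of *all* preceding layers
    (O(L^2) connections for L layers), each layer here reads from at
    most `max_inputs` of the most recent feature maps, so the total
    number of connections grows as O(L). The connection rule used by
    the actual SparseNet is defined in the paper, not here.
    """

    def __init__(self, num_layers, in_channels, growth_rate, max_inputs=4):
        super().__init__()
        self.in_channels = in_channels
        self.growth_rate = growth_rate
        self.max_inputs = max_inputs
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            n_in = self._num_input_channels(i)
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(n_in),
                nn.ReLU(inplace=True),
                nn.Conv2d(n_in, growth_rate, kernel_size=3,
                          padding=1, bias=False),
            ))

    def _predecessors(self, i):
        # Indices into the running feature list: 0 is the block input,
        # k >= 1 is the output of layer k-1. Keeping only the most
        # recent `max_inputs` entries fixes each layer's fan-in.
        return list(range(max(0, i + 1 - self.max_inputs), i + 1))

    def _num_input_channels(self, i):
        return sum(self.in_channels if p == 0 else self.growth_rate
                   for p in self._predecessors(i))

    def forward(self, x):
        feats = [x]
        for i, layer in enumerate(self.layers):
            selected = [feats[p] for p in self._predecessors(i)]
            feats.append(layer(torch.cat(selected, dim=1)))
        return feats[-1]


# Quick shape check: an 8-layer block on CIFAR-sized inputs.
block = SparseBlock(num_layers=8, in_channels=16, growth_rate=12)
out = block(torch.randn(2, 16, 32, 32))
print(out.shape)  # torch.Size([2, 12, 32, 32])
```

With a constant fan-in, an L-layer block has at most max_inputs * L connections, versus L(L+1)/2 for full DenseNet connectivity, which is where the O(L^2)-to-O(L) reduction in the abstract comes from.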
