DDROP: Encouraging Filter Sparsity in Convolutional Neural Networks via Dynamic Dropout for Prunable Models

Keywords: Pruning, Convolutional Neural Networks, Dropout, Regularization

Abstract

In this paper, we introduce a novel dynamic dropout (DDROP) method for inducing sparsity in deep neural networks. Our method works by adaptively dropping neurons or filters during the training phase. Unlike other pruning techniques, DDROP determines the probability of dropping each neuron based on its importance, using a ranking criterion derived from its activation statistics. Furthermore, we incorporate L1 regularization to suppress the least important neurons, further enhancing the dynamic pruning process. We evaluate the proposed method on standard datasets such as CIFAR-10, CIFAR-100, and ILSVRC2012 across various network architectures, showing consistent improvements in the accuracy of the pruned models compared to other techniques. These results highlight DDROP's promise as a strategy for building efficient deep neural networks and its ability to achieve structured sparsity, reducing model complexity while maintaining satisfactory performance.
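The abstract describes per-filter drop probabilities derived from an activation-based importance ranking, combined with an L1 penalty on the least important filters. The following is a minimal sketch of that idea, not the authors' implementation: it assumes the mean absolute activation as the importance score, an exponential moving average over batches, and a linear mapping from importance rank to drop probability; the names DynamicFilterDropout, p_max, and l1_penalty_on_weakest are hypothetical.

import torch
import torch.nn as nn

class DynamicFilterDropout(nn.Module):
    # Hypothetical sketch of importance-driven filter dropout; not the paper's exact algorithm.
    def __init__(self, num_filters, p_max=0.5, momentum=0.9):
        super().__init__()
        self.p_max = p_max            # upper bound on any filter's drop probability
        self.momentum = momentum      # smoothing factor for the running importance score
        self.register_buffer("importance", torch.zeros(num_filters))

    def forward(self, x):             # x: (N, C, H, W) feature maps from a conv layer
        if not self.training:
            return x
        # Importance = running mean of per-filter activation magnitude (assumed criterion).
        score = x.detach().abs().mean(dim=(0, 2, 3))
        self.importance.mul_(self.momentum).add_((1.0 - self.momentum) * score)
        # Rank filters by importance; lower rank (less important) -> higher drop probability.
        ranks = self.importance.argsort().argsort().float()
        denom = max(x.size(1) - 1, 1)
        p_drop = self.p_max * (1.0 - ranks / denom)
        keep = (torch.rand_like(p_drop) > p_drop).float().view(1, -1, 1, 1)
        # Inverted-dropout scaling keeps the expected activation unchanged.
        return x * keep / (1.0 - p_drop.view(1, -1, 1, 1)).clamp(min=1e-6)

def l1_penalty_on_weakest(conv, importance, k=8, weight=1e-4):
    # L1 term pushing the k least important filters of a Conv2d toward zero (illustrative only).
    idx = importance.argsort()[:k]
    return weight * conv.weight[idx].abs().sum()

In a training loop, a module of this kind would sit after each convolution, with l1_penalty_on_weakest(conv, drop.importance) added to the task loss so that consistently unimportant filters are both dropped more often and shrunk toward zero, making them easier to prune after training.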
Published
2025-06-01
How to Cite
Toulaoui, A., Barik, M., Khalfi, H., & Hafidi, I. (2025). DDROP: Encouraging Filter Sparsity in Convolutional Neural Networks via Dynamic Dropout for Prunable Models. Statistics, Optimization & Information Computing. Retrieved from http://47.88.85.238/index.php/soic/article/view/2535
Section
I2CEAI24