DDROP: Encouraging Filter Sparsity in Convolutional Neural Networks via Dynamic Dropout for Prunable Models
Keywords:
Pruning, Convolutional Neural Networks, Dropout, Regularization
Abstract
In this paper, we introduce DDROP, a novel dynamic-dropout-based method for inducing sparsity in deep neural networks. The method works by adaptively dropping neurons or filters during training. Unlike other pruning techniques, DDROP sets each neuron's drop probability according to its importance, using a ranking criterion derived from its activation statistics. We further incorporate l1 regularization to suppress the least important neurons, reinforcing the dynamic pruning process. We evaluate the proposed method on standard datasets (CIFAR-10, CIFAR-100, and ILSVRC2012) and on various network architectures, showing consistent improvements in the accuracy of the pruned models compared to other techniques. These results highlight DDROP's promise as a strategy for efficient deep neural networks and its ability to achieve structured sparsity, reducing model complexity while maintaining satisfactory performance.
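The abstract does not give the exact formulation, but the core idea (rank filters by activation statistics, drop the least important ones with higher probability, and add an L1 penalty) can be illustrated with a minimal NumPy sketch. The importance measure (mean absolute activation), the linear mapping to drop probabilities, and the names `ddrop_step` and `p_max` are all assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def ddrop_step(activations, weights, l1_coeff=1e-4, p_max=0.5, rng=None):
    """Hypothetical DDROP-style step: importance-ranked filter dropout
    plus an L1 penalty on the weights.

    activations: array of shape (batch, filters, H, W)
    weights:     the layer's weights, used only for the L1 term
    """
    rng = rng or np.random.default_rng(0)
    # Importance per filter: mean absolute activation over batch and space
    # (one plausible "activation statistic"; the paper may use another).
    importance = np.abs(activations).mean(axis=(0, 2, 3))
    # Normalize to [0, 1]; the least important filter gets drop prob p_max,
    # the most important one is (almost) never dropped.
    norm = (importance - importance.min()) / (np.ptp(importance) + 1e-12)
    drop_prob = p_max * (1.0 - norm)
    # Sample a keep mask and zero out dropped filters' activations
    # (inverted-dropout rescaling omitted for brevity).
    keep = rng.random(drop_prob.shape) >= drop_prob
    masked = activations * keep[None, :, None, None]
    # L1 regularization term added to the training loss to suppress
    # weak filters toward exact zero, enabling structured pruning later.
    l1_penalty = l1_coeff * np.abs(weights).sum()
    return masked, drop_prob, l1_penalty
```

At convergence, filters that are consistently dropped and L1-shrunk toward zero can be removed entirely, yielding the structured (filter-level) sparsity the abstract describes.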
Published
2025-06-01
How to Cite
Toulaoui, A., Barik, M., Khalfi, H., & Hafidi, I. (2025). DDROP: Encouraging Filter Sparsity in Convolutional Neural Networks via Dynamic Dropout for Prunable Models. Statistics, Optimization & Information Computing. Retrieved from http://47.88.85.238/index.php/soic/article/view/2535
Section
I2CEAI24
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).