Benchmarking Metaheuristics for Neural Network Optimization: A Comprehensive Study
Keywords:
Metaheuristics, Neural Network Optimization, Deep Learning, Training
Abstract
Deep learning is a machine learning approach that has been successfully applied across many domains. However, the structure and parameters of a deep network vary with the target application. Training a neural network is a highly complex task, yet it is crucial to the learning process; it is generally carried out using gradient-descent methods. Metaheuristics offer an alternative: they are approximate optimization algorithms, each grounded in its own search strategy and objective function, that have produced strong results in many fields. The aim of this work is therefore to train a deep neural network with several metaheuristics and compare the results. We evaluated the metaheuristics on six machine learning datasets. We found that particle swarm optimization (PSO) achieves the best results on the training data; however, its accuracy deteriorates on the test data, suggesting that PSO is prone to overfitting. Simulated annealing and genetic algorithms, on the other hand, generalize better to the test data. Comparing execution times, simulated annealing is the fastest, making it the algorithm with the best trade-off between execution time and accuracy.
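To make the approach concrete, here is a minimal sketch of the general idea the abstract describes: treating a network's weights as a flat parameter vector and letting a metaheuristic (here, standard PSO with an inertia weight) minimize the training loss, with no gradients involved. The network size, toy XOR dataset, and all hyperparameters are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR (purely illustrative, not one of the paper's six datasets)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

HIDDEN = 4
DIM = 2 * HIDDEN + HIDDEN + HIDDEN + 1  # W1 (2x4) + b1 (4) + W2 (4) + b2 (1)

def forward(w, X):
    """Unpack a flat weight vector and run a one-hidden-layer network."""
    W1 = w[:8].reshape(2, HIDDEN)
    b1 = w[8:12]
    W2 = w[12:16]
    b2 = w[16]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def loss(w):
    """Objective function the swarm minimizes: mean squared error."""
    return np.mean((forward(w, X) - y) ** 2)

# Standard PSO: inertia weight plus cognitive/social attraction terms
N_PARTICLES, ITERS = 30, 300
w_inertia, c1, c2 = 0.7, 1.5, 1.5

pos = rng.normal(0.0, 1.0, (N_PARTICLES, DIM))
vel = np.zeros((N_PARTICLES, DIM))
pbest = pos.copy()
pbest_val = np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()
gbest_val = pbest_val.min()

for _ in range(ITERS):
    r1 = rng.random((N_PARTICLES, DIM))
    r2 = rng.random((N_PARTICLES, DIM))
    vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val           # update each particle's personal best
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    if pbest_val.min() < gbest_val:       # update the swarm's global best
        gbest_val = pbest_val.min()
        gbest = pbest[np.argmin(pbest_val)].copy()

preds = (forward(gbest, X) > 0.5).astype(float)
print("training MSE:", round(float(gbest_val), 4))
print("training accuracy:", float((preds == y).mean()))
```

Swapping in simulated annealing or a genetic algorithm only changes how candidate weight vectors are proposed and accepted; the loss function and weight encoding stay the same, which is what makes such a comparison across metaheuristics straightforward.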
Published
2026-04-02
How to Cite
EDDAOUDI, F., LAKHBAB, H., & NAIMI, M. (2026). Benchmarking Metaheuristics for Neural Network Optimization: A Comprehensive Study. Statistics, Optimization & Information Computing. https://doi.org/10.19139/soic-2310-5070-3060
Section
Research Articles
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).