A novel Mathematical Modeling for Deep Multilayer Perceptron Optimization: Architecture Optimization and Activation Functions Selection

  • Taoufyq Elansari Faculty of Sciences, Moulay Ismail University, Morocco
  • Mohammed Ouanan TSI Team, Department of Computer Sciences, Faculty of Sciences, Moulay Ismail University.
  • Hamid Bourray TSI Team, Department of Computer Sciences, Faculty of Sciences, Moulay Ismail University.

Abstract

The Multilayer Perceptron (MLP) is an artificial neural network composed of one or more hidden layers, and it is widely used across many fields and applications. The number of hidden layers, the number of neurons in each hidden layer, and the activation function employed in each layer strongly influence the convergence of MLP training algorithms. This article presents a model that jointly optimizes the structure of the multilayer perceptron and selects its activation functions, formulated as a mixed-variable optimization problem. To solve the resulting model, we use a hybrid algorithm that combines stochastic optimization with the backpropagation algorithm. Numerical results show that our algorithm achieves better complexity and execution time than several other methods in the literature.
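The abstract describes a hybrid of stochastic optimization over the discrete choices (number of hidden layers, neurons per layer, activation function per layer) and backpropagation over the continuous weights. The sketch below is only an illustrative reading of that idea, not the authors' algorithm: it stands in for the paper's stochastic component with naive random sampling of architectures, and trains each sampled candidate with backpropagation on the XOR problem. All names (`MLP`, `hybrid_search`), the activation pool, and the search ranges are assumptions made for the example.

```python
import math
import random

random.seed(0)  # reproducible sampling for the sketch

# Candidate activation functions and their derivatives (both taken at the pre-activation z)
ACTS = {
    "tanh": (math.tanh, lambda z: 1.0 - math.tanh(z) ** 2),
    "relu": (lambda z: max(0.0, z), lambda z: 1.0 if z > 0.0 else 0.0),
}

class MLP:
    """Fully connected net: sizes = [n_in, h1, ..., n_out]; acts[i] names hidden layer i's activation."""

    def __init__(self, sizes, acts):
        self.sizes, self.acts = sizes, acts
        self.W = [[[random.uniform(-1.0, 1.0) for _ in range(sizes[l])]
                   for _ in range(sizes[l + 1])] for l in range(len(sizes) - 1)]
        self.b = [[0.0] * sizes[l + 1] for l in range(len(sizes) - 1)]

    def forward(self, x):
        a, zs, activs = x, [], [x]
        for l, (W, b) in enumerate(zip(self.W, self.b)):
            z = [sum(w * ai for w, ai in zip(row, a)) + bj for row, bj in zip(W, b)]
            zs.append(z)
            if l < len(self.W) - 1:                 # hidden layer: selected activation
                f = ACTS[self.acts[l]][0]
                a = [f(v) for v in z]
            else:                                   # linear output for squared loss
                a = z
            activs.append(a)
        return zs, activs

    def train(self, data, epochs=500, lr=0.1):
        """Plain stochastic gradient descent with backpropagation; returns final loss."""
        for _ in range(epochs):
            for x, y in data:
                zs, activs = self.forward(x)
                delta = [o - t for o, t in zip(activs[-1], y)]  # dLoss/dz at the output
                for l in range(len(self.W) - 1, -1, -1):
                    a_prev = activs[l]
                    if l > 0:  # propagate the error before overwriting the weights
                        df = ACTS[self.acts[l - 1]][1]
                        prev_delta = [df(zs[l - 1][i]) *
                                      sum(self.W[l][j][i] * delta[j] for j in range(len(delta)))
                                      for i in range(len(a_prev))]
                    for j in range(len(delta)):
                        for i in range(len(a_prev)):
                            self.W[l][j][i] -= lr * delta[j] * a_prev[i]
                        self.b[l][j] -= lr * delta[j]
                    if l > 0:
                        delta = prev_delta
        return self.loss(data)

    def loss(self, data):
        return sum((self.forward(x)[1][-1][0] - y[0]) ** 2 for x, y in data) / len(data)

def hybrid_search(data, n_in, n_out, trials=20):
    """Naive stand-in for the stochastic component: sample discrete (architecture,
    activations) candidates, score each by backpropagation training, keep the best."""
    best = None
    for _ in range(trials):
        n_hidden = random.randint(1, 2)
        sizes = [n_in] + [random.randint(2, 6) for _ in range(n_hidden)] + [n_out]
        acts = [random.choice(list(ACTS)) for _ in range(n_hidden)]
        loss = MLP(sizes, acts).train(data)
        if math.isfinite(loss) and (best is None or loss < best[0]):
            best = (loss, sizes, acts)
    return best

XOR = [([0.0, 0.0], [0.0]), ([0.0, 1.0], [1.0]),
       ([1.0, 0.0], [1.0]), ([1.0, 1.0], [0.0])]
best_loss, best_sizes, best_acts = hybrid_search(XOR, 2, 1)
```

In the paper's formulation, the outer loop would be a principled stochastic optimizer over the mixed (discrete architecture, continuous weight) variables rather than uniform random sampling; the inner backpropagation step plays the same role in both.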
Published
2024-06-06
How to Cite
Elansari, T., Ouanan, M., & Bourray, H. (2024). A novel Mathematical Modeling for Deep Multilayer Perceptron Optimization: Architecture Optimization and Activation Functions Selection. Statistics, Optimization & Information Computing, 12(5), 1409-1424. https://doi.org/10.19139/soic-2310-5070-1990
Section
Research Articles