Stock Price Forecasting Using Gaussian-ADAM Optimized Long Short-Term Memory Neural Networks
Keywords:
LSTM, Forecasting, Stock market, Adam optimizer, Itô derivative
Abstract
This study introduces a novel optimization algorithm, ItoAdam, which integrates stochastic differential calculus, specifically Itô's lemma, into the standard Adam optimizer. Although Adam is widely used for training deep learning models, it may fail to converge reliably in complex, non-convex settings. To address this, ItoAdam injects Brownian noise into the gradient updates, enabling probabilistic exploration of the loss surface and improving the optimizer's ability to escape poor local minima. ItoAdam is applied to train Long Short-Term Memory (LSTM) neural networks for stock price forecasting on historical data from 13 major companies, including Google, Nvidia, Apple, Microsoft, and JPMorgan. Theoretical analysis confirms the convergence of the proposed method under mild conditions, supporting its robustness for deep learning applications. In addition, a Differential Evolution (DE) algorithm is employed to automatically optimize critical LSTM hyperparameters such as hidden size, number of layers, bidirectionality, and noise standard deviation. Experimental results show that the ItoAdam-LSTM model consistently outperforms the standard Adam-LSTM approach on evaluation metrics including RMSE, MAE, and $R^2$. A detailed sensitivity analysis reveals that optimal forecasting accuracy is typically achieved when the noise standard deviation lies between $2.1 \times 10^{-4}$ and $2.9 \times 10^{-4}$. These findings highlight the effectiveness of combining Itô-driven stochastic optimization with evolutionary search and recurrent architectures for robust financial time series prediction in noisy, nonstationary environments.
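The abstract does not give the exact ItoAdam update rule, but the core idea it describes, a standard Adam step perturbed by an additive Gaussian (Brownian-increment-style) term, can be sketched as follows. Everything here is an illustrative assumption: the function name `ito_adam_step`, the default hyperparameters, and the way the step-size scaling of the Brownian increment is folded into `noise_std` (whose default is chosen inside the $2.1 \times 10^{-4}$ to $2.9 \times 10^{-4}$ band the abstract reports as optimal). It is a minimal sketch of the technique, not the authors' implementation.

```python
import numpy as np

def ito_adam_step(theta, grad, m, v, t,
                  lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8,
                  noise_std=2.5e-4, rng=None):
    """One ItoAdam-style update (illustrative sketch): the usual Adam step
    plus an additive Gaussian perturbation acting as a Brownian increment.
    Any sqrt(dt) scaling of the increment is assumed absorbed into noise_std."""
    if rng is None:
        rng = np.random.default_rng()
    m = beta1 * m + (1.0 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1.0 - beta1 ** t)                # bias corrections
    v_hat = v / (1.0 - beta2 ** t)
    dW = rng.normal(0.0, noise_std, size=theta.shape)  # Brownian-style noise
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps) + dW
    return theta, m, v

# Toy usage: minimize f(x) = ||x||^2 with the noise-injected updates.
theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
rng = np.random.default_rng(0)
for t in range(1, 201):
    grad = 2.0 * theta                            # gradient of ||x||^2
    theta, m, v = ito_adam_step(theta, grad, m, v, t, rng=rng)
```

Because the perturbation is additive and zero-mean, the update behaves like plain Adam on average while retaining a small chance of stepping out of a shallow basin, which is the escape-from-poor-local-minima behavior the abstract attributes to the method.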
Published
2025-10-06
How to Cite
Lahbabi, H., Bouhanch, Z., El moutaouakil, K., Palade, V., & Patriciu, A.-M. (2025). Stock Price Forecasting Using Gaussian-ADAM Optimized Long Short-Term Memory Neural Networks. Statistics, Optimization & Information Computing. https://doi.org/10.19139/soic-2310-5070-2876
Section
Research Articles