Statistics, Optimization & Information Computing
http://47.88.85.238/index.php/soic
Statistics, Optimization and Information Computing (SOIC) is an international refereed journal dedicated to the latest advancements in statistics, optimization and their applications in information sciences. Topics of interest include, but are not limited to:

Statistical theory and applications
- Statistical computing, simulation and Monte Carlo methods, bootstrap, resampling methods, spatial statistics, survival analysis, nonparametric and semiparametric methods, asymptotics, Bayesian inference and Bayesian optimization
- Stochastic processes, probability, statistics and applications
- Statistical methods and modeling in the life sciences, including biomedical sciences, environmental sciences and agriculture
- Decision theory, time series analysis, high-dimensional multivariate integrals, statistical analysis in market, business, finance, insurance, economic and social science, etc.

Optimization methods and applications
- Linear and nonlinear optimization
- Stochastic optimization, statistical optimization and Markov chains, etc.
- Game theory, network optimization and combinatorial optimization
- Variational analysis, convex optimization and nonsmooth optimization
- Global optimization and semidefinite programming
- Complementarity problems and variational inequalities
- Optimal control: theory and applications
- Operations research, optimization and applications in management science and engineering

Information computing and machine intelligence
- Machine learning, statistical learning, deep learning
- Artificial intelligence, intelligent computation, intelligent control and optimization
- Data mining, data analysis, cluster computing, classification
- Pattern recognition, computer vision
- Compressive sensing and sparse reconstruction
- Signal and image processing, medical imaging and analysis, inverse problems and imaging sciences
- Genetic algorithms, natural language processing, expert systems, robotics, information retrieval and computing
- Numerical analysis and algorithms with applications in computer science and engineering

Publisher: International Academic Press | en-US | Statistics, Optimization & Information Computing | ISSN 2311-004X

Authors who publish with this journal agree to the following terms:
a. Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/) that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
b. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
c. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (see The Effect of Open Access: http://opcit.eprints.org/oacitation-biblio.html).

Survival Function Estimation based on Neutrosophic two-parameter XLindley distribution
http://47.88.85.238/index.php/soic/article/view/2229
Probability distributions are of paramount importance in probability theory and abound in applications across most scientific disciplines; they are used extensively in actuarial studies of insurance and finance, in the medical field, agriculture, demography and econometric analysis. The focus of the current work is to introduce a novel extension termed the neutrosophic two-parameter XLindley (NTPXL) distribution. Several mathematical characteristics relevant to lifetime modelling are developed and studied, including the survival and hazard functions, the moment-generating function, and measures of mean, variance, standard deviation, skewness and kurtosis. The Monte Carlo method is applied to investigate the effectiveness of estimation for the NTPXL distribution. The simulation results show that estimation with satisfactory accuracy is possible when the sample size is large enough. Real data on the stay times of premature infants are used to illustrate exactly how the proposed NTPXL distribution should be applied. On the application side, it is shown that the NTPXL distribution is versatile because it can handle classical data as well as data involving uncertainty, ambiguity or imprecision.
Farooq Al-Mutar, Zakariya Algamal
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-08 | Vol. 14 No. 3, pp. 1088–1097 | DOI: 10.19139/soic-2310-5070-2229

New Transformation For The Selection Probability Under Positive Correlation Coefficient
http://47.88.85.238/index.php/soic/article/view/2417
In this paper, we suggest a new transformation for selecting a sample with probability proportional to a size measure, with replacement, under a positive correlation coefficient between the study variable and the size measure variable. The relative efficiency of the proposed estimator is studied under a super-population model, and a numerical investigation of its performance is carried out.
Naser Odat
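As background for the probability-proportional-to-size (PPS) setting studied here, the sketch below draws a with-replacement PPS sample and computes the Hansen–Hurwitz estimator of the population total. The population, the classical selection probabilities p_i = x_i / Σx, and the sample size are illustrative assumptions; the paper's new transformation of the selection probabilities is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population: y is the study variable, x the size measure
N = 500
x = rng.gamma(shape=2.0, scale=10.0, size=N)        # size measure
y = 3.0 * x + rng.normal(0.0, 5.0, size=N)          # positively correlated study variable

p = x / x.sum()                                     # classical PPS selection probabilities

def hansen_hurwitz(y, p, n, rng):
    """Draw a with-replacement PPS sample of size n and estimate the total of y."""
    idx = rng.choice(len(y), size=n, replace=True, p=p)
    return np.mean(y[idx] / p[idx])

n = 30
estimate = hansen_hurwitz(y, p, n, rng)
print(f"true total = {y.sum():.1f}, PPS estimate = {estimate:.1f}")
```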
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-11 | Vol. 14 No. 3, pp. 1098–1109 | DOI: 10.19139/soic-2310-5070-2417

Modelling and Reliability Analysis of the Two-Parameter Lindley-Binomial Distribution
http://47.88.85.238/index.php/soic/article/view/2516
The primary purpose of this research is to describe the two-parameter Lindley-Binomial (LB2) distribution, a new probability distribution applicable to the analysis of proportion data, specifically in simulation, real-data, and reliability-analysis settings. The shape of the probability mass function and some probabilistic properties of the proposed distribution, including its generating functions, are derived. The method of moments, maximum likelihood estimation, and the expectation-maximization algorithm are used for parameter estimation. The goodness of fit of the proposed distribution is assessed on a real dataset. This research also investigates the age-specific prevalence and risk pattern of Hepatitis B virus (HBV) infection within the dataset, and the proposed distribution is compared with the binomial, beta-binomial, and negative binomial distributions. The results show that the proposed distribution has advantages over previous models and is therefore useful for analyzing proportion data. Additionally, the two-parameter Lindley-Binomial distribution is fitted to the data to evaluate the reliability function, hazard rate function, inverted hazard rate function, and mean residual life (MRL) by age group. The findings demonstrate substantial differences in HBV positivity across age demographics, with important public health implications.
Mustafa Neamat Nader, Sameera Abdulsalam Othman, Kurdistan M. Taher Omar
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-05 | Vol. 14 No. 3, pp. 1110–1138 | DOI: 10.19139/soic-2310-5070-2516

McDonald Rayleigh distribution with application
http://47.88.85.238/index.php/soic/article/view/2531
Statistical modeling of many phenomena is a very important topic, especially for survival, reliability, economic and financial phenomena. Many standard probability distributions prove inadequate for modeling data sets arising from complex phenomena. In recent years, the design of new forms of probability distributions has received wide attention, using different techniques from statistical theory. In this paper, the McDonald family is used to extend the Rayleigh distribution to the McDonald Rayleigh (McR) distribution. Some theoretical statistical properties of the McR distribution are discussed, and its shape and scale parameters are estimated by maximum likelihood (ML) and E-Bayesian (EB) methods under squared error (SE) and linear exponential (Linex) loss functions with three different kinds of hyper-prior distributions. In this paper, we also propose the use of the alternating direction method for image reconstruction from highly incomplete convolution data, where an image is reconstructed as a minimizer of an energy function that sums a TV term for image regularity and a least-squares term for the data.
Safwan Rashed, Hayfa Saieed, Raya AL-Rassam
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-12 | Vol. 14 No. 3, pp. 1139–1152 | DOI: 10.19139/soic-2310-5070-2531

Parameter Estimation of a Non-Homogeneous Inverse Exponential Process Using Classical, Metaheuristic, and Deep Learning Approaches
http://47.88.85.238/index.php/soic/article/view/2601
This research adopts the inverse exponential distribution as the intensity function for modelling and estimating the occurrence rate of a non-homogeneous Poisson process (NHPP). The main objective is to compare the parameter identification capabilities of traditional methods, metaheuristic algorithms, and deep learning approaches for this newly constructed stochastic process. Parameter estimation is analysed using the classical methods of Maximum Likelihood Estimation (MLE) and Ordinary Least Squares (OLS), the Firefly Algorithm (FFA) and Grey Wolf Optimization (GWO) metaheuristics, and Long Short-Term Memory (LSTM) networks and Artificial Neural Networks (ANN) as deep learning models. Performance in the simulation experiment is evaluated with the Root Mean Squared Error (RMSE). The model is also tested on failure-time data extracted from the Mosul Dam power station for the period from January 2017 through January 2020. The results show that the ANN and LSTM models produce superior outcomes to the traditional techniques, with the ANN achieving the lowest RMSE across all sample sizes. The research thus uses hybrid intelligent methods to improve stochastic-process parameter estimation, with examples that can benefit reliability engineering and temporal data modelling.
Yaseen Oraibi
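As a reference point for the classical MLE step, the sketch below maximizes the generic NHPP log-likelihood, sum of log lambda(t_i; theta) minus Lambda(T; theta), for event times observed on (0, T]. A power-law (Weibull-process) intensity is used only as a stand-in, since the paper's inverse-exponential-based intensity is defined there; the parameter values and data are simulated.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)

# Simulate a power-law NHPP with mean-value function Lambda(t) = a * t**b on (0, T]
a_true, b_true, T = 2.0, 0.7, 36.0
n_events = rng.poisson(a_true * T**b_true)
# Given N(T) = n, event times are iid with CDF Lambda(t) / Lambda(T) = (t / T)**b
event_times = np.sort(T * rng.random(n_events) ** (1.0 / b_true))

def neg_loglik(params, t, T):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    lam = a * b * t ** (b - 1.0)                  # intensity lambda(t)
    return -(np.sum(np.log(lam)) - a * T**b)      # -[sum log lambda(t_i) - Lambda(T)]

res = minimize(neg_loglik, x0=[1.0, 1.0], args=(event_times, T), method="Nelder-Mead")
print("true (a, b):", (a_true, b_true), " MLE:", np.round(res.x, 3))
```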
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-09 | Vol. 14 No. 3, pp. 1153–1173 | DOI: 10.19139/soic-2310-5070-2601

Discretization of the Inverse Rayleigh-G Family: Theoretical Properties, Machine Learning-Based Parameter Estimation, and Practical Applications
http://47.88.85.238/index.php/soic/article/view/2618
This analysis investigates a novel two-parameter discrete distribution, namely the Discrete Inverse Rayleigh Exponential (DIRE) distribution, which is derived from the Inverse Rayleigh-G family using a survival discretization method. The DIRE distribution features adaptable probability mass and hazard rate functions, capable of exhibiting symmetric, asymmetric, monotonic, and reversed-J-shaped behaviors, making it highly suitable for modeling a wide range of real-world data. Key statistical properties, such as the mean, variance, moment-generating function, and dispersion index, are thoroughly examined. For parameter estimation, both Maximum Likelihood Estimation (MLE) and a machine learning-based K-Nearest Neighbors (K-NN) algorithm are utilized. Extensive simulations and real-world dataset analyses reveal that the DIRE distribution surpasses existing models in goodness-of-fit metrics, with the K-NN estimator demonstrating superior accuracy and robustness compared to MLE. The practical utility of the DIRE distribution is illustrated through two empirical datasets, COVID-19 case counts and failure time data, highlighting its effectiveness in managing complex discrete data. The results indicate that this new model offers improved flexibility and reliability, making it a valuable tool for statistical modeling and machine learning applications.
Adel Sufyan, Sanaa Mohsin, Kawthar Hameed, Emad Az-Zo'bi, Mohammad Tashtoush
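The survival discretization step referred to above is, in general, P(X = k) = S(k) - S(k + 1) for k = 0, 1, 2, ..., where S is the survival function of the continuous parent model. The sketch below applies that recipe to a plain inverse Rayleigh survival function as a placeholder; the exact Inverse Rayleigh-G exponential form and its parameters are defined in the paper and are not reproduced here.

```python
import numpy as np

def surv_inverse_rayleigh(x, lam):
    """Survival function of a plain inverse Rayleigh distribution (placeholder parent model)."""
    x = np.asarray(x, dtype=float)
    s = np.ones_like(x)                 # S(0) = 1 by convention
    pos = x > 0
    s[pos] = 1.0 - np.exp(-lam / x[pos] ** 2)
    return s

def discretize(surv, k_max, **params):
    """Survival discretization: P(X = k) = S(k) - S(k + 1), k = 0, 1, ..., k_max."""
    k = np.arange(k_max + 1)
    pmf = surv(k, **params) - surv(k + 1, **params)
    return k, pmf

k, pmf = discretize(surv_inverse_rayleigh, k_max=20, lam=2.5)
print("pmf head:", np.round(pmf[:5], 4), " remaining tail mass:", round(1 - pmf.sum(), 4))
```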
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-08 | Vol. 14 No. 3, pp. 1174–1197 | DOI: 10.19139/soic-2310-5070-2618

A Novel Accelerated Failure Time Model with Risk Analysis under Actuarial Data, Censored and Uncensored Application
http://47.88.85.238/index.php/soic/article/view/2627
This paper proposes a novel Accelerated Failure Time (AFT) model based on the Weighted Topp-Leone (WTLE) exponential distribution, designed for robust survival analysis under censored and uncensored actuarial and biomedical data. The AFT-WTLE model introduces flexible hazard rate shapes, validated through goodness-of-fit tests and real-world applications, including electric insulating fluid failure times and body fat percentage datasets. Parameter estimation employs maximum likelihood (MLE), Cramér-von Mises (CVM), Anderson-Darling (ADE), and their modified variants (RTADE, AD2LE), with simulation studies demonstrating RTADE's superior accuracy in bias and root mean squared error (RMSE) for small-to-moderate samples. The model's risk assessment capabilities are highlighted via Value-at-Risk (VaR), Tail VaR (TVaR), and tail mean-variance metrics, revealing RTADE and ADE as optimal for capturing extreme tail risks. A modified Nikulin-Rao-Robson (NRR) chi-square test confirms the AFT-WTLE's validity for censored data, with empirical rejection levels aligning closely with theoretical thresholds. Applications to motor failure data and Johnson's body fat dataset illustrate its practical utility in actuarial, healthcare, and engineering domains. Computational efficiency is achieved via the BB algorithm for parameter optimization. Simulation results emphasize improved estimation consistency with increasing sample sizes, particularly for RTADE in high-quantile risk metrics. This work bridges gaps in survival modeling by integrating flexible baseline hazards with advanced risk quantification tools, offering a versatile framework for analyzing complex survival data across disciplines.
Mohamed Ibrahim, Hafida Goual, Meribout Kaouter Khaoula, Abdullah H. Al-Nefaie, Ahmad M. AboAlkhair, Haitham M. Yousof
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-11 | Vol. 14 No. 3, pp. 1198–1225 | DOI: 10.19139/soic-2310-5070-2627

Using Bayesian Ridge Regression model and ESN for Climatic Time Series Forecasting
http://47.88.85.238/index.php/soic/article/view/2637
The analysis of climatic time series variables, especially evaporation forecasts influenced by diverse climate components, is essential for mitigating the hazards associated with climate change and its effects on environmental phenomena, which subsequently affect human and plant health. Weather patterns and temporal conditions are evaluated over a restricted time frame. Numerous climate variables, including temperature and humidity, are strongly interrelated, resulting in multicollinearity that undermines the efficacy of conventional linear models and induces instability and considerable variability in model parameters. Moreover, climatic data frequently display erratic variations owing to their dependence on several sources, including sensors and satellites. This produces nonlinear patterns and considerable spatial and temporal fluctuation, complicating modelling with conventional methods. There is therefore a need for statistical models that can address these issues and systematically manage the residuals and uncertainties that otherwise reduce the precision of time series forecasting. This study utilizes the Bayesian Ridge Regression (BRR) model. This Bayesian adaptation of conventional ridge regression treats the model parameters as random variables rather than constants, thereby diminishing estimation bias and enhancing stability in the presence of multicollinearity. The model offers a probabilistic representation of the outputs, providing confidence intervals for forecasts and improving the reliability of the results. Climatic time series forecasting is also affected by chaotic and nonlinear characteristics, which is where the echo state network (ESN) becomes relevant: this type of recurrent neural network excels at forecasting nonlinear time series thanks to its capacity to handle temporal dynamics and nonlinear modelling. This study integrates and hybridizes the BRR model within the ESN architecture to exploit its ability to mitigate overfitting and its structural attributes for enhanced forecast accuracy. Experimental findings indicate that the hybrid BRR-ESN model markedly surpasses traditional models in multivariate time series forecasting, confirming its efficacy in addressing the structural and climatic intricacies of evaporation data and the associated climate variables.
Raed Arif Snaan Snaan, Osamah Basheer Shukur
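To make the hybrid idea concrete, here is a minimal echo state network whose linear readout is fitted with scikit-learn's BayesianRidge instead of ordinary ridge regression. The reservoir size, spectral radius, washout length, one-step-ahead setup, and the synthetic series standing in for the climatic data are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)

# Toy univariate series (placeholder for the evaporation / climate data)
t = np.arange(2000)
series = np.sin(0.02 * t) + 0.3 * rng.standard_normal(t.size)

# --- Echo state network reservoir ---
n_res, spectral_radius = 200, 0.9
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))   # rescale to the desired spectral radius

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    states = np.zeros((len(u), n_res))
    x = np.zeros(n_res)
    for i, ui in enumerate(u):
        x = np.tanh(W_in[:, 0] * ui + W @ x)
        states[i] = x
    return states

u, y = series[:-1], series[1:]              # one-step-ahead target
X = run_reservoir(u)

split = 1500
readout = BayesianRidge()                   # Bayesian ridge readout replaces plain ridge
readout.fit(X[100:split], y[100:split])     # discard a 100-step washout
pred = readout.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"test RMSE: {rmse:.3f}")
```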
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-23 | Vol. 14 No. 3, pp. 1226–1243 | DOI: 10.19139/soic-2310-5070-2637

Adaptive Pricing Strategies in Digital Marketing
http://47.88.85.238/index.php/soic/article/view/2200
Dynamic pricing in digital marketing plays a crucial role in enabling businesses to adapt to ever-changing market conditions and meet customer demands effectively. This paper presents an improved methodology for leveraging machine learning, specifically the Deep Q-Network (DQN) model, to optimize dynamic pricing decisions in the digital marketing domain. The DQN model architecture incorporates deep neural networks and reinforcement learning algorithms to learn and optimize pricing decisions. The model is trained using hyperparameters optimized through experimentation. The results demonstrate the superiority of the DQN model over a baseline strategy, with significant improvements in revenue, profit, conversion rate, customer lifetime value, market share, and price elasticity. The findings highlight the potential of machine learning in enhancing e-marketing strategies, allowing businesses to adapt pricing decisions in real-time based on customer behavior and market dynamics. This research contributes to the growing body of knowledge on dynamic pricing and provides valuable insights for businesses seeking to leverage advanced analytics in digital marketing.
Manal Loukili, Fayçal Messaoudi, Omar El Aalouche, Raouya El Youbi, Riad Loukili
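The reinforcement-learning loop behind such a pricing agent can be illustrated with a tabular Q-learning simplification; the paper itself uses a Deep Q-Network with neural function approximation, and the price grid, demand model, reward, and hyperparameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

prices = np.linspace(5.0, 15.0, 11)            # discrete price actions (hypothetical)
n_states, n_actions = 3, len(prices)           # states: low / medium / high demand regime

def step(state, action):
    """Hypothetical environment: demand falls with price and depends on the demand regime."""
    base = [20, 40, 60][state]
    demand = max(0.0, base - 2.5 * prices[action] + rng.normal(0, 3))
    reward = prices[action] * demand           # revenue as the reward signal
    next_state = rng.integers(n_states)        # regime evolves exogenously in this toy setup
    return next_state, reward

Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1             # learning rate, discount, exploration rate

state = 0
for _ in range(20000):
    action = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    # Q-learning update toward the bootstrapped target
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print("learned price per demand regime:", prices[Q.argmax(axis=1)])
```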
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-13 | Vol. 14 No. 3, pp. 1244–1251 | DOI: 10.19139/soic-2310-5070-2200

Energy-Optimized Routing in WSNs: Enhanced Dijkstra-Based MEP and EBP Approaches
http://47.88.85.238/index.php/soic/article/view/2573
Wireless Sensor Networks (WSNs) require energy-efficient routing to extend network lifetime due to limited node battery capacity. This paper presents two improved Dijkstra-based algorithms: Minimum Energy Path (MEP) Dijkstra, which minimizes per-packet energy consumption, and Energy-Balanced Path (EBP) Dijkstra, which dynamically adjusts paths to distribute energy usage evenly, enhancing both efficiency and fairness. Through extensive simulations, we show that EBP increases network lifetime by 97.3% compared to MEP while maintaining balanced energy consumption. Our contributions include complete algorithm pseudocode with complexity analysis, an accurate energy consumption model for realistic WSN scenarios, a comparative study with 12 state-of-the-art methods, and sensitivity analysis on key parameters such as the balancing factor (β), network density, and traffic patterns. The results demonstrate significant improvements in energy efficiency, providing practical insights for WSN deployment in IoT and remote monitoring applications.
Hala Nazmy, BenBella Sayed Tawfik, Mohamed Abdallah Makhlouf, Osama Farouk
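A sketch of the energy-balanced shortest-path idea: standard Dijkstra over edge costs that combine transmission energy (the common first-order radio model) with a penalty on the receiving node's residual energy, controlled by a balancing factor beta. The cost function, radio constants, and toy topology are assumptions for illustration and may differ from the paper's EBP formulation.

```python
import heapq
import math

# First-order radio model (common WSN assumption): energy to send K_BITS bits over distance d
E_ELEC, EPS_AMP, K_BITS = 50e-9, 100e-12, 4000

def tx_energy(d):
    return E_ELEC * K_BITS + EPS_AMP * K_BITS * d ** 2

def ebp_dijkstra(nodes, edges, residual, src, dst, beta=1.0):
    """Energy-balanced Dijkstra: edge cost = tx energy scaled by the receiver's residual energy."""
    adj = {u: [] for u in nodes}
    for u, v, d in edges:                          # undirected links with distance d
        adj[u].append((v, d))
        adj[v].append((u, d))
    dist = {u: math.inf for u in nodes}
    prev = {}
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        c, u = heapq.heappop(pq)
        if c > dist[u]:
            continue
        for v, d in adj[u]:
            cost = tx_energy(d) / (residual[v] ** beta)   # penalize low-energy receivers
            if c + cost < dist[v]:
                dist[v] = c + cost
                prev[v] = u
                heapq.heappush(pq, (dist[v], v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

nodes = ["s", "a", "b", "t"]
edges = [("s", "a", 20), ("s", "b", 35), ("a", "t", 30), ("b", "t", 15), ("a", "b", 10)]
residual = {"s": 1.0, "a": 0.3, "b": 0.9, "t": 1.0}       # joules remaining (toy values)
print(ebp_dijkstra(nodes, edges, residual, "s", "t", beta=1.0))
```

Setting beta = 0 recovers a pure minimum-energy path, while larger beta values push traffic away from nearly depleted relays.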
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-23 | Vol. 14 No. 3, pp. 1252–1269 | DOI: 10.19139/soic-2310-5070-2573

A comparative study of Hilbert transform and Fourier transform methods to complex-based global minimum variance portfolio
http://47.88.85.238/index.php/soic/article/view/2586
Quantitative methods for portfolio construction are an engaging issue in mathematical finance. A number of studies have shown the role of real numbers in constructing portfolios, but very little attention has been paid to the role of complex numbers in finance. The principal objective of this project is to construct a complex-based Global Minimum Variance (GMV) portfolio and to apply a clustering method for asset selection. The findings indicate that the GMV portfolio based on the Hilbert transform generally has a lower standard deviation than the real-based GMV portfolio, whereas the GMV portfolio based on the Fourier transform shows a higher standard deviation than both the Hilbert-transform-based and real-based portfolios. Our findings show how to develop GMV portfolios with the Hilbert and Fourier transform approaches for constructing complex-based optimal portfolios.
Nurwahidah, Mawardi Bahri, Amran Rahim
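A compact sketch of the complex-valued route: scipy's Hilbert transform turns each return series into its analytic signal, a complex covariance matrix is formed, and the usual global-minimum-variance weights w = Σ⁻¹1 / (1ᴴΣ⁻¹1) are computed. The simulated returns and the final projection back to real weights are assumptions for illustration, not the paper's construction.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(7)

# Toy daily returns for 4 assets (placeholder for the selected stocks)
R = rng.multivariate_normal(np.zeros(4), 0.0001 * (np.eye(4) + 0.3), size=500)

def gmv_weights(cov):
    """Global-minimum-variance weights for a (possibly complex Hermitian) covariance matrix."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / (ones.conj() @ w)

# Real-based GMV portfolio
w_real = gmv_weights(np.cov(R, rowvar=False))

# Complex-based GMV portfolio via the analytic (Hilbert-transformed) signal
A = hilbert(R, axis=0)                              # complex analytic representation per series
w_cplx = gmv_weights(np.cov(A, rowvar=False))
w_cplx = np.real(w_cplx) / np.real(w_cplx).sum()    # illustration: project back to real weights

cov_R = np.cov(R, rowvar=False)
for name, w in [("real GMV", w_real), ("Hilbert GMV", w_cplx)]:
    sd = np.sqrt(w @ cov_R @ w)
    print(f"{name}: weights = {np.round(w, 3)}, in-sample sd = {sd:.5f}")
```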
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-22 | Vol. 14 No. 3, pp. 1270–1295 | DOI: 10.19139/soic-2310-5070-2586

The local stability of a compartmental corruption epidemic model under the impact of personal willingness
http://47.88.85.238/index.php/soic/article/view/2610
This work establishes a compartmental corruption epidemic model consisting of susceptible, exposed, corrupt, jailed, reformed, and honest compartments under the impact of personal willingness. In this model, it is assumed that corruption spreads like an infectious disease when a susceptible individual interacts with a corrupt individual, so epidemiological theory can be used to analyze the behavior of the model. The local stability analysis of the model is established, and the study shows that the corruption-free fixed point and the corruption-endemic fixed point depend on the basic reproduction number. Numerical simulations demonstrating the local stability of the corruption-free and corruption-endemic fixed points under the influence of personal willingness are conducted.
Muhafzan, Arrival Rince Putri, Noverina Alfiany, Hafizhah Artrya Hanan
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-11 | Vol. 14 No. 3, pp. 1296–1307 | DOI: 10.19139/soic-2310-5070-2610

A Multilayer Perceptron and Cost Optimization of a Single-Server Queue with Bernoulli Feedback and Customer Impatience under a Hybrid Vacation Policy
http://47.88.85.238/index.php/soic/article/view/2623
This paper deals with a finite-capacity single-server queueing system operating under a hybrid vacation policy and taking into account Bernoulli feedback, balking, reneging, and retention. When the queue empties after a normal busy period, the server switches to a working vacation; if no customers are waiting when the server returns from the working vacation, it proceeds to take a full vacation. For the analysis, a recursive method is employed to derive the system's steady-state probabilities, thereby facilitating the evaluation of key performance metrics. The numerical results are compared with analytical results and with those obtained using a soft computing technique based on a Multilayer Perceptron (MLP). Lastly, the Grey Wolf Optimizer is applied to identify the optimal service rates that minimize costs.
Houssam Eddine Hamache, Louiza Berdjoudj, Aimen Dehimi
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-24 | Vol. 14 No. 3, pp. 1308–1325 | DOI: 10.19139/soic-2310-5070-2623

Optimizing Numerical Radius Inequalities via Decomposition Techniques and Parameterized Aluthge Transforms
http://47.88.85.238/index.php/soic/article/view/2645
This manuscript presents substantial refinements to several classical inequalities connecting the numerical radius w(V), spectral radius ρ(V), and operator norm ∥V∥ for bounded linear operators acting on Hilbert spaces. Building upon inequalities established by Kittaneh [1] and the framework introduced by Yamazaki [3], we develop enhanced bounds through parameterized Aluthge transforms and contemporary decomposition methods. Our key contributions encompass: (1) refined numerical radius bounds that strengthen Kittaneh's inequality through quantifiable correction terms, (2) parameterized spectral radius inequalities for operator sums and products that significantly improve existing results, and (3) precision-enhanced bounds for commutators and anti-commutators. We provide comprehensive proofs establishing the superiority of our bounds across diverse operator classes. These refinements yield important theoretical implications in operator theory and matrix analysis, offering substantially tighter estimates of operator spread than previously attainable.
Jamal Oudetallah, Mutaz Shatnawi, Ala Amourah, Abdullah Alsoboh, Iqbal M. Batiha, Tala Sasa
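For orientation, the classical bounds being refined here, as they are commonly stated in the operator-theory literature (Kittaneh's two-sided estimate and Yamazaki's Aluthge-transform bound), can be written as follows; the paper's strengthened versions with correction terms are not reproduced.
\[
\tfrac{1}{2}\,\|V\| \;\le\; w(V) \;\le\; \|V\|, \qquad
\tfrac{1}{4}\,\bigl\|\,|V|^{2}+|V^{*}|^{2}\bigr\| \;\le\; w^{2}(V) \;\le\; \tfrac{1}{2}\,\bigl\|\,|V|^{2}+|V^{*}|^{2}\bigr\|,
\]
\[
w(V) \;\le\; \tfrac{1}{2}\bigl(\|V\| + w(\tilde V)\bigr), \qquad
\tilde V_{\lambda} \;=\; |V|^{\lambda}\, U\, |V|^{\,1-\lambda}, \quad \lambda \in (0,1),
\]
where \(V = U|V|\) is the polar decomposition, \(\tilde V = \tilde V_{1/2}\) is the usual Aluthge transform, and \(\tilde V_{\lambda}\) is its parameterized version used in the manuscript.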
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-07 | Vol. 14 No. 3, pp. 1326–1336 | DOI: 10.19139/soic-2310-5070-2645

Comparative Analysis of Control Strategies for Single-Phase PWM-CSCs Feeding Linear Loads: IDA-PBC, Nonlinear PI, and PI-PBC Approaches
http://47.88.85.238/index.php/soic/article/view/2410
This study provides an in-depth analysis and comparison of control strategies for single-phase pulse width modulation current source converters (PWM-CSCs) feeding linear loads, focusing on the implementation of interconnection and damping assignment passivity-based control (IDA-PBC), nonlinear proportional-integral (PI) control, and passivity-based PI control (PI-PBC) approaches. The dynamic characterization of PWM-CSC systems is examined through state variables, including inductor current and capacitor voltage, to model system behavior accurately. The Lyapunov stability theorem ensures the equilibrium points' global asymptotic stability, demonstrating that the proposed control techniques effectively drive system states to their desired values over time. The simulation results, conducted in a MATLAB environment, illustrate the performance of the proposed controllers. These results show that the controllers minimize total harmonic distortion (THD) while maintaining stable output current regulation. Furthermore, both IDA-PBC and PI-PBC techniques exhibit superior performance in terms of stability and signal fidelity when compared to nonlinear PI control. This study not only advances the theoretical understanding of PWM-CSC control but also offers practical insights for implementing these controllers in renewable energy systems and industrial applications. The paper concludes by proposing future research directions to further enhance control strategies for PWM-CSCs.
Angélica Mercedes Nivia-Vargas, Oscar Danilo Montoya Giraldo, Walter Julián Gil-González
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-12 | Vol. 14 No. 3, pp. 1337–1355 | DOI: 10.19139/soic-2310-5070-2410

On the Local Multiset Dimension of Comb Product Graphs
http://47.88.85.238/index.php/soic/article/view/2431
One of the topics of distance in graphs is the resolving set problem. This topic has many applications in science and technology, for example in robot navigation in networks, the analysis of chemical structures, and computer science. Suppose $W=\{s_1,s_2,\ldots,s_k\}\subset V(G)$; the vertex representation of $x\in V(G)$ is the multiset $r_m(x|W)=\{d(x,s_1),d(x,s_2),\ldots,d(x,s_k)\}$, where $d(x,s_i)$ is the length of a shortest path between the vertex $x$ and the vertex $s_i$ in $W$, together with their multiplicities. The set $W$ is called a local $m$-resolving set of a graph $G$ if $r_m(v|W)\neq r_m(u|W)$ for every $uv\in E(G)$. A local $m$-resolving set of minimum cardinality is called a local multiset basis, and its cardinality is called the local multiset dimension of $G$, denoted by $md_l(G)$. In this paper, we establish bounds on the local multiset dimension of graphs resulting from the comb product of two connected graphs.
Ridho Alfarisi, Liliek Susilowati, Arika Indah Kristiana
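A brute-force check of the definition on a small graph, using networkx: for a candidate set W, each vertex gets the multiset of its distances to W, and W is a local m-resolving set when adjacent vertices always receive different multisets. This is only a verification utility for small examples (it returns None when no such set exists, e.g. for complete graphs); the example graph is an assumption and the paper's structural results for comb products are not reproduced.

```python
import itertools
from collections import Counter

import networkx as nx

def multiset_repr(v, W, dist):
    """Multiset of distances from v to the vertices of W."""
    return Counter(dist[v][w] for w in W)

def is_local_m_resolving(G, W, dist):
    return all(multiset_repr(u, W, dist) != multiset_repr(v, W, dist)
               for u, v in G.edges())

def local_multiset_dimension(G):
    """Smallest cardinality of a local m-resolving set (brute force, small graphs only)."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    nodes = list(G.nodes())
    for k in range(1, len(nodes) + 1):
        for W in itertools.combinations(nodes, k):
            if is_local_m_resolving(G, W, dist):
                return k, set(W)
    return None

# Example: a comb-like graph built by attaching a pendant vertex to each vertex of a cycle
G = nx.cycle_graph(5)
for v in range(5):
    G.add_edge(v, f"leaf{v}")
print(local_multiset_dimension(G))
```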
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-11 | Vol. 14 No. 3, pp. 1356–1361 | DOI: 10.19139/soic-2310-5070-2431

Novel Deep Learning Model Optimized by Random Search or Grid Search Method for Soil Erosion Susceptibility Prediction
http://47.88.85.238/index.php/soic/article/view/2528
Soil erosion is the process by which soil particles are removed from the Earth's surface. There are three stages of soil erosion: displacement, migration, and deposition. The rate of soil erosion is influenced by various factors, including infiltration, soil type, soil structure, and land cover. Soil erosion causes soil deposition in some areas and soil loss in others, and its prediction is one of the primary issues in environmental modeling. Numerous forecasting methods have been put forth, drawing on deep learning, machine learning, and statistical analysis approaches. In this paper, we compare Random Search and Grid Search optimization combined with the CNN, RNN, LSTM and GRU algorithms (GS_CNN, GS_RNN, GS_LSTM, GS_GRU, RS_CNN, RS_RNN, RS_LSTM, RS_GRU) for soil erosion prediction. These models facilitate planning for soil preservation and land management by improving our understanding of, and capacity to forecast, the dynamics of soil erosion. The dataset used for this study contains 236 instances with 11 features. Six evaluation metrics were computed to assess the efficacy of the employed classification techniques: accuracy, precision, recall, F1 score, Matthews correlation coefficient (MCC), and Area Under the Receiver Operating Characteristic Curve (AUC). With an accuracy of 98.592%, the CNN, GS_CNN, GS_RNN, RS_CNN and RS_RNN models outperformed other machine learning methods and earlier research on the same dataset, according to the experimental results.
Alaa A. Almelibari, Yasser Fouad
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-06-26 | Vol. 14 No. 3, pp. 1362–1373 | DOI: 10.19139/soic-2310-5070-2528

Quantum-Resistant Privacy-Preserving IoT Authentication via Zero-Knowledge Proofs and Blockchain Integration
http://47.88.85.238/index.php/soic/article/view/2399
IoT device authentication faces critical challenges in ensuring quantum resistance and privacy preservation while maintaining practical performance characteristics. This paper presents a novel privacy-preserving authentication framework that integrates blockchain technology, zero-knowledge proofs (ZKPs), and homomorphic encryption for secure IoT device management. Our approach uniquely combines Security Module operations with blockchain-based verification to address the limitations of existing authentication methods through three key innovations: a lightweight post-quantum ZKP protocol, blockchain-based device verification with chameleon hash functions, and privacy-preserving homomorphic computation. In our experimental setup using an Intel Core i7 platform with simulated IoT sensor networks, the system achieves state-of-the-art performance with 350ms authentication times, a 5.7% improvement over current quantum-resistant solutions. The experimental results demonstrate robust scalability, supporting 100 concurrent simulated devices in a controlled test environment with 98% GUI responsiveness while maintaining privacy guarantees. The Security Module achieves 180ms homomorphic encryption times and 300ms/120ms for ZKP generation/verification, respectively. Through a novel blockchain integration framework, we further demonstrate gas efficiency with device registration averaging 145,000 gas units and 150ms network synchronization. The framework establishes practical quantum-resistant privacy-preserving authentication for IoT environments without compromising performance or scalability.
Mohammed Tawfik, Amr H. Abdelhaliem, Islam Fathi
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-15 | Vol. 14 No. 3, pp. 1374–1402 | DOI: 10.19139/soic-2310-5070-2399

Firefly Algorithm-Optimized SVR Framework for Accurate Stock Price Forecasting
http://47.88.85.238/index.php/soic/article/view/2387
Stock prices fluctuate and can change within a short time, so it is necessary to analyze and anticipate the associated risks by forecasting stock prices. One method that can be used to predict stock prices is Support Vector Regression (SVR), which has the advantages of not requiring particular distributional assumptions, being able to counter overfitting, training quickly, and handling time-series data such as stock prices. However, because its parameters are difficult to determine, SVR requires an optimization method to find optimal parameter values, namely the Firefly Algorithm (FA). The SVR-FA combination is also considered to produce smaller error values than combinations with other methods. The data used are the daily stock prices of PT Indofood Sukses Makmur Tbk. and USD-IDR exchange rate data from January 1st, 2012, to January 31st, 2022. This study aims to assess forecasting accuracy for this stock price through the best combination of parameter values. The best accuracy is obtained by combining 100 SVR iterations, 10 FA iterations, and 40 fireflies, yielding a testing MAPE of 0.6796% (below 1%), which indicates good forecasting performance.
Agustina Pradjaningsih, Lisa Hani Rahayu Romadhoni
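A minimal sketch of the SVR-FA coupling: each firefly encodes (C, gamma, epsilon), brightness is the cross-validated MAPE, and fireflies move toward brighter ones with the usual distance-decayed attractiveness. The synthetic series, search bounds, population size, and iteration counts are placeholders, not the settings reported above.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(3)

# Placeholder series standing in for the daily closing prices
t = np.arange(600, dtype=float)
price = 7000 + 50 * np.sin(0.05 * t) + np.cumsum(rng.normal(0, 5, t.size))
X, y = t.reshape(-1, 1), price

def decode(p):
    """Map a firefly position (log10 C, log10 gamma, epsilon) to SVR hyperparameters."""
    return 10 ** p[0], 10 ** p[1], p[2]

def mape_of(p):
    C, gamma, eps = decode(p)
    model = SVR(C=C, gamma=gamma, epsilon=eps)
    scores = cross_val_score(model, X, y, cv=3,
                             scoring="neg_mean_absolute_percentage_error")
    return -scores.mean()                          # MAPE (smaller is better)

bounds = np.array([[0.0, 3.0], [-4.0, 0.0], [0.01, 2.0]])
n_fireflies, n_iter, beta0, gamma_fa, alpha = 6, 5, 1.0, 1.0, 0.2

pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_fireflies, 3))
light = np.array([mape_of(p) for p in pop])

for _ in range(n_iter):
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if light[j] < light[i]:                # j is brighter (lower MAPE): i moves toward j
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma_fa * r2)
                pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(3) - 0.5)
                pop[i] = np.clip(pop[i], bounds[:, 0], bounds[:, 1])
                light[i] = mape_of(pop[i])

best = pop[np.argmin(light)]
print("best (C, gamma, epsilon):", np.round(decode(best), 4), "CV MAPE:", round(light.min(), 4))
```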
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-17 | Vol. 14 No. 3, pp. 1403–1418 | DOI: 10.19139/soic-2310-5070-2387

A Novel Hypothesis Testing for EBUC(mgf) Utilizing Laplace Transform Techniques with Practical Applications
http://47.88.85.238/index.php/soic/article/view/2520
This study focuses on a new class of life distributions known as the 'Exponential Better than Used in Convex in Moment Generating Function' class, denoted by EBUC(mgf). An analysis is conducted based on the Laplace transform order for hypothesis testing. The study involves calculating Pitman's asymptotic efficiencies for this method and comparing them with other approaches. Moreover, a detailed table of percentiles is presented for the statistical measure linked to the proposed technique. Power calculations are performed to assess the effectiveness of the testing methodologies. Additionally, an evaluation is carried out for a test that discriminates exponentiality within right-censored data. The power calculations for these tests are derived using simulations with distributions commonly utilized in reliability studies. Finally, actual datasets are employed to illustrate the use of the suggested test statistic in addressing practical challenges associated with complete and incomplete data in reliability analysis.
H. S. Elgehady, S. M. El-Arishy, E. S. El-Atfy
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-08-24 | Vol. 14 No. 3, pp. 1419–1439 | DOI: 10.19139/soic-2310-5070-2520

Advanced Strategies for Predicting and Managing Auto Insurance Claims using Machine Learning Models
http://47.88.85.238/index.php/soic/article/view/2655
The high and rising severity of automobile claims necessitates developing novel approaches for handling claims effectively, and Machine Learning (ML) represents an essential solution to this issue. As improving customer service remains the primary goal of auto insurers, these companies have naturally begun to adopt ML to comprehend and evaluate their datasets more efficiently. This paper contributes to the pricing of car insurance; in particular, it focuses on modeling the total claims amount with ML models such as Support Vector Regression (SVR), Extreme Gradient Boosting (XGBoost) and the Multi-Layer Perceptron (MLP). A comparative analysis is carried out using statistical metrics (e.g., Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), ...) as well as stochastic ones (e.g., Difference Score, Weight Difference Score, ...) on both the training and test datasets. The results show that the SVR algorithm, tuned by Randomized Search CV, achieves excellent precision and surpasses the other models tested, as seen in the Taylor diagram, although it shows a less efficient visual distribution of predictions than the XGBoost and MLP algorithms. The ultimate value of this study resides in the in-depth analysis of the dataset, which can offer insurers adequate understanding to manage these losses effectively.
Chadia Bekkaye, Hassan Oukhouya, Tarek Zari, Raby Guerbaz, Hicham El Bouanani
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-08-24 | Vol. 14 No. 3, pp. 1440–1457 | DOI: 10.19139/soic-2310-5070-2655

A new mixed-line programming approach to the problem of multimodal urban transit
http://47.88.85.238/index.php/soic/article/view/2158
Multimodal urban transportation offers an efficient and reliable solution for urban mobility. This paper proposes a novel mathematical formulation, based on Mixed Integer Linear Programming (MILP), specifically addressing the optimization of formal public transportation modes within urban settings. Unlike existing models, our approach focuses exclusively on the formal transport sector while incorporating relevant operational constraints. The study begins with a concise review of the literature on optimization and organizational challenges in multimodal urban transport. Computational experiments are performed using an optimization solver to evaluate the performance and effectiveness of the proposed model.
EDI Yapi Fiacre Aristide, Koné Oumar, EDI Kouassi Hilaire
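To give a flavour of the MILP machinery involved (this is not the paper's formulation), the toy model below routes one traveller across a small multimodal network: binary variables select arcs, flow conservation links origin to destination, and the objective mixes travel time with a penalty on non-walking arcs as a crude proxy for transfers. The network, modes, and penalty are hypothetical.

```python
import pulp

# Toy multimodal network: arcs (tail, head, mode, minutes)
arcs = [
    ("home", "stopA", "walk", 8), ("stopA", "stopB", "bus", 15),
    ("stopA", "stopC", "tram", 12), ("stopB", "work", "walk", 5),
    ("stopC", "stopB", "bus", 4), ("stopC", "work", "walk", 10),
]
nodes = {n for a in arcs for n in a[:2]}
origin, dest, transfer_penalty = "home", "work", 3

prob = pulp.LpProblem("multimodal_path", pulp.LpMinimize)
x = {a: pulp.LpVariable(f"x_{i}", cat="Binary") for i, a in enumerate(arcs)}

# Objective: travel time plus a penalty for every non-walk arc used
prob += pulp.lpSum(a[3] * x[a] for a in arcs) + \
        transfer_penalty * pulp.lpSum(x[a] for a in arcs if a[2] != "walk")

# Flow conservation: leave the origin once, enter the destination once, balance elsewhere
for n in nodes:
    out_flow = pulp.lpSum(x[a] for a in arcs if a[0] == n)
    in_flow = pulp.lpSum(x[a] for a in arcs if a[1] == n)
    rhs = 1 if n == origin else (-1 if n == dest else 0)
    prob += out_flow - in_flow == rhs

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("status:", pulp.LpStatus[prob.status])
print("chosen arcs:", [a for a in arcs if x[a].value() == 1])
```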
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-04-22 | Vol. 14 No. 3, pp. 1458–1472 | DOI: 10.19139/soic-2310-5070-2158

Mathematical programming with Semilocally Subconvex functions over cones
http://47.88.85.238/index.php/soic/article/view/2502
In this paper, we introduce another generalization of semilocally convex functions over cones, called the cone-semilocally subconvex function (C-slsb), and compare it with other generalizations of convex functions through examples. Further, using its properties, we establish a theorem of the alternative for these functions. We then investigate the optimal solutions of a mathematical programming problem (MP) over cones using these functions, directional derivatives, and the alternative theorem. The investigation of optimal solutions of (MP) is carried out by deriving optimality and duality results for semilocally subconvex mathematical programming problems over cones.
Vani Sharma, Mamta Chaudhary, Meetu Bhatia Grover
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-08-13 | Vol. 14 No. 3, pp. 1473–1480 | DOI: 10.19139/soic-2310-5070-2502

An Optimized Hybrid Approach for Reducing Computational Overheads and Evaluating Audio Signal Characteristics in Wireless Acoustic Sensor Networks
http://47.88.85.238/index.php/soic/article/view/2475
This paper presents a hybrid system designed to analyze multiple properties of audio signals while minimizing quality losses during transmission over Wireless Acoustic Sensor Networks (WASNs). The proposed system operates in two phases. In the first phase, audio signal quality is evaluated using key parameters such as the packet loss ratio, signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR) and signal fidelity. The experimental results of the proposed method reveal that an increased packet loss ratio, reduced PSNR and lower signal fidelity degrade audio quality. An acceptable threshold is established to maintain quality, though network traffic exceeding this threshold negatively impacts performance. To address this, the system incorporates controls for packet loss, SNR, PSNR and fidelity, ensuring the transmitted audio maintains parity with the source. In the second phase, a WASN framework is introduced for distributed and efficient analysis of audio properties. The framework employs feature extraction techniques, including the Mel Frequency Cepstral Coefficient (MFCC) and Power Normalized Cepstral Coefficient (PNCC), alongside other existing methods, to extract comprehensive features from audio signals. By combining quality assessment and distributed analysis, this hybrid system provides a robust solution for enhancing audio signal processing within dynamic and resource-constrained network environments.
Utpal Ghosh, Uttam Kr. Mondal, Abdelmoty M. Ahmed, Ahmed A. Elngar
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-08-24 | Vol. 14 No. 3, pp. 1481–1511 | DOI: 10.19139/soic-2310-5070-2475

A Novel Steganography Algorithm based on Interval Type-1 Fuzzy Logic System
http://47.88.85.238/index.php/soic/article/view/2650
In this paper, a novel image steganography algorithm, mshEdgeGrayFT1, is proposed, which combines a Mamdani fuzzy inference system (FIS) with the least significant bit (LSB) technique for grayscale images. The proposed algorithm begins by removing noise from the image using a Gaussian filter and then masking the image. The differences between the pixel intensities of two consecutive columns (diffColumn_{i,j}) and rows (diffRow_{i,j}) of the resulting image are taken as input variables. The Mamdani FIS determines the edge pixels where the confidential data can be hidden, categorizing pixels as black, gray, or white. The LSB technique is then used to hide the confidential data in the identified edge pixels. Based on various evaluation metrics, the experimental results demonstrate that the proposed algorithm mshEdgeGrayFT1 outperforms other methods.
Vinita Yadav, Meenakshi Hooda, Navita Dhaka
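The LSB half of the pipeline is easy to show on its own: the sketch below embeds a UTF-8 message into the least significant bits of selected pixels of a grayscale array and recovers it. Here the embedding positions are simply the first pixels in row-major order and the cover image is random; in mshEdgeGrayFT1 the positions come from the fuzzy edge detector, which is not reproduced.

```python
import numpy as np

def embed_lsb(img, message, positions):
    """Hide the message bits in the least significant bit of the given pixel positions."""
    bits = np.unpackbits(np.frombuffer(message.encode("utf-8"), dtype=np.uint8))
    stego = img.copy()
    for bit, (r, c) in zip(bits, positions):
        stego[r, c] = (stego[r, c] & 0xFE) | bit        # clear the LSB, then set it to the bit
    return stego, len(bits)

def extract_lsb(stego, positions, n_bits):
    bits = np.array([stego[r, c] & 1 for _, (r, c) in zip(range(n_bits), positions)],
                    dtype=np.uint8)
    return np.packbits(bits).tobytes().decode("utf-8")

rng = np.random.default_rng(5)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)    # stand-in grayscale cover image
positions = [(r, c) for r in range(cover.shape[0]) for c in range(cover.shape[1])]

stego, n_bits = embed_lsb(cover, "hidden text", positions)
print(extract_lsb(stego, positions, n_bits))
print("max pixel change:", int(np.max(np.abs(stego.astype(int) - cover.astype(int)))))
```

Because only the last bit of each selected pixel can change, the maximum per-pixel distortion is 1, which is why LSB embedding in edge regions keeps PSNR high.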
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-09-01 | Vol. 14 No. 3, pp. 1512–1524 | DOI: 10.19139/soic-2310-5070-2650

An efficient technique for solving the two-dimensional heat equation
http://47.88.85.238/index.php/soic/article/view/2396
In this paper, we apply the semi-analytic iterative method to the solution of the two-dimensional heat equation. By means of some numerical examples, we show that this method is efficient and accurate, producing exact or near-exact solutions. Even where the exact solution was unknown, we were able to obtain it through the fast convergence of the method.
Christian Kasumo, Edwin Moyo
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-16 | Vol. 14 No. 3, pp. 1525–1541 | DOI: 10.19139/soic-2310-5070-2396

Using reduced feature space (2D and 3D) based on entropic measures for detecting Parkinson's disease through voice
http://47.88.85.238/index.php/soic/article/view/2638
This research presents a proposal for the integration of different fields, including Parkinson's disease (PD) detection, acoustic voice analysis, and signal processing. The proposal entails the development of two- and three-dimensional parsimonious models, predicated on feature spaces constituted by variants of Shannon's permutation entropy and autocorrelation measures. These models elucidate the structural and informational nature of vocal signals in individuals with and without the disease (NPD). The reduced-dimensional feature spaces (2D and 3D) are novel and were used for the automatic classification of voices using support vector machines (SVM) with polynomial kernels and cross-validation, achieving average accuracy values between 0.82 and 0.88. Furthermore, the identification of homogeneous subgroups according to the coordinates in the 2D feature space represents significant progress; the variables under consideration are candidates for biomarkers of subtypes of speech disorders in Parkinson's disease. The database used is freely accessible to facilitate reproducibility. The proposed approach is simple and precise and shows promise for the diagnosis and monitoring of PD through the effective use of one-second samples of the vowel /a/ with a reduced feature space that could improve clinical workflows.
Monica Giuliano, Luis Alberto Fernández, Walter Edgardo Legnani
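For readers unfamiliar with the entropic features, a minimal Bandt-Pompe permutation entropy routine is sketched below; the embedding dimension, delay, and test signals are illustrative, and the paper's specific entropy variants and autocorrelation measures are not reproduced.

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, m=4, delay=1, normalize=True):
    """Shannon permutation entropy of a 1-D signal (Bandt-Pompe ordinal patterns)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * delay
    patterns = Counter(
        tuple(np.argsort(x[i:i + m * delay:delay])) for i in range(n)
    )
    probs = np.array(list(patterns.values()), dtype=float) / n
    h = -np.sum(probs * np.log(probs))
    return h / math.log(math.factorial(m)) if normalize else h

rng = np.random.default_rng(11)
noise = rng.standard_normal(4000)                       # white noise: entropy near 1
tone = np.sin(2 * np.pi * 0.01 * np.arange(4000))       # regular oscillation: lower entropy
print("noise:", round(permutation_entropy(noise), 3),
      "| tone:", round(permutation_entropy(tone), 3))
```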
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-08-19 | Vol. 14 No. 3, pp. 1542–1565 | DOI: 10.19139/soic-2310-5070-2638

Penalized estimators for modified Log-Bilal regression: simulations and applications
http://47.88.85.238/index.php/soic/article/view/2513
The Log-Bilal regression is a survival regression model that accounts for unique features of lifetime data. In this study, we modify the Log-Bilal distribution to enhance its flexibility, resulting in a model that exhibits an increasing, non-constant failure rate over time. To address multicollinearity in the modified Log-Bilal regression, we introduce two penalized estimators: the Ridge modified Log-Bilal (RidgeMBE) and Liu-type modified Log-Bilal (LiuMBE) estimators. The properties of the suggested estimators are discussed and their superiority is examined; the Liu-type estimator demonstrates superiority over the other estimators. A simulation study conducted across various factors reveals that the Liu-type estimator outperforms the others in many cases. The proposed estimators are applied to real lifetime data on mechanical pumps, and the results confirm the findings of the simulation study.
Tarek Omara
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-09-01 | Vol. 14 No. 3, pp. 1566–1583 | DOI: 10.19139/soic-2310-5070-2513

High Order Statistics From Lambert-Topp-Leone Distribution: Statistical Properties and Applications
http://47.88.85.238/index.php/soic/article/view/2914
This paper investigates the statistical properties of high order statistics derived from the Lambert–Topp–Leone (LTL) distribution. The authors establish recurrence relations for both single and product moments of order statistics and explore their application in characterizing the LTL distribution. Comparative analysis is conducted with related distributions, namely the Power Inverted Topp–Leone (PITL) and Topp–Leone Lomax (TLL) distributions. Through simulations, the study examines the impact of parameter variations on the mean and variance of order statistics across different scenarios, highlighting the stability and flexibility of the LTL model. Theoretical results are validated with a real dataset on stress measurements in concrete, where the LTL distribution demonstrates superior goodness-of-fit compared to competing models. The findings underscore the robustness and adaptability of the LTL distribution for modeling lifetime and reliability data, and suggest directions for future research in statistical modeling and inference.
Ahmed Salih, Wafaa Husien, Murtadha Abdulah
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-09-01 | Vol. 14 No. 3, pp. 1584–1597 | DOI: 10.19139/soic-2310-5070-2914

On the inference of entropy measures under different sampling schemes
http://47.88.85.238/index.php/soic/article/view/2235
Entropy measures are fundamental for quantifying the uncertainty of random variables. In this study, we examine the maximum likelihood estimators (MLE) of five well-known entropy measures: Shannon, Rényi, Havrda, Arimoto, and Tsallis, under both Simple Random Sampling (SRS) and Ranked Set Sampling (RSS). We derived the asymptotic bias and variance of these entropy estimators and conducted extensive simulations to assess the performance of SRS and RSS in estimating these entropy measures. The effectiveness of our estimators was demonstrated using breast cancer data.
Hani Samawi, Amal Helu
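A small sketch of the two sampling designs feeding a plug-in entropy estimate: under SRS the units are drawn directly, while under (perfect-ranking) RSS each cycle draws k sets of k units, ranks each set, and measures the i-th ranked unit of the i-th set. Shannon entropy is then estimated by plugging the sample estimate of sigma into the normal-entropy formula H = 0.5 ln(2*pi*e*sigma^2); the normal parent distribution and the simple plug-in estimator are illustrative assumptions, not the MLEs derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(2024)
mu, sigma = 0.0, 2.0
true_H = 0.5 * np.log(2 * np.pi * np.e * sigma**2)      # Shannon entropy of N(mu, sigma^2)

def srs_sample(n):
    return rng.normal(mu, sigma, size=n)

def rss_sample(k, cycles):
    """Ranked set sample with set size k: measure the i-th order statistic of the i-th set."""
    out = []
    for _ in range(cycles):
        for i in range(k):
            set_i = rng.normal(mu, sigma, size=k)
            out.append(np.sort(set_i)[i])               # perfect ranking assumed
    return np.array(out)

def plug_in_shannon(sample):
    sigma_hat = sample.std()                            # ML-type estimate (denominator n)
    return 0.5 * np.log(2 * np.pi * np.e * sigma_hat**2)

k, cycles, reps = 5, 8, 2000                            # both designs yield n = 40 measurements
est_srs = np.array([plug_in_shannon(srs_sample(k * cycles)) for _ in range(reps)])
est_rss = np.array([plug_in_shannon(rss_sample(k, cycles)) for _ in range(reps)])
print(f"true H = {true_H:.3f}")
print(f"SRS: mean = {est_srs.mean():.3f}, MSE = {np.mean((est_srs - true_H)**2):.4f}")
print(f"RSS: mean = {est_rss.mean():.3f}, MSE = {np.mean((est_rss - true_H)**2):.4f}")
```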
Copyright (c) 2025 Statistics, Optimization & Information Computing
2025-07-01 | Vol. 14 No. 3, pp. 1598–1610 | DOI: 10.19139/soic-2310-5070-2235