New parameter of conjugate gradient method for unconstrained nonlinear optimization
Keywords:
Unconstrained optimization, Conjugate gradient method, Descent direction, Inexact line search, Global convergence
Abstract
We are interested in the performance of nonlinear conjugate gradient methods for unconstrained optimization. In particular, we address the conjugate gradient algorithm with strong Wolfe inexact line search. Firstly, we study the descent property of the search direction of the considered conjugate gradient algorithm, based on a new direction obtained from a new parameter. The main objective of this parameter is to improve the speed of convergence of the obtained algorithm. Then, we present a complete study that shows the global convergence of this algorithm. Finally, we establish comparative numerical experiments on well-known test examples to show the efficiency and robustness of our algorithm compared to other recent algorithms.
References
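The generic nonlinear conjugate gradient scheme the abstract builds on can be sketched as follows. This is an illustrative Fletcher–Reeves variant with a simple bisection search for a step satisfying the strong Wolfe conditions; it is not the paper's new parameter, and all function names here are our own. Only the formula for the coefficient beta would change for a different conjugacy parameter.

```python
import numpy as np

def strong_wolfe_step(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=50):
    # Expansion/bisection search for a step length satisfying the
    # strong Wolfe conditions (sufficient decrease + curvature).
    f0 = f(x)
    g0 = grad(x) @ d                        # directional derivative, < 0 for a descent d
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0:
            hi = alpha                      # sufficient decrease fails: shrink
        else:
            slope = grad(x + alpha * d) @ d
            if abs(slope) <= -c2 * g0:      # strong Wolfe curvature condition
                return alpha
            if slope < 0:
                lo = alpha                  # still descending: grow the step
            else:
                hi = alpha
        alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
    return alpha

def cg_fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=200):
    # Generic nonlinear CG loop with strong Wolfe inexact line search.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = strong_wolfe_step(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Convex quadratic test: the minimizer solves A x = b, here x* = (0.2, 0.4).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_fletcher_reeves(f, grad, [0.0, 0.0])
```

A hybrid or otherwise modified method, such as the one the abstract announces, plugs a different expression for beta into the same loop while keeping the line-search and descent machinery intact.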
1. N. Andrei, Nonlinear conjugate gradient methods for unconstrained optimization, Springer Optimization and its Applications, vol.
158, 2020.
2. N. Andrei, An unconstrained optimization test functions collection, Adv. Model. Optim., vol. 10, pp. 147–161, 2008.
3. S. Ben Hanachi, B. Sellami and M. Belloufi, New iterative conjugate gradient method for nonlinear unconstrained optimization.
RAIRO-Oper. Res., vol. 56, pp. 2315–2327, 2022.
4. Y.H. Dai, Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim., vol. 10, no. 1, pp. 177–182, 1999.
5. Y.H. Dai, Y. Yuan, An efficient hybrid conjugate gradient method for unconstrained optimization, Ann. Oper. Res., vol. 103, pp.
33–47, 2001.
6. S. Delladji, M. Belloufi, B. Sellami, New hybrid conjugate gradient method as a convex combination of FR and BA methods, Journal of Information and Optimization Sciences, vol. 42, no. 3, pp. 591–602, 2021.
7. H. Fan, Z. Zhu, A. Zhou, A new descent nonlinear conjugate gradient method for unconstrained optimization, Applied Mathematics, vol. 2, no. 9, pp. 1119–1123, 2011.
8. R. Fletcher, Practical methods of optimization. Unconstrained Optimization, vol. 1, Wiley, New York, 1987.
9. R. Fletcher, C. M. Reeves, Function minimization by conjugate gradients, Comput. J., vol. 7, no. 2, pp. 149–154, 1964.
10. W.W. Hager, H. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, vol. 2, pp. 35–58, 2006.
11. M. R. Hestenes, E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Natl. Bur. Stand., vol. 49, no. 6, pp. 409–436, 1952.
12. Y. Liu, C. Storey, Efficient generalized conjugate gradient algorithms, part 1: theory, J. Optim. Theory Appl., vol. 69, no. 1, pp. 129–137, 1991.
13. P. Mtagulwa, P. Kaelo, A convergent modified HS-DY hybrid conjugate gradient method for unconstrained optimization problems, Journal of Information and Optimization Sciences, vol. 40, no. 1, pp. 97–113, 2019.
14. H. Y. Najm, E. T. Hamed and H. I. Ahmed, Global convergence of conjugate gradient method in unconstrained optimization problems, Bol. Soc. Paran. Mat., vol. 38, pp. 227–231, 2020.
15. E. Polak, G. Ribière, Note sur la convergence des méthodes de directions conjuguées, Rev. Française Informatique Recherche Opérationnelle, vol. 16, pp. 35–43, 1969.
16. B. T. Polyak, The conjugate gradient method in extreme problems, USSR Comput. Math. Math. Phys., vol. 9, pp. 94–112, 1969.
17. M. Rivaie, M. Mustafa, W. J. Leong, M. Ismail, A new class of nonlinear conjugate gradient coefficients with global convergence
properties, Applied Mathematics and Computation, vol. 218, no. 22, pp. 11323–11332, 2012.
18. B. Sellami, Y. Chaib, A new family of globally convergent conjugate gradient methods, Ann. Oper. Res. Springer., vol. 241, pp.
497–513, 2016.
19. B. Sellami, Y. Chaib, New conjugate gradient method for unconstrained optimization, RAIRO Operations Research, vol. 50, pp.
1013–1026, 2016.
20. G. Zoutendijk, Nonlinear programming, computational methods. In: Abadie, J. (ed.), Integer and Nonlinear Programming, 1970.
Published
2025-02-24
How to Cite
Ouaoua, M. L., Khelladi, S., & Benterki, D. (2025). New parameter of conjugate gradient method for unconstrained nonlinear optimization. Statistics, Optimization & Information Computing. https://doi.org/10.19139/soic-2310-5070-2069
Section: Research Articles