Abstract
In this paper, we address some of the computational challenges associated with the RMIL+ conjugate gradient parameter by proposing an efficient conjugate gradient (CG) parameter, along with its generalization to Riemannian manifolds. The parameter, formed by combining the structures of two classical CG methods, preserves the good convergence properties of the CG method in Riemannian optimization. The extension uses the concepts of retraction and vector transport to establish the sufficient descent property of the method under the strong Wolfe line search conditions. Additionally, the scheme achieves global convergence under a scaled version of the Ring-Wirth nonexpansive condition. Finally, numerical experiments on both unconstrained Euclidean optimization test problems and Riemannian optimization problems validate the scheme's effectiveness. The results reveal that the performance of the proposed method is significantly influenced by the choice of line search in both Euclidean and Riemannian optimization.
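To make the setting concrete, the following is a minimal, self-contained sketch of a generic Riemannian CG iteration of the kind the abstract describes, minimizing the Rayleigh quotient on the unit sphere. It is not the authors' method: the classical Fletcher-Reeves parameter stands in for the proposed β, projection onto the tangent space serves as both the retraction's companion and the vector transport, and an Armijo backtracking rule replaces the strong Wolfe line search analyzed in the paper. All function names (`project`, `retract`, `riemannian_cg`) are illustrative, not from the source.

```python
import numpy as np

# Illustrative sketch only (not the authors' method): Riemannian CG on the
# unit sphere S^{n-1} minimizing the Rayleigh quotient f(x) = x^T A x, whose
# minimizer is an eigenvector for the smallest eigenvalue of A.

def project(x, v):
    # Orthogonal projection of v onto the tangent space at x; reused below
    # as a vector transport by projection.
    return v - (x @ v) * x

def retract(x, v):
    # Retraction: move along v in the ambient space, renormalize to the sphere.
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_cg(A, x, max_iter=500, tol=1e-8):
    grad = project(x, 2.0 * A @ x)  # Riemannian gradient of f(x) = x^T A x
    d = -grad                       # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(grad) < tol:
            break
        if grad @ d >= 0:           # safeguard: restart if d is not descent
            d = -grad
        # Armijo backtracking line search (the paper uses strong Wolfe).
        t, f = 1.0, x @ A @ x
        while retract(x, t * d) @ A @ retract(x, t * d) > f + 1e-4 * t * (grad @ d):
            t *= 0.5
        x_new = retract(x, t * d)
        grad_new = project(x_new, 2.0 * A @ x_new)
        # Fletcher-Reeves beta as a stand-in for the proposed parameter.
        beta = (grad_new @ grad_new) / (grad @ grad)
        # Transport the previous direction to the new tangent space, combine.
        d = -grad_new + beta * project(x_new, d)
        x, grad = x_new, grad_new
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = (M + M.T) / 2.0                 # random symmetric test matrix
x0 = rng.standard_normal(6)
x0 /= np.linalg.norm(x0)
x_star = riemannian_cg(A, x0)
print("CG estimate of smallest eigenvalue:", x_star @ A @ x_star)
print("numpy reference:", np.linalg.eigvalsh(A)[0])
```

Swapping in the proposed β and a strong Wolfe step size would recover the structure of the scheme the paper analyzes; even with these simplified stand-ins, the sketch recovers the smallest eigenvalue of A to high accuracy on this small problem.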
Data availability
All the relevant data are available within the manuscript.
References
Absil P-A, Mahony R, Sepulchre R (2008) Optimization algorithms on matrix manifolds. Princeton University Press, Princeton, NJ, p 224. https://doi.org/10.1515/9781400830244 (With a foreword by Paul Van Dooren)
Aminifard Z, Babaie-Kafaki S (2022) Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing. Numer Algorithms 89(3):1369–1387. https://doi.org/10.1007/s11075-021-01157-y
Andrei N (2018) A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues. Numer Algorithms 77(4):1273–1282. https://doi.org/10.1007/s11075-017-0362-5
Andrei N (2020) Nonlinear conjugate gradient methods for unconstrained optimization, vol 158. Springer optimization and its applications. Springer, Berlin, p 498. https://doi.org/10.1007/978-3-030-42950-8
Awwal AM, Sulaiman IM, Malik M, Mamat M, Kumam P, Sitthithakerngkiet K (2021) A spectral RMIL+ conjugate gradient method for unconstrained optimization with applications in portfolio selection and motion control. IEEE Access 9:75398–75414
Awwal AM, Wang L, Kumam P, Sulaiman IM, Salisu S, Salihu N, Yodjai P (2023) Generalized RMIL conjugate gradient method under the strong Wolfe line search with application in image processing. Math Methods Appl Sci 46:17544–17556
Boumal N (2023) An introduction to optimization on smooth manifolds. Cambridge University Press, Cambridge, p 338
Dai Z (2016) Comments on a new class of nonlinear conjugate gradient coefficients with global convergence properties. Appl Math Comput 276:297–300. https://doi.org/10.1016/j.amc.2015.11.085
Dai Z, Wen F (2012) Another improved Wei–Yao–Liu nonlinear conjugate gradient method with sufficient descent property. Appl Math Comput 218(14):7421–7430. https://doi.org/10.1016/j.amc.2011.12.091
Dai Y-H, Yuan Y (1999) A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim 10(1):177–182. https://doi.org/10.1137/S1052623497318992
Dai Y-H, Yuan Y (2000) Nonlinear conjugate gradient methods. Shanghai Science and Technology Publisher, Shanghai
Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2):201–213. https://doi.org/10.1007/s101070100263
Fletcher R (1981) Practical methods of optimization, vol 2: Constrained optimization. A Wiley-Interscience Publication. Wiley, Chichester, p 224
Fletcher R (1987) Practical methods of optimization. John Wiley & Sons, Ltd., Chichester, p 436. https://doi.org/10.1002/9781118723203
Fletcher R, Reeves CM (1964) Function minimization by conjugate gradients. Comput J 7:149–154. https://doi.org/10.1093/comjnl/7.2.149
Hager WW, Zhang H (2006) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans Math Softw 32(1):113–137. https://doi.org/10.1145/1132973.1132979
Hestenes MR, Stiefel E (1952) Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand 49:409–436
Hu J, Liu X, Wen Z-W, Yuan Y-X (2020) A brief introduction to manifold optimization. J Oper Res Soc China 8(2):199–248. https://doi.org/10.1007/s40305-020-00295-9
Jian J, Han L, Jiang X (2015) A hybrid conjugate gradient method with descent property for unconstrained optimization. Appl Math Model 39(3–4):1281–1290. https://doi.org/10.1016/j.apm.2014.08.008
Liu Y, Storey C (1991) Efficient generalized conjugate gradient algorithms. I. Theory. J Optim Theory Appl 69(1):129–137. https://doi.org/10.1007/BF00940464
Jamil M, Yang X-S (2013) A literature survey of benchmark functions for global optimization problems. Int J Math Model Numer Optim 4(2):150–194
Motzkin TS, Straus EG (1965) Maxima for graphs and a new proof of a theorem of Turán. Can J Math 17:533–540. https://doi.org/10.4153/CJM-1965-053-6
Salihu N, Odekunle MR, Waziri MY, Halilu AS, Salihu S (2021) A Dai–Liao hybrid conjugate gradient method for unconstrained optimization. Int J Ind Optim 2(2):69–84. https://doi.org/10.12928/ijio.v2i2.4100
Polak E, Ribière G (1969) Note sur la convergence de méthodes de directions conjuguées. Rev Fr Inform Rech Opér 3(R1):35–43
Poljak BT (1967) A general method for solving extremal problems. Dokl Akad Nauk SSSR 174:33–36
Ring W, Wirth B (2012) Optimization methods on Riemannian manifolds and their application to shape space. SIAM J Optim 22(2):596–627. https://doi.org/10.1137/11082885X
Rivaie M, Mamat M, June LW, Mohd I (2012) A new class of nonlinear conjugate gradient coefficients with global convergence properties. Appl Math Comput 218(22):11323–11332. https://doi.org/10.1016/j.amc.2012.05.030
Sakai H, Iiduka H (2020) Hybrid Riemannian conjugate gradient methods with global convergence properties. Comput Optim Appl 77(3):811–830. https://doi.org/10.1007/s10589-020-00224-9
Sakai H, Iiduka H (2021) Sufficient descent Riemannian conjugate gradient methods. J Optim Theory Appl 190(1):130–150. https://doi.org/10.1007/s10957-021-01874-3
Sakai H, Sato H, Iiduka H (2023) Global convergence of Hager–Zhang type Riemannian conjugate gradient method. Appl Math Comput 441:127685. https://doi.org/10.1016/j.amc.2022.127685
Salihu N, Odekunle M, Waziri M, Halilu A (2020) A new hybrid conjugate gradient method based on secant equation for solving large scale unconstrained optimization problems. Iran J Optim 12(1):33–44
Salihu N, Odekunle MR, Saleh AM, Salihu S (2021) A Dai–Liao hybrid Hestenes–Stiefel and Fletcher–Reeves methods for unconstrained optimization. Int J Ind Optim 2(1):33–50
Salihu N, Kumam P, Awwal AM, Sulaiman IM, Seangwattana T (2023a) The global convergence of spectral RMIL conjugate gradient method for unconstrained optimization with applications to robotic model and image recovery. PLoS ONE 18(3):e0281250
Salihu N, Kumam P, Sulaiman IM, Seangwattana T (2023b) An efficient spectral minimization of the Dai–Yuan method with application to image reconstruction. AIMS Math 8(12):30940–30962
Salihu N, Kumam P, Awwal AM, Arzuka I, Seangwattana T (2023c) A structured Fletcher–Reeves spectral conjugate gradient method for unconstrained optimization with application in robotic model. Oper Res Forum 4(4):81. https://doi.org/10.1007/s43069-023-00265-w
Sato H (2016) A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Comput Optim Appl 64(1):101–118. https://doi.org/10.1007/s10589-015-9801-1
Sato H (2021) Riemannian optimization and its applications. SpringerBriefs in Control, Automation and Robotics. Springer, Cham, p 129. https://doi.org/10.1007/978-3-030-62391-3
Sato H (2022) Riemannian conjugate gradient methods: general framework and specific algorithms with convergence analyses. SIAM J Optim 32(4):2690–2717. https://doi.org/10.1137/21M1464178
Sato H, Iwai T (2015) A new, globally convergent Riemannian conjugate gradient method. Optimization 64(4):1011–1031. https://doi.org/10.1080/02331934.2013.836650
Tang C, Tan W, Xing S, Zheng H (2023) A class of spectral conjugate gradient methods for Riemannian optimization. Numer Algorithms 94(1):131–147. https://doi.org/10.1007/s11075-022-01495-5
Townsend J, Koep N, Weichwald S (2016) Pymanopt: a Python toolbox for optimization on manifolds using automatic differentiation. J Mach Learn Res 17(137):1–5
Wolfe P (1969) Convergence conditions for ascent methods. SIAM Rev 11:226–235. https://doi.org/10.1137/1011036
Wolfe P (1971) Convergence conditions for ascent methods. II. Some corrections. SIAM Rev 13:185–188. https://doi.org/10.1137/1013035
Yousif OOO (2020) The convergence properties of RMIL+ conjugate gradient method under the strong Wolfe line search. Appl Math Comput 367:124777. https://doi.org/10.1016/j.amc.2019.124777
Yuan H, Gu X, Lai R, Wen Z (2019) Global optimization with orthogonality constraints via stochastic diffusion on manifold. J Sci Comput 80(2):1139–1170. https://doi.org/10.1007/s10915-019-00971-w
Acknowledgements
The authors extend their gratitude to the principal editor and the anonymous reviewers for their valuable comments and suggestions, which have significantly enhanced the quality of the paper. The authors acknowledge the support provided by the Center of Excellence in Theoretical and Computational Science (TaCS-CoE), KMUTT, and the Petchra Pra Jom Klao PhD Scholarship of King Mongkut's University of Technology Thonburi (KMUTT) under Contract No. 52/2564. Moreover, this research project is supported by King Mongkut's University of Technology Thonburi (KMUTT), Thailand Science Research and Innovation (TSRI), and the National Science, Research and Innovation Fund (NSRF), Fiscal Year 2024, Grant No. FRB670073/0164.
Funding
The authors acknowledge the financial support provided by the Mid-Career Research Grant, Contract No. N41A640089.
Author information
Contributions
All the authors contributed equally.
Ethics declarations
Conflict of interest
The authors declare no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Salihu, N., Kumam, P. & Salisu, S. Two efficient nonlinear conjugate gradient methods for Riemannian manifolds. Comp. Appl. Math. 43, 415 (2024). https://doi.org/10.1007/s40314-024-02920-2
DOI: https://doi.org/10.1007/s40314-024-02920-2
Keywords
- Riemannian optimization
- Conjugate gradient method
- Riemannian Wolfe conditions
- Retraction
- Vector transport
- Global convergence