Abstract
Spectral conjugate gradient (SCG) methods combine the spectral gradient method with conjugate gradient (CG) methods and have been well studied in Euclidean space. In this paper, we extend this class of methods to optimization problems on Riemannian manifolds. First, we present a Riemannian version of the spectral parameter that guarantees the search direction always satisfies the sufficient descent property, without the help of any line search strategy. Second, we introduce a generic algorithmic framework for Riemannian SCG methods in which the choice of the CG parameter is very flexible. Under the Riemannian Wolfe conditions, global convergence of the proposed framework is established whenever the absolute value of the CG parameter does not exceed the Riemannian Fletcher–Reeves CG parameter. Finally, preliminary numerical results are reported and compared with several classical Riemannian CG methods, showing that the new methods are efficient.
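To fix ideas, an SCG method generates search directions of the form d_{k+1} = -θ_{k+1} g_{k+1} + β_k d_k, where θ is the spectral parameter and β the CG parameter. The sketch below is a minimal Euclidean illustration, not the paper's Riemannian algorithm: it uses the Fletcher–Reeves parameter and, purely for illustration, a spectral parameter chosen so that gᵀd = -‖g‖² at every iteration (one common way to enforce sufficient descent independently of the line search), with a backtracking Armijo search standing in for the Wolfe conditions. All function names are this sketch's own.

```python
import numpy as np

def spectral_cg(f, grad, x0, tol=1e-8, max_iter=5000):
    """Euclidean SCG sketch: d_{k+1} = -theta_{k+1} g_{k+1} + beta_k d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()                        # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a simple stand-in for the
        # Riemannian Wolfe conditions used in the paper).
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            g = g_new
            break
        beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves parameter
        # Spectral parameter chosen so that g_new^T d_new = -||g_new||^2,
        # i.e. sufficient descent holds regardless of the line search.
        theta = 1.0 + beta * (g_new @ d) / (g_new @ g_new)
        d = -theta * g_new + beta * d
        g = g_new
    return x

# Demo on a strictly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer is x* = A^{-1} b = (1, 0.1, 0.01).
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = spectral_cg(f, grad, np.zeros(3))
```

Note that with this choice of θ the identity gᵀd = -‖g‖² holds by construction, so the Armijo loop always terminates; the Riemannian methods of the paper replace the vector recurrences above with retractions and vector transports.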
Data availability
Data used or analyzed in the study are available from the author upon reasonable request.
Acknowledgements
The authors are sincerely grateful to the editor and the two anonymous referees for their valuable comments, which significantly improved the original version of this paper.
Funding
This work was supported by the Guangxi Natural Science Foundation (2018GXNSFFA281007) and the National Natural Science Foundation of China (12271113, 71861002, 11761013).
Author information
Contributions
All authors read and approved the final manuscript. CT is mainly responsible for algorithm design and theoretical analysis; WT mainly contributes to theoretical analysis and numerical experiments; SX and HZ mainly contribute to algorithm design and numerical experiments.
Ethics declarations
Ethics approval and consent to participate
Not applicable.
Human and animal ethics
Not applicable.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Tang, C., Tan, W., Xing, S. et al. A class of spectral conjugate gradient methods for Riemannian optimization. Numer Algor 94, 131–147 (2023). https://doi.org/10.1007/s11075-022-01495-5
Keywords
- Riemannian optimization
- Spectral conjugate gradient method
- Riemannian Wolfe conditions
- Sufficient descent property
- Global convergence