
A class of spectral conjugate gradient methods for Riemannian optimization

  • Original Paper
  • Published in Numerical Algorithms

Abstract

Spectral conjugate gradient (SCG) methods combine the spectral gradient method with conjugate gradient (CG) methods and have been well studied in Euclidean space. In this paper, we extend this class of methods to optimization problems on Riemannian manifolds. First, we present a Riemannian version of the spectral parameter, which guarantees that the search direction always satisfies the sufficient descent property without the help of any line search strategy. Second, we introduce a generic algorithmic framework for Riemannian SCG methods in which the selection of the CG parameter is very flexible. Under the Riemannian Wolfe conditions, global convergence of the proposed framework is established whenever the absolute value of the CG parameter does not exceed the Riemannian Fletcher–Reeves CG parameter. Finally, preliminary numerical results are reported and compared with several classical Riemannian CG methods, showing that the new methods are efficient.
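
To make the framework described above more concrete, the following is a minimal, self-contained sketch of what a Riemannian SCG iteration of this general form might look like, here applied to minimizing the Rayleigh quotient f(x) = x^T A x on the unit sphere. The projection-based vector transport, the Armijo backtracking line search (the paper itself works with the Riemannian Wolfe conditions), and the particular spectral parameter theta (a standard Euclidean-style choice that forces <grad f(x), d> = -||grad f(x)||^2, i.e., sufficient descent by construction) are illustrative assumptions, not the specific formulas proposed in the paper.

```python
# Illustrative sketch only: a generic Riemannian spectral conjugate gradient
# (SCG) iteration on the unit sphere, NOT the specific method of the paper.
# The spectral parameter below is a hypothetical, Euclidean-style choice that
# enforces <grad f(x), d> = -||grad f(x)||^2 at every iteration.
import numpy as np


def project(x, v):
    # Orthogonal projection of v onto the tangent space of the sphere at x.
    return v - np.dot(x, v) * x


def retract(x, v):
    # Retraction on the sphere: move along v in the embedding space, renormalize.
    y = x + v
    return y / np.linalg.norm(y)


def riemannian_scg(A, x0, max_iter=500, tol=1e-8, c1=1e-4, shrink=0.5):
    """Minimize f(x) = x^T A x over the unit sphere with an SCG-type iteration."""
    x = x0 / np.linalg.norm(x0)
    grad = project(x, 2.0 * A @ x)   # Riemannian gradient of the Rayleigh quotient
    d = -grad                        # initial direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = np.dot(grad, grad)
        if np.sqrt(gnorm2) < tol:
            break
        # Armijo backtracking along d (the paper uses the Wolfe conditions instead).
        fx, alpha = x @ A @ x, 1.0
        slope = np.dot(grad, d)      # negative by the sufficient descent property
        while alpha > 1e-12:
            x_trial = retract(x, alpha * d)
            if x_trial @ A @ x_trial <= fx + c1 * alpha * slope:
                break
            alpha *= shrink
        x_new = retract(x, alpha * d)
        grad_new = project(x_new, 2.0 * A @ x_new)
        gnew2 = np.dot(grad_new, grad_new)
        if np.sqrt(gnew2) < tol:
            x = x_new
            break
        # Projection-based vector transport of the previous direction.
        d_prev = project(x_new, d)
        # Fletcher-Reeves CG parameter (the framework allows any |beta| <= beta_FR).
        beta = gnew2 / gnorm2
        # Hypothetical spectral parameter: makes <grad_new, d_new> = -||grad_new||^2,
        # so the new direction is a descent direction regardless of the line search.
        theta = 1.0 + beta * np.dot(grad_new, d_prev) / gnew2
        d = -theta * grad_new + beta * d_prev
        x, grad = x_new, grad_new
    return x
```

For example, with a random symmetric matrix A = (B + B^T)/2 and a random unit starting point, riemannian_scg returns an approximate eigenvector associated with the smallest eigenvalue of A; this only illustrates the shape of the direction update, not the algorithm or parameter choices analyzed in the paper.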


Data availability

The data used or analyzed in this study are available from the author upon reasonable request.


Acknowledgements

The authors are sincerely grateful to the editor and the two anonymous referees for their valuable comments, which significantly improved the original version of this paper.

Funding

This work was supported by the Guangxi Natural Science Foundation (2018GXNSFFA281007) and the National Natural Science Foundation of China (12271113, 71861002, 11761013).

Author information

Authors and Affiliations

Authors

Contributions

All authors read and approved the final manuscript. CT was mainly responsible for algorithm design and theoretical analysis; WT contributed mainly to theoretical analysis and numerical experiments; SX and HZ contributed mainly to algorithm design and numerical experiments.

Corresponding author

Correspondence to Shajie Xing.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Human and animal ethics

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Tang, C., Tan, W., Xing, S. et al. A class of spectral conjugate gradient methods for Riemannian optimization. Numer Algor 94, 131–147 (2023). https://doi.org/10.1007/s11075-022-01495-5


