Abstract
A double parameter self-scaling memoryless BFGS method for unconstrained optimization is presented. In this method, the first two terms of the self-scaling memoryless BFGS matrix are scaled with a positive parameter, while the third term is scaled with another positive parameter. The first parameter, scaling the first two terms, is determined so as to cluster the eigenvalues of the memoryless BFGS matrix. The second parameter, scaling the third term, is computed as a preconditioner to the Hessian of the minimizing function, combined with the minimization of the conjugacy condition, in order to shift the large eigenvalues of the self-scaling memoryless BFGS matrix to the left. The stepsize is determined by the Wolfe line search conditions. The global convergence of the method is proved under the assumption that the minimizing function is uniformly convex. Preliminary computational experiments on a set of 80 unconstrained optimization test functions show that the algorithm is more efficient and more robust than the self-scaling BFGS updates of Oren and Luenberger and of Oren and Spedicato. Under the CPU time metric, CG-DESCENT is the top performer. Comparisons with L-BFGS show that our algorithm is more efficient.
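To make the construction concrete, the following is a minimal sketch, not the paper's implementation: it applies a double parameter self-scaling memoryless BFGS matrix of the general shape described in the abstract (the first two terms scaled by one positive parameter, the third term by another) to compute a search direction matrix-free. The parameter names `delta` and `gamma`, and the exact grouping of the third term, are assumptions for illustration; the paper derives specific formulas for both parameters.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, delta, gamma):
    """Search direction d = -H g for a double parameter self-scaling
    memoryless BFGS matrix, applied matrix-free.

    One plausible form, consistent with the abstract (first two terms
    scaled by `delta`, third term by `gamma`; both are hypothetical
    names, the paper gives the actual formulas):

        H = delta * (I - (s y' + y s') / (y' s))
          + gamma * (1 + (y' y) / (y' s)) * (s s') / (y' s)
    """
    ys = y @ s                  # curvature y_k' s_k, positive under Wolfe conditions
    sg = s @ g
    yg = y @ g
    # Assemble H g term by term using inner products only (O(n) work),
    # so the n-by-n matrix H is never formed explicitly.
    Hg = delta * (g - (sg * y + yg * s) / ys) \
         + gamma * (1.0 + (y @ y) / ys) * (sg / ys) * s
    return -Hg
```

Because H is never formed, each iteration costs only a handful of O(n) inner products, which is what makes memoryless BFGS methods attractive for large-scale problems.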
References
Al-Baali M (1998a) Global and superlinear convergence of a class of self-scaling methods with inexact line search. Comput Optim Appl 9:191–203
Al-Baali M (1998b) Numerical experience with a class of self-scaling quasi-Newton algorithms. J Optim Theory Appl 96:533–553
Andrei N (2009) Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization. J Comput Appl Math 230:570–582
Andrei N (2017) Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization. Optim Methods Softw 32:534–551
Andrei N (2018a) An adaptive scaled BFGS method for unconstrained optimization. Numer Algorithms 77(2):413–432
Andrei N (2018b) A set of unconstrained optimization test problems. Technical Report No. 4/May 2, 2018. Research Institute for Informatics, Bucharest, pp 1–10
Bongartz I, Conn AR, Gould NIM, Toint PL (1995) CUTE: constrained and unconstrained testing environments. ACM Trans Math Softw 21:123–160
Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91:201–213
Hager WW, Zhang H (2005) A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optim 16:170–192
Huang HY (1970) Unified approach to quadratically convergent algorithms for function minimization. J Optim Theory Appl 6:405–423
Liu DC, Nocedal J (1989) On the limited-memory BFGS method for large scale optimization. Math Program 45:503–528
Nocedal J (1980) Updating quasi-Newton matrices with limited storage. Math Comput 35:773–782
Nocedal J (1992) Theory of algorithms for unconstrained optimization. Acta Numer 1:199–242
Nocedal J, Yuan YX (1993) Analysis of a self-scaling quasi-Newton method. Math Program 61:19–37
Oren SS, Luenberger DG (1974) Self-scaling variable metric (SSVM) algorithms, part I: criteria and sufficient conditions for scaling a class of algorithms. Manag Sci 20:845–862
Oren SS, Spedicato E (1976) Optimal conditioning of self-scaling variable metric algorithms. Math Program 10:70–90
Perry JM (1977) A class of conjugate gradient algorithms with a two-step variable-metric memory. Discussion Paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University, Evanston
Shanno DF (1978) Conjugate gradient methods with inexact searches. Math Oper Res 3:244–256
Shanno DF, Phua KH (1978) Matrix conditioning and nonlinear optimization. Math Program 14:149–160
Sun W, Yuan YX (2006) Optimization theory and methods. Nonlinear programming. Springer Science + Business Media, New York
Wolfe P (1969) Convergence conditions for ascent methods. SIAM Rev 11:226–235
Wolfe P (1971) Convergence conditions for ascent methods. II: Some corrections. SIAM Rev 13:185–188
Ethics declarations
Conflict of interest
No potential conflict of interest was reported by the author.
Additional information
Communicated by Orizon Pereira Ferreira.
Cite this article
Andrei, N. A double parameter self-scaling memoryless BFGS method for unconstrained optimization. Comp. Appl. Math. 39, 159 (2020). https://doi.org/10.1007/s40314-020-01157-z
Keywords
- Unconstrained optimization
- Self-scaling memoryless BFGS method
- Global convergence
- Numerical comparisons