Abstract
A new nonmonotone algorithm is proposed and analyzed for unconstrained nonlinear optimization. The nonmonotone techniques applied in this algorithm are based on the estimate sequence proposed by Nesterov (Introductory Lectures on Convex Optimization: A Basic Course, 2004) for convex optimization. Under suitable assumptions, global convergence of the algorithm is established for minimizing a general nonlinear objective function with Lipschitz continuous derivatives. For convex objective functions, the algorithm maintains the optimal convergence rate of convex optimization. In the numerical experiments, the algorithm is instantiated with safeguarded nonlinear conjugate gradient search directions. Numerical results show that the nonmonotone algorithm performs significantly better than the corresponding monotone algorithm on the unconstrained optimization problems in the CUTEr library (Bongartz et al. in ACM Trans. Math. Softw. 21:123–160, 1995).
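For orientation, the estimate-sequence framework cited above is the machinery from which Nesterov derives his optimal first-order method for convex functions with Lipschitz continuous gradients. The sketch below shows that classical method in Python as a point of reference; it is only a textbook baseline under stated assumptions (a known Lipschitz constant L and a gradient-norm stopping rule), not the nonmonotone algorithm analyzed in this paper, and the function names and parameters are illustrative.

```python
import numpy as np

def accelerated_gradient(grad, x0, L, max_iter=1000, tol=1e-8):
    """Classical Nesterov accelerated gradient method for a convex
    objective with L-Lipschitz gradient; attains the optimal
    O(1/k^2) rate on function values.

    grad : callable returning the gradient of the objective
    L    : Lipschitz constant of grad (assumed known here)
    """
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()   # extrapolation point
    t = 1.0             # scalar momentum bookkeeping sequence
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) < tol:
            return y
        x = y - g / L                                 # gradient step at y
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum extrapolation
        x_prev, t = x, t_next
    return x_prev
```

For example, on a convex quadratic with gradient `grad = lambda x: A @ x - b` and `L` taken as the largest eigenvalue of `A`, this iteration exhibits the O(1/k^2) behavior that the paper's nonmonotone algorithm preserves in the convex case.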
References
Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, P.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)
Dai, Y.H., Zhang, H.: An adaptive two-point stepsize gradient algorithm. Numer. Algorithms 27, 377–385 (2001)
Dai, Y.H., Yuan, Y.Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Ghadimi, S., Lan, G.: Accelerated gradient methods for nonconvex nonlinear and stochastic optimization. Technical Report, Department of Industrial and Systems Engineering, University of Florida (2013)
Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23, 707–716 (1986)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
Hager, W.W., Zhang, H.: A new active set algorithm for box constrained optimization. SIAM J. Optim. 17, 526–557 (2006)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. Technical Report (2012)
Nemirovski, A., Juditsky, A., Lan, G., Shapiro, A.: Robust stochastic approximation approach to stochastic programming. SIAM J. Optim. 19, 1574–1609 (2009)
Nesterov, Y.: A method for unconstrained convex minimization problem with the rate of convergence \(\mathcal {O}(1/k^{2})\). Dokl. Akad. Nauk SSSR 269, 543–547 (1983). Translated as Soviet Math. Dokl.
Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic, Norwell (2004)
Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)
Additional information
This material is based upon work supported by National Science Foundation grant 1016204.
Cite this article
Zhang, H. A nonmonotone approximate sequence algorithm for unconstrained nonlinear optimization. Comput Optim Appl 57, 27–43 (2014). https://doi.org/10.1007/s10589-013-9588-x