Abstract
In this paper, we consider a class of second-order ordinary differential equations with Hessian-driven damping and Tikhonov regularization, arising from the minimization of a smooth convex function in Hilbert spaces. Inspired by Attouch et al. (J Differ Equ 261:5734–5783, 2016), we establish that the function value along the solution trajectory converges to the optimal value, and we prove that the convergence rate can be as fast as \(o(1/t^2)\). By constructing a suitable energy function, we prove that the trajectory converges strongly to the minimum-norm minimizer of the objective function. Moreover, we propose a gradient-based optimization algorithm obtained by numerical discretization of the dynamics, and demonstrate its effectiveness in numerical experiments.
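The abstract does not state the dynamics or the discretization explicitly, so the following is only a minimal sketch of the kind of gradient-based scheme such a system suggests: a damped inertial equation in which the Hessian-driven term is replaced by a difference of consecutive gradients and the Tikhonov coefficient vanishes along the iterations. The ODE form, the parameters alpha, beta, c, p, the starting time t0, the step size h, and the test problem are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np


def hessian_damped_tikhonov_scheme(grad_f, x0, steps=500, h=0.05,
                                   alpha=3.0, beta=1.0, c=1.0, p=2.0, t0=1.0):
    """Explicit finite-difference sketch of an inertial dynamic of the form
        x'' + (alpha/t) x' + beta (d/dt) grad f(x) + grad f(x) + eps(t) x = 0,
    with eps(t) = c / t**p (vanishing Tikhonov regularization).

    The Hessian-driven term beta * Hess f(x) x' is approximated by the
    gradient difference (grad_f(x_k) - grad_f(x_{k-1})) / h, so the scheme
    only evaluates gradients. Every parameter value here is an illustrative
    assumption, not the algorithm analyzed in the paper.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    g_prev = grad_f(x_prev)
    for k in range(1, steps + 1):
        t_k = t0 + k * h                  # discrete "time", bounded away from 0
        eps_k = c / t_k ** p              # vanishing Tikhonov parameter
        g = grad_f(x)
        v = (x - x_prev) / h              # finite-difference velocity
        # acceleration read off from the damped dynamics
        a = (-(alpha / t_k) * v
             - beta * (g - g_prev) / h
             - g
             - eps_k * x)
        x_next = x + h * v + h ** 2 * a   # explicit update
        x_prev, x, g_prev = x, x_next, g
    return x


if __name__ == "__main__":
    # f(x) = 0.5 * ||A x||^2 with singular A: the minimizers form a line, and
    # the minimum-norm minimizer is the origin; the Tikhonov term biases the
    # trajectory toward it.
    A = np.array([[1.0, 0.0], [0.0, 0.0]])
    grad_f = lambda x: A.T @ (A @ x)
    print(hessian_damped_tikhonov_scheme(grad_f, x0=[2.0, 2.0]))
```

In this test problem the vanishing Tikhonov term is what selects the minimum-norm minimizer: with c = 0 the second coordinate of the iterate never moves, since the gradient of f vanishes along that direction.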
References
Attouch, H.: Viscosity solutions of minimization problems. SIAM J. Optim. 6, 769–806 (1996). https://doi.org/10.1137/S1052623493259616
Attouch, H., Balhag, A., Chbani, Z., Riahi, H.: Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian. Appl. Math. Optim. 88, 1–29 (2023). https://doi.org/10.1007/s00245-023-09997-x
Attouch, H., Balhag, A., Chbani, Z., Riahi, H.: Damped inertial dynamics with vanishing Tikhonov regularization: strong asymptotic convergence towards the minimum norm solution. J. Differ. Equ. 311, 29–58 (2022). https://doi.org/10.1016/j.jde.2021.12.005
Attouch, H., Chbani, Z., Fadili, J., Riahi, H.: First-order optimization algorithms via inertial systems with Hessian driven damping. Math. Program. (2020). https://doi.org/10.1007/s10107-020-01591-1
Attouch, H., Chbani, Z., Peypouquet, J., Redont, P.: Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity. Math. Program. 168, 123–175 (2018). https://doi.org/10.1007/s10107-016-0992-8
Attouch, H., Chbani, Z., Riahi, H.: Combining fast inertial dynamics for convex optimization with Tikhonov regularization. J. Math. Anal. Appl. 457, 1065–1094 (2018). https://doi.org/10.1016/j.jmaa.2016.12.017
Attouch, H., Chbani, Z., Riahi, H.: Rate of convergence of the Nesterov accelerated gradient method in the subcritical case \(\alpha \le 3\). ESAIM Control Optim. Calc. Var. 25, 2–35 (2019). https://doi.org/10.1051/cocv/2017083
Attouch, H., László, S.C.: Convex optimization via inertial algorithms with vanishing Tikhonov regularization: fast convergence to the minimum norm solution. arXiv:2104.11987 (2021)
Attouch, H., Peypouquet, J., Redont, P.: Fast convex optimization via inertial dynamics with Hessian driven damping. J. Differ. Equ. 261, 5734–5783 (2016). https://doi.org/10.1016/j.jde.2016.08.020
Boţ, R.I., Csetnek, E.R., László, S.C.: Tikhonov regularization of a second order dynamical system with Hessian driven damping. Math. Program. 189, 151–186 (2021). https://doi.org/10.1007/s10107-020-01528-8
Bottou, L., Curtis, F.E., Nocedal, J.: Optimization methods for large-scale machine learning. SIAM Rev. 60, 223–311 (2018). https://doi.org/10.1137/16M1080173
Brezis, H.: Functional Analysis, Sobolev Spaces and Partial Differential Equations. Springer, Berlin (2010)
Hairer, E., Lubich, C., Wanner, G.: Geometric Numerical Integration: Structure-Preserving Algorithms for Ordinary Differential Equations. Springer, Berlin (2006)
Lin, Z., Li, H., Fang, C.: Accelerated Optimization for Machine Learning: First-Order Algorithms. Springer, Berlin (2020)
May, R.: Asymptotic for a second-order evolution equation with convex potential and vanishing damping term. Turk. J. Math. 41, 681–685 (2017). https://doi.org/10.3906/mat-1512-28
May, R., Mnasri, C., Elloumi, M.: Asymptotic for a second order evolution equation with damping and regularizing terms. AIMS Math. 6, 4901–4914 (2021). https://doi.org/10.3934/math.2021287
Nesterov, Y.: A method of solving a convex programming problem with convergence rate \(O(\frac{1}{k^2})\). Soviet Math. Dokl. 27, 372–376 (1983)
Nesterov, Y.: Lectures on Convex Optimization, 2nd edn. Springer Optimization and Its Applications, vol. 137. Springer, Cham (2018)
Sontag, E.D.: Mathematical Control Theory: Deterministic Finite Dimensional Systems, vol. 6. Springer, Berlin (2013)
Sra, S., Nowozin, S., Wright, S.J.: Optimization for Machine Learning. MIT Press, Cambridge (2012)
Su, W., Boyd, S., Candès, E.J.: A differential equation for modeling Nesterov’s accelerated gradient method: theory and insights. J. Mach. Learn. Res. 17, 1–43 (2016)
Xu, B., Wen, B.: On the convergence of a class of inertial dynamical systems with Tikhonov regularization. Optim. Lett. 15, 2025–2052 (2021). https://doi.org/10.1007/s11590-020-01663-3
Zhang, J., Mokhtari, A., Sra, S., Jadbabaie, A.: Direct Runge-Kutta discretization achieves acceleration. arXiv:1805.00521 (2021)
Acknowledgements
The authors are supported by the National Natural Science Foundation of China (Grant No. 12071160).
Additional information
Communicated by Akhtar A. Khan.
About this article
Cite this article
Zhong, G., Hu, X., Tang, M. et al. Fast Convex Optimization via Differential Equation with Hessian-Driven Damping and Tikhonov Regularization. J Optim Theory Appl 203, 42–82 (2024). https://doi.org/10.1007/s10957-024-02462-x