In this section we will show how to employ the third-order smoothness of the objective function to make better use of the negative curvature direction for ...
Dec 18, 2017 · We propose stochastic optimization algorithms that can find local minima faster than existing algorithms for nonconvex optimization problems.
Dec 18, 2017 · We show that third-order smoothness of a nonconvex function can lead to a faster escape from saddle points in stochastic optimization. We ...
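The idea behind this claim can be sketched as follows. This is a hypothetical illustration, not the authors' code: the function name, the toy objective, and the constants `gamma` and `L3` are assumptions. It shows why a third-order smoothness constant `L3` permits a longer step along a negative curvature direction than a Hessian-Lipschitz bound would: trying both `+v` and `-v` and keeping the better point cancels the odd-order Taylor terms, leaving a decrease governed by `L3` and the fourth-order remainder.

```python
import numpy as np

def negative_curvature_step(f, x, v, gamma, L3):
    """Move along +/- v, a unit eigenvector of the Hessian at x with
    eigenvalue -gamma < 0. (Illustrative sketch; gamma and L3 are assumed known.)

    With third-order smoothness constant L3, the upper bound
        f(x + t*v) <= f(x) + t*g.v + (t^2/2) v'Hv + (t^3/6) T[v,v,v] + (L3/24) t^4
    is minimized (after choosing the sign of v so the odd terms are
    non-positive) at step size t = sqrt(6 * gamma / L3), giving a decrease
    on the order of gamma^2 / L3.
    """
    eta = np.sqrt(6.0 * gamma / L3)
    x_plus, x_minus = x + eta * v, x - eta * v
    # keep whichever sign of the direction decreases the objective more
    return x_plus if f(x_plus) < f(x_minus) else x_minus

# Toy saddle: f(x, y) = x^2 - y^2 has a saddle at the origin, with
# negative curvature along the y-axis (Hessian eigenvalue -2).
f = lambda z: z[0] ** 2 - z[1] ** 2
x = np.zeros(2)
v = np.array([0.0, 1.0])
x_new = negative_curvature_step(f, x, v, gamma=2.0, L3=1.0)
print(f(x_new) < f(x))  # the step strictly decreases f: True
```

The sign trick is the point: under only Hessian-Lipschitz (second-order) smoothness the safe step scales like `gamma / rho`, whereas the third-order bound allows the larger `sqrt(6 * gamma / L3)` step used above.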
Why approximate local minimum? A local minimum is adequate and can be as good as a global minimum in terms of generalization performance.
Third-order Smoothness Helps: Faster Stochastic Optimization Algorithms for Finding Local Minima. Yaodong Yu*, Pan Xu* and Quanquan Gu, (*: equal contribution).
Yu, Yaodong, Xu, Pan, & Gu, Quanquan. Third-order Smoothness Helps: Faster Stochastic Optimization Algorithms for Finding Local Minima. Advances in neural ...