Donghui Li, M. Fukushima, L. Qi, N. Yamashita (2004). Regularized Newton Methods for Convex Minimization Problems with Singular Solutions. Computational Optimization and Applications, 28.
T. Tsuchiya, Qing Fang (2001). An Explicit Inversion Formula for Tridiagonal Matrices, 15.
Yingjie Li, Donghui Li (2009). Truncated regularized Newton method for convex minimizations. Computational Optimization and Applications, 43.
N. Yamashita, M. Fukushima (2001). On the Rate of Convergence of the Levenberg-Marquardt Method, 15.
R. Horn, Charles Johnson (1985). Matrix Analysis.
J. Nocedal, S. J. Wright (1999). Numerical Optimization.
Hiroshige Dan, N. Yamashita, M. Fukushima (2002). Convergence Properties of the Inexact Levenberg-Marquardt Method under Local Error Bound Conditions. Optimization Methods and Software, 17.
R. Polyak (2009). Regularized Newton method for unconstrained convex optimization. Mathematical Programming, 120.
The regularized Newton method (RNM) is an efficient method for unconstrained convex optimization. It is well known that the RNM has good convergence properties compared with the steepest descent method and the pure Newton method. For example, Li, Fukushima, Qi and Yamashita showed that the RNM has a quadratic rate of convergence under the local error bound condition. Recently, Polyak showed that the global complexity bound of the RNM, which is the first iteration k such that ‖∇f(x_k)‖ ≤ ε, is O(ε^{−4}), where f is the objective function and ε is a given positive constant. In this paper, we consider an RNM extended to unconstrained "nonconvex" optimization. We show that the extended RNM (E-RNM) has the following properties. (a) The E-RNM has a global convergence property under appropriate conditions. (b) The global complexity bound of the E-RNM is O(ε^{−2}) if ∇²f is Lipschitz continuous on a certain compact set. (c) The E-RNM has a superlinear rate of convergence under the local error bound condition.
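As a rough illustration of the kind of iteration the abstract describes, the sketch below implements a generic regularized Newton step: solve (∇²f(x_k) + μ_k I) d = −∇f(x_k) with the regularization parameter μ_k tied to the gradient norm, and stop once ‖∇f(x_k)‖ ≤ ε. The specific choice μ_k = c‖∇f(x_k)‖ and the plain unit step are illustrative assumptions, not the paper's exact rules (which include the safeguards needed for the stated complexity and convergence results).

```python
import numpy as np

def regularized_newton(grad, hess, x0, eps=1e-8, c=1.0, max_iter=100):
    """Sketch of a regularized Newton iteration.

    At each step solve (H_k + mu_k * I) d = -g_k with mu_k = c * ||g_k||,
    so the linear system is well defined even when H_k is singular.
    This is a hedged illustration, not the paper's E-RNM algorithm.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:          # stopping test ||grad f(x_k)|| <= eps
            break
        H = hess(x)
        mu = c * np.linalg.norm(g)            # regularization tied to the gradient norm
        d = np.linalg.solve(H + mu * np.eye(len(x)), -g)
        x = x + d                             # unit step, for illustration only
    return x

# Usage on f(x) = ||x||^4, whose Hessian is singular at the solution x = 0,
# so the pure Newton step is undefined there but the regularized step is not.
f_grad = lambda x: 4.0 * np.dot(x, x) * x
f_hess = lambda x: 8.0 * np.outer(x, x) + 4.0 * np.dot(x, x) * np.eye(len(x))
sol = regularized_newton(f_grad, f_hess, np.array([1.0, -1.0]))
```

The test problem is chosen because its minimizer has a singular Hessian, the situation that motivates regularizing the Newton system in the first place.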
Applied Mathematics and Optimization – Springer Journals
Published: Aug 1, 2010