
Renormalised Steepest Descent in Hilbert Space Converges to a Two-Point Attractor




Publisher
Springer Journals
Copyright
Copyright © 2001 by Kluwer Academic Publishers
Subject
Mathematics; Mathematics, general; Computer Science, general; Theoretical, Mathematical and Computational Physics; Complex Systems; Classical Mechanics
ISSN
0167-8019
eISSN
1572-9036
DOI
10.1023/A:1010680020662

Abstract

The result that, for quadratic functions, the classical steepest descent algorithm in R^d converges locally to a two-point attractor was proved by Akaike. In this paper the result is proved for bounded quadratic operators in Hilbert space. The asymptotic rate of convergence is shown to depend on the starting point while, as expected, remaining consistent with the Kantorovich bounds. The introduction of a relaxation coefficient in the steepest-descent algorithm completely changes its behaviour, which may become chaotic. Different attractors are presented, and we show that relaxation allows a significantly improved rate of convergence.
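The abstract describes the iteration only in words. The Python sketch below illustrates, in finite dimensions, the classical steepest-descent recursion with exact line search on a convex quadratic and the two-point behaviour of the renormalised gradients that Akaike's result refers to; the relaxed variant simply rescales the exact step by a coefficient gamma. The matrix A, the starting point, the iteration count, and the value of gamma are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the paper's construction): steepest descent with exact
# line search on f(x) = 0.5 * x^T A x, plus an optional relaxation coefficient
# gamma scaling the exact step. All numerical choices here are illustrative.
import numpy as np

def steepest_descent(A, x0, gamma=1.0, iters=60):
    """Run (relaxed) steepest descent on f(x) = 0.5 x^T A x.

    gamma = 1.0 gives the classical method with the exact line-search step
    alpha_k = (g^T g) / (g^T A g); gamma != 1 rescales that step (relaxation).
    Returns the sequence of renormalised gradients g_k / ||g_k||.
    """
    x = np.asarray(x0, dtype=float)
    directions = []
    for _ in range(iters):
        g = A @ x                         # gradient of the quadratic
        ng = np.linalg.norm(g)
        if ng < 1e-14:                    # stop once effectively converged
            break
        directions.append(g / ng)         # renormalised gradient direction
        alpha = (g @ g) / (g @ (A @ g))   # exact line-search step length
        x = x - gamma * alpha * g
    return directions

# Illustrative data: a diagonal positive-definite A (eigenvalues 1..10) and a
# starting point with components along every eigenvector.
A = np.diag([1.0, 3.0, 7.0, 10.0])
dirs = steepest_descent(A, x0=[1.0, 0.7, -0.4, 0.2])

# Two-point attractor: after a transient, the renormalised gradients oscillate
# between two fixed directions, so even-indexed (and odd-indexed) iterates
# drift towards a common limit.
print("even-step drift:", np.linalg.norm(dirs[-2] - dirs[-4]))
print("odd-step drift: ", np.linalg.norm(dirs[-1] - dirs[-3]))
```

In the paper's setting A is replaced by a bounded self-adjoint operator on a Hilbert space, and the asymptotic rate, while starting-point dependent, stays within the Kantorovich bounds determined by the extreme points of the spectrum.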

Journal

Acta Applicandae Mathematicae (Springer Journals)

Published: Oct 19, 2004
