Talks and Poster Presentations (with Proceedings-Entry):

R. Dallinger, M. Rupp:
"A Strict Stability Limit for Adaptive Gradient Type Algorithms";
Poster: Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, USA; November 1-4, 2009; in: "Forty-Third Asilomar Conference on Signals, Systems, and Computers", (2009), ISBN: 978-1-4244-5826-4; pp. 1370-1374.



English abstract:
This paper considers gradient-type algorithms in which the regression vector is allowed to differ from the input vector. To cover the most general case, no restrictions are imposed on the dependency between the excitation vector and the regression vector. For the real-valued domain, a convergence analysis in terms of l2-stability is performed based on the singular value decomposition. It reveals that such algorithms are potentially unstable whenever the input vector and the regression vector do not point in the same direction. For the conventional gradient-type algorithm (for which these two vectors are parallel), an l2-stability bound known from the literature to be sufficient is shown to be, in fact, strict. Simulations demonstrate how the presented method can be used to uncover unstable modes of an apparently stable algorithm.
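
As an illustration of the algorithm class described above, the following Python sketch runs a gradient-type update in which the regression vector g_k may differ from the excitation vector u_k. The update form, step size, misalignment model, and all names (run, misalign, w_true) are illustrative assumptions; they do not reproduce the paper's exact formulation, its SVD-based analysis, or its stability bound.

import numpy as np

rng = np.random.default_rng(0)

N = 4        # filter length (assumed)
mu = 0.1     # step size (assumed)
n_iter = 2000
w_true = rng.standard_normal(N)   # unknown system to be identified

def run(misalign):
    """Adaptive loop; 'misalign' controls how far g_k departs from u_k."""
    w = np.zeros(N)
    deviation = []
    for _ in range(n_iter):
        u = rng.standard_normal(N)   # excitation (input) vector u_k
        d = w_true @ u               # noiseless desired signal
        e = d - w @ u                # a priori error e_k
        # Regression vector: u_k plus a perturbation, so g_k and u_k
        # need not point in the same direction.
        g = u + misalign * rng.standard_normal(N)
        w = w + mu * e * g           # gradient-type update using g_k
        deviation.append(np.linalg.norm(w_true - w))
    return deviation

aligned = run(misalign=0.0)   # conventional case: g_k parallel to u_k
skewed  = run(misalign=2.0)   # g_k and u_k differ in direction

print("final deviation, aligned case:", aligned[-1])
print("final deviation, skewed case: ", skewed[-1])

Depending on the step size and the degree of misalignment, the skewed variant may drift or diverge even though the aligned variant converges, which is the kind of hidden instability the paper's analysis is designed to expose.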

Keywords:
gradient algorithms, stability, robustness, convergence, singular value decomposition


Electronic version of the publication:
http://publik.tuwien.ac.at/files/PubDat_178471.pdf


Created from the Publication Database of the Vienna University of Technology.