Local convergence of alternating low‐rank optimization methods with overrelaxation

  • The local convergence of alternating optimization methods with overrelaxation for low-rank matrix and tensor problems is established. The analysis is based on the linearization of the method, which takes the form of an SOR iteration for a positive semidefinite Hessian and can be studied in the corresponding quotient geometry of equivalent low-rank representations. In the matrix case, the optimal relaxation parameter for accelerating the local convergence can be determined from the convergence rate of the standard method. This result relies on a version of Young's SOR theorem for positive semidefinite 2×2 block systems.
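As a concrete illustration of the kind of scheme the abstract refers to, the following is a minimal NumPy sketch of alternating least squares for a rank-r matrix approximation A ≈ XYᵀ with an SOR-style overrelaxation of each block update. This is not the authors' implementation; the function name, interface, and defaults are invented for this example, and ω = 1 recovers the plain alternating method.

```python
import numpy as np

def als_overrelaxed(A, r, omega=1.0, n_iter=50, seed=0):
    """Alternating least squares for a rank-r approximation A ~ X @ Y.T,
    with an SOR-style overrelaxation step; omega = 1.0 is plain ALS.

    Illustrative sketch only: the name, interface, and defaults are
    invented here and are not taken from the paper."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.standard_normal((m, r))
    Y = rng.standard_normal((n, r))
    for _ in range(n_iter):
        # Exact block minimizer in X for fixed Y:  X_ls = A Y (Y^T Y)^{-1}
        X_ls = np.linalg.solve(Y.T @ Y, Y.T @ A.T).T
        X = X + omega * (X_ls - X)   # overrelaxed half-step
        # Exact block minimizer in Y for fixed X:  Y_ls = A^T X (X^T X)^{-1}
        Y_ls = np.linalg.solve(X.T @ X, X.T @ A).T
        Y = Y + omega * (Y_ls - Y)   # overrelaxed half-step
    return X, Y
```

For orientation: in classical SOR theory for consistently ordered 2×2 block systems, Young's theorem gives the optimal relaxation parameter as ω* = 2/(1 + √(1 − μ²)), where μ is the spectral radius of the associated Jacobi iteration. According to the abstract, the article proves a positive semidefinite variant of this relation, so that in the matrix case the locally optimal ω can be read off from the convergence rate of the standard (ω = 1) alternating method.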

Metadata
Author:Ivan V. Oseledets, Maxim V. Rakhuba, André Uschmajew
URN:urn:nbn:de:bvb:384-opus4-1031124
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/103112
ISSN:1070-5325
ISSN:1099-1506
Parent Title (English):Numerical Linear Algebra with Applications
Publisher:Wiley
Place of publication:Weinheim
Type:Article
Language:English
Year of first Publication:2023
Publishing Institution:Universität Augsburg
Release Date:2023/03/23
Tag:Applied Mathematics; Algebra and Number Theory
Volume:30
Issue:3
First Page:e2459
DOI:https://doi.org/10.1002/nla.2459
Institutes:Mathematisch-Naturwissenschaftlich-Technische Fakultät
Mathematisch-Naturwissenschaftlich-Technische Fakultät / Institut für Mathematik
Mathematisch-Naturwissenschaftlich-Technische Fakultät / Institut für Mathematik / Lehrstuhl für Mathematical Data Science
Dewey Decimal Classification:5 Natural sciences and mathematics / 51 Mathematics / 510 Mathematics
Licence:CC-BY-NC-ND 4.0: Creative Commons: Attribution - NonCommercial - NoDerivatives (with Print on Demand)