Convergence of Gradient-Based Block Coordinate Descent Algorithms for Nonorthogonal Joint Approximate Diagonalization of Matrices
DOI: 10.1137/21m1456972
arXiv: 2009.13377
OpenAlex: W4376129141
MaRDI QID: Q6101125
Konstantin Usevich, Jian Ze Li, Pierre Comon
Publication date: 31 May 2023
Published in: SIAM Journal on Matrix Analysis and Applications
Full work available at URL: https://arxiv.org/abs/2009.13377
Keywords: convergence analysis; blind source separation; block coordinate descent; manifold optimization; Jacobi-G algorithm; joint approximate diagonalization of matrices
Related Items (1)
Cites Work
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the coordinate descent method for convex differentiable minimization
- Independent component analysis, a new concept?
- On semi- and subanalytic geometry
- On approximate diagonalization of third order symmetric tensors by orthogonal transformations
- Coordinate descent algorithms
- Jacobi Algorithm for the Best Low Multilinear Rank Approximation of Symmetric Tensors
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Maximum Block Improvement and Polynomial Optimization
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- A new convergence proof for the higher-order power method and generalizations
- On Convergence of the Maximum Block Improvement Method
- Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality
- Globally Convergent Jacobi-Type Algorithms for Simultaneous Orthogonal Symmetric Tensor Diagonalization
- Steepest Descent Algorithms for Optimization Under Unitary Matrix Constraint
- Nonorthogonal Joint Diagonalization by Combining Givens and Hyperbolic Rotations
- Jacobi Angles for Simultaneous Diagonalization
- Approximate Matrix and Tensor Diagonalization by Unitary Transformations: Convergence of Jacobi-Type Algorithms
- Approximate Joint Diagonalization with Riemannian Optimization on the General Linear Group
- Lie Groups, Lie Algebras, and Representations
- A new, globally convergent Riemannian conjugate gradient method
- Non-orthogonal joint diagonalization in the least-squares sense with application in blind source separation
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Independent Component Analysis and Blind Signal Separation