Bregman proximal gradient algorithms for deep matrix factorization
From MaRDI portal
Publication: 826170
DOI: 10.1007/978-3-030-75549-2_17
zbMath: 1484.68208
OpenAlex: W3158730839
MaRDI QID: Q826170
Daniel Cremers, Emanuel Laude, Mahesh Chandra Mukkamala, Felix Westerkamp, Peter Ochs
Publication date: 20 December 2021
Full work available at URL: https://doi.org/10.1007/978-3-030-75549-2_17
Mathematics Subject Classification: Artificial neural networks and deep learning (68T07); Factorization of matrices (15A23); Convex programming (90C25); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
Related Items (1)
Cites Work
- Tensor Decompositions and Applications
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Introductory lectures on convex optimization. A basic course.
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Fastest rates for stochastic mirror descent methods
- iPiano: Inertial Proximal Algorithm for Nonconvex Optimization
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Inertial Proximal Alternating Linearized Minimization (iPALM) for Nonconvex and Nonsmooth Problems
- Clarke Subgradients of Stratifiable Functions
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Nonconvex Optimization
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications