Minimization of the Tikhonov functional in Banach spaces smooth and convex of power type by steepest descent in the dual
From MaRDI portal
Publication: 535301
DOI: 10.1007/s10589-009-9257-2
zbMath: 1237.90182
OpenAlex: W2009529378
MaRDI QID: Q535301
Publication date: 11 May 2011
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-009-9257-2
Keywords: convex optimization; convergence rate; linear convergence; sparsity; convex of power type; smooth of power type
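The title and keywords describe minimizing a Tikhonov functional, a fit term plus a norm-power penalty, by steepest descent carried out in the dual space: the gradient step is taken on the dual iterate, and duality mappings move between primal and dual. The sketch below is a minimal illustration under simplifying assumptions not taken from this record: finite-dimensional sequence spaces where the duality mapping of power type acts coordinatewise, a fixed step size `mu`, and exponents `p`, `q` chosen for the example; the paper's actual step-size rule and convergence analysis are not reproduced here.

```python
import numpy as np

def jq(x, q):
    # Duality mapping of power type q on a sequence space:
    # the coordinatewise gradient of (1/q) * ||x||_q^q.
    return np.sign(x) * np.abs(x) ** (q - 1)

def tikhonov(A, y, x, p, q, alpha):
    # Tikhonov functional: (1/p)||Ax - y||_p^p + (alpha/q)||x||_q^q
    return (np.linalg.norm(A @ x - y, p) ** p / p
            + alpha * np.linalg.norm(x, q) ** q / q)

def dual_steepest_descent(A, y, p=2.0, q=1.5, alpha=0.1, mu=0.05, iters=500):
    # Hypothetical illustration: gradient steps on the dual iterate,
    # mapped back to the primal space via the conjugate duality mapping.
    qs = q / (q - 1.0)                 # conjugate exponent q*
    x = np.zeros(A.shape[1])
    xd = jq(x, q)                      # dual iterate x* = J_q(x)
    for _ in range(iters):
        # Gradient of the Tikhonov functional at the primal iterate
        grad = A.T @ jq(A @ x - y, p) + alpha * jq(x, q)
        xd = xd - mu * grad            # steepest descent step in the dual
        x = jq(xd, qs)                 # return to the primal via J_{q*}
    return x
```

With `q` close to 1 the penalty promotes sparsity, which is why "sparsity" appears among the keywords; `q = 2` recovers ordinary gradient descent on the classical Tikhonov functional, since then the duality mappings are the identity.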
Related Items (2)
Gradient descent technology for sparse vector learning in ontology algorithms ⋮ The learning rates of regularized regression based on reproducing kernel Banach spaces
Cites Work
- On the uniform convexity of \(L^p\) and \(l^p\)
- A generalized conditional gradient method and its connection to an iterative shrinkage method
- Minimization of Tikhonov functionals in Banach spaces
- Characteristic inequalities of uniformly convex and uniformly smooth Banach spaces
- Nonlinear iterative methods for linear ill-posed problems in Banach spaces
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Convergence rates of convex variational regularization
- A convergence rates result for Tikhonov regularization in Banach spaces with non-smooth operators
- Error estimates for non-quadratic regularization and the relation to enhancement
- Regularization of ill-posed problems in Banach spaces: convergence rates