A modified discrepancy principle to attain optimal convergence rates under unknown noise
DOI: 10.1088/1361-6420/ac1775
zbMath: 1472.62066
arXiv: 2103.03545
OpenAlex: W3189311248
MaRDI QID: Q5006372
Publication date: 13 August 2021
Published in: Inverse Problems
Full work available at URL: https://arxiv.org/abs/2103.03545
Mathematics Subject Classification
- Asymptotic properties of nonparametric inference (62G20)
- Numerical solution to inverse problems in abstract spaces (65J22)
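The record does not reproduce the paper's abstract, so as background only: the paper modifies the classical (Morozov) discrepancy principle, which stops regularizing once the residual falls below a multiple of the noise level. Below is a minimal sketch of that classical rule for truncated-SVD regularization. Everything here (the function tsvd_discrepancy, the safety factor tau, the synthetic test problem) is an illustrative assumption, not the authors' method; in particular, the classical rule requires a known noise level delta, which is precisely the assumption the cited paper removes.

```python
# Hedged sketch: classical (Morozov) discrepancy principle for
# truncated-SVD regularization of a linear inverse problem A x = y.
# Background illustration only, NOT the modified principle of the
# cited paper, which targets unknown noise.
import numpy as np

def tsvd_discrepancy(A, y, delta, tau=1.1):
    """Return the TSVD solution at the smallest truncation level k
    whose residual ||A x_k - y|| drops below tau * delta.
    Assumes the noise level delta is known (the classical setting)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeffs = U.T @ y                      # data in the singular basis
    x = np.zeros(A.shape[1])
    for k in range(len(s)):
        x = x + (coeffs[k] / s[k]) * Vt[k]  # add the k-th SVD component
        if np.linalg.norm(A @ x - y) <= tau * delta:
            return x, k + 1               # discrepancy criterion met
    return x, len(s)

# Toy usage on a mildly ill-posed synthetic problem (hypothetical data).
rng = np.random.default_rng(0)
n = 50
A = np.vander(np.linspace(0.0, 1.0, n), n, increasing=True) / n
x_true = np.sin(np.linspace(0.0, np.pi, n))
delta = 1e-3
y = A @ x_true + delta * rng.standard_normal(n) / np.sqrt(n)
x_rec, k = tsvd_discrepancy(A, y, delta)
print("chosen truncation level:", k)
```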
Related Items (2)
- Noise Level Free Regularization of General Linear Inverse Problems under Unconstrained White Noise
- Dual gradient method for ill-posed problems using multiple repeated measurement data
Uses Software
- Regularization tools
Cites Work
- Beyond the Bakushinskii veto: regularising linear inverse problems without knowing the noise distribution
- Optimal filtering of square-integrable signals in Gaussian noise
- Early stopping for statistical inverse problems via truncated SVD estimation
- On minimax filtering over ellipsoids
- Towards adaptivity via a new discrepancy principle for Poisson inverse problems
- Empirical risk minimization as parameter choice rule for general linear regularization methods
- Discrepancy based model selection in statistical inverse problems
- Regularization tools version 4.0 for Matlab 7.3
- Discrepancy principle for statistical inverse problems with application to conjugate gradient iteration
- Regularization independent of the noise level: an analysis of quasi-optimality
- Remarks on choosing a regularization parameter using the quasi-optimality and ratio criterion
- Discretization strategy for linear ill-posed problems in variable Hilbert scales
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- Oracle Inequality for a Statistical Raus–Gfrerer-Type Rule