A probabilistic oracle inequality and quantification of uncertainty of a modified discrepancy principle for statistical inverse problems
DOI: 10.1553/etna_vol57s35 · zbMath: 1490.65112 · arXiv: 2202.12596 · OpenAlex: W4285158562 · MaRDI QID: Q2153955
Publication date: 13 July 2022
Published in: ETNA. Electronic Transactions on Numerical Analysis
Full work available at URL: https://arxiv.org/abs/2202.12596
discrepancy principle; oracle inequality; statistical inverse problems; early stopping; non-Bayesian approach
Nonparametric regression and quantile regression (62G08); Numerical solutions of ill-posed problems in abstract spaces; regularization (65J20); Linear operators and ill-posed problems, regularization (47A52); Numerical solution to inverse problems in abstract spaces (65J22)
Related Items (1)
Cites Work
- Complexity of linear ill-posed problems in Hilbert space
- Comparing parameter choice methods for regularization of ill-posed problems
- Beyond the Bakushinskii veto: regularising linear inverse problems without knowing the noise distribution
- Estimation of the mean of a multivariate normal distribution
- Regularization tools: A Matlab package for analysis and solution of discrete ill-posed problems
- Asymptotic optimality of generalized cross-validation for choosing the regularization parameter
- Early stopping for statistical inverse problems via truncated SVD estimation
- Risk estimators for choosing regularization parameters in ill-posed problems -- properties and limitations
- On pointwise adaptive nonparametric deconvolution
- Empirical risk minimization as parameter choice rule for general linear regularization methods
- Discrepancy based model selection in statistical inverse problems
- The discretized discrepancy principle under general source conditions
- Discrepancy principle for statistical inverse problems with application to conjugate gradient iteration
- A Lepskij-type stopping rule for regularized Newton methods
- Regularization of some linear ill-posed problems with discretized random noisy data
- Regularization independent of the noise level: an analysis of quasi-optimality
- Analysis of Discrete Ill-Posed Problems by Means of the L-Curve
- Practical Approximate Solutions to Linear Operator Equations When the Data are Noisy
- Ideal spatial adaptation by wavelet shrinkage
- Geometry of linear ill-posed problems in variable Hilbert scales
- On the best rate of adaptive estimation in some inverse problems
- Adaptivity and Oracle Inequalities in Linear Statistical Inverse Problems: A (Numerical) Survey
- The quasi-optimality criterion in the linear functional strategy
- Optimal Adaptation for Early Stopping in Statistical Inverse Problems
- On a Problem of Adaptive Estimation in Gaussian White Noise
- Computational Methods for Inverse Problems
- Optimal Convergence of the Discrepancy Principle for Polynomially and Exponentially Ill-Posed Operators under White Noise
- Regularized Linear Inversion with Randomized Singular Value Decomposition
- Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications
- How general are general source conditions?
- The Lepskii principle revisited
- Use of the regularization method in non-linear problems
- Regularizing linear inverse problems under unknown non-Gaussian white noise allowing repeated measurements
- Probability: A Graduate Course