Adaptive estimation of a quadratic functional by model selection.
Publication:1848826
DOI: 10.1214/aos/1015957395 · zbMath: 1105.62328 · OpenAlex: W1560153690 · MaRDI QID: Q1848826
Publication date: 14 November 2002
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1015957395
Mathematics Subject Classification: Nonparametric estimation (62G05); Non-Markovian processes: estimation (62M09); Applications of functional analysis in probability theory and statistics (46N30)
Related Items
Greedy Algorithm Almost Dominates in Smoothed Contextual Bandits ⋮ Random sections of ellipsoids and the power of random information ⋮ Phaselift is robust to a constant fraction of arbitrary errors ⋮ Restricted isometry property of principal component pursuit with reduced linear measurements ⋮ Particle dual averaging: optimization of mean field neural network with global convergence rate analysis* ⋮ On the proliferation of support vectors in high dimensions* ⋮ Goodness-of-fit tests for high-dimensional Gaussian linear models ⋮ Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles ⋮ Performance of Johnson--Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering ⋮ Structural inference from reduced forms with many instruments ⋮ Exact matrix completion via convex optimization ⋮ Convergence rate of Bayesian supervised tensor modeling with multiway shrinkage priors ⋮ Regularized estimation in sparse high-dimensional multivariate regression, with application to a DNA methylation study ⋮ On randomized trace estimates for indefinite matrices with an application to determinants ⋮ Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm ⋮ Data assimilation using time-delay nudging in the presence of Gaussian noise ⋮ Deep unsupervised feature selection by discarding nuisance and correlated features ⋮ An Improved Analysis and Unified Perspective on Deterministic and Randomized Low-Rank Matrix Approximation ⋮ Debiasing convex regularized estimators and interval estimation in linear models ⋮ Adaptive denoising of signals with local shift-invariant structure ⋮ The Lasso with structured design and entropy of (absolute) convex hulls ⋮ Should we estimate a product of density functions by a product of estimators?
⋮ On approximations of the PSD cone by a polynomial number of smaller-sized PSD cones ⋮ Estimation of high-dimensional change-points under a group sparsity structure ⋮ Support union recovery in high-dimensional multivariate regression ⋮ Adapting to unknown noise level in sparse deconvolution ⋮ Detecting the large entries of a sparse covariance matrix in sub-quadratic time ⋮ Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements ⋮ Towards classical hardness of module-LWE: the linear rank case ⋮ Semiparametric estimation of the high-dimensional elliptical distribution ⋮ The smoothed complexity of Frank-Wolfe methods via conditioning of random matrices and polytopes ⋮ Likelihood Ratio Tests for a Large Directed Acyclic Graph ⋮ Eigenvector delocalization for non‐Hermitian random matrices and applications ⋮ Singular value perturbation and deep network optimization ⋮ Sparse quadratic classification rules via linear dimension reduction ⋮ Sparse PCA: optimal rates and adaptive estimation ⋮ Aggregation of affine estimators ⋮ Approximating Spectral Clustering via Sampling: A Review ⋮ Robust Estimators in High-Dimensions Without the Computational Intractability ⋮ Targeted Random Projection for Prediction From High-Dimensional Features ⋮ The Lanczos Algorithm Under Few Iterations: Concentration and Location of the Output ⋮ Norm and Trace Estimation with Random Rank-one Vectors ⋮ Eigenvectors of Orthogonally Decomposable Functions ⋮ Adaptive quadratic functional estimation of a weighted density by model selection ⋮ Fast rate of convergence in high-dimensional linear discriminant analysis ⋮ On tight bounds for the Lasso ⋮ Classification Error of the Thresholded Independence Rule ⋮ Model selection for estimating the non zero components of a Gaussian vector ⋮ Adaptive estimation of a quadratic functional of a density by model selection ⋮ Adaptive tests of qualitative hypotheses ⋮ Adaptive nonparametric confidence sets ⋮ A new large deviation inequality for U-statistics of order 2 ⋮ Variance-stabilization-based compressive inversion under Poisson or Poisson–Gaussian noise with analytical bounds ⋮ Optimal Adaptation for Early Stopping in Statistical Inverse Problems ⋮ Parseval inequalities and lower bounds for variance-based sensitivity indices ⋮ Recursive estimators of integrated squared density derivatives ⋮ A selective review of group selection in high-dimensional models ⋮ Randomized maximum-contrast selection: subagging for large-scale regression ⋮ Principal component analysis in the local differential privacy model ⋮ Adaptive tests for periodic signal detection with applications to laser vibrometry ⋮ The bias of isotonic regression ⋮ Effect of Depth and Width on Local Minima in Deep Learning ⋮ Adaptive Global Testing for Functional Linear Models ⋮ Parallelization of a Common Changepoint Detection Method ⋮ Johnson-Lindenstrauss lemma for circulant matrices** ⋮ Tight lower bound of sparse covariance matrix estimation in the local differential privacy model ⋮ Rate optimal estimation of quadratic functionals in inverse problems with partially unknown operator and application to testing problems ⋮ Optimal Estimation of Genetic Relatedness in High-Dimensional Linear Models ⋮ Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data ⋮ Spectral Methods for Passive Imaging: Nonasymptotic Performance and Robustness ⋮
Nonparametric confidence intervals for the integral of a function of an unknown density ⋮ Sharp Oracle Inequalities for Square Root Regularization ⋮ Multidimensional linear functional estimation in sparse Gaussian models and robust estimation of the mean ⋮ On the asymptotic variance of the debiased Lasso ⋮ On Approximating Matrix Norms in Data Streams ⋮ Probability Bounds for Polynomial Functions in Random Variables ⋮ Fluid heterogeneity detection based on the asymptotic distribution of the time-averaged mean squared displacement in single particle tracking experiments ⋮ Localizing differentially evolving covariance structures via scan statistics ⋮ Mean-square estimation of nonlinear functionals via Kalman filtering ⋮ ROP: matrix recovery via rank-one projections ⋮ Robust Width: A Characterization of Uniformly Stable and Robust Compressed Sensing ⋮ New analysis of manifold embeddings and signal recovery from compressive measurements ⋮ Curve registration by nonparametric goodness-of-fit testing ⋮ Realigning random states ⋮ Do semidefinite relaxations solve sparse PCA up to the information limit? ⋮ Optimal multiple change-point detection for high-dimensional data ⋮ On accuracy of Gaussian approximation in Bayesian semiparametric problems ⋮ Path regularity of the Brownian motion and the Brownian sheet ⋮ A Variable Density Sampling Scheme for Compressive Fourier Transform Interferometry ⋮ Provable sample-efficient sparse phase retrieval initialized by truncated power method ⋮ Interactive versus noninteractive locally differentially private estimation: two elbows for the quadratic functional ⋮ Simple adaptive estimation of quadratic functionals in nonparametric IV models ⋮ Dimension-free bounds for sums of dependent matrices and operators with heavy-tailed distributions ⋮ Early stopping for statistical inverse problems via truncated SVD estimation ⋮ On principal components regression, random projections, and column subsampling ⋮ High-dimensional analysis of semidefinite relaxations for sparse principal components ⋮ On estimation of the diagonal elements of a sparse precision matrix ⋮ An improved global risk bound in concave regression ⋮ On the optimization landscape of tensor decompositions ⋮ Multidimensional two-component Gaussian mixtures detection ⋮ Asymptotic normality of quadratic estimators ⋮ Estimating linear functionals of a sparse family of Poisson means ⋮ Stochastic Airy semigroup through tridiagonal matrices ⋮ Grouped variable importance with random forests and application to multiple functional data analysis ⋮ Optimal adaptive estimation of a quadratic functional ⋮ Optimal functional supervised classification with separation condition ⋮ Simple bounds for recovering low-complexity models ⋮ Probably certifiably correct \(k\)-means clustering ⋮ On change-point estimation under Sobolev sparsity ⋮ Optimal detection of sparse principal components in high dimension ⋮ Quality gain analysis of the weighted recombination evolution strategy on general convex quadratic functions ⋮ Optimal variable selection in multi-group sparse discriminant analysis ⋮ Learning a factor model via regularized PCA ⋮ Adaptive tests of linear hypotheses by model selection ⋮ Gaussian measures on the space of Riemannian metrics ⋮ Random design analysis of ridge regression ⋮ Multivariate Hadamard self-similarity: testing fractal connectivity ⋮ Bayesian shrinkage towards sharp minimaxity ⋮ Adaptive covariance matrix estimation through block thresholding ⋮
Scalable interpretable learning for multi-response error-in-variables regression ⋮ Prediction error after model search ⋮ Witnessed \(k\)-distance ⋮ A variant of the Johnson-Lindenstrauss lemma for circulant matrices ⋮ Two are better than one: fundamental parameters of frame coherence ⋮ Minimax goodness-of-fit testing in ill-posed inverse problems with partially unknown operators ⋮ Estimating linear and quadratic forms via indirect observations ⋮ Detecting curved edges in noisy images in sublinear time ⋮ Non asymptotic minimax rates of testing in signal detection with heterogeneous variances ⋮ Laplace deconvolution with noisy observations ⋮ Bayesian fusion estimation via \(t\) shrinkage ⋮ Adaptive estimation of linear functionals by model selection ⋮ Simultaneous estimation of the mean and the variance in heteroscedastic Gaussian regression ⋮ Smoothed residual stopping for statistical inverse problems via truncated SVD estimation ⋮ On the total variation regularized estimator over a class of tree graphs ⋮ Minimax optimal procedures for testing the structure of multidimensional functions ⋮ Hanson-Wright inequality in Hilbert spaces with application to \(K\)-means clustering for non-Euclidean data ⋮ Adaptive risk bounds in unimodal regression ⋮ Online estimation of integrated squared density derivatives ⋮ Risk of estimators for Sobol' sensitivity indices based on metamodels ⋮ Finite-sample analysis of \(M\)-estimators using self-concordance ⋮ Concave group methods for variable selection and estimation in high-dimensional varying coefficient models ⋮ The spectral norm of random inner-product kernel matrices ⋮ The road to deterministic matrices with the restricted isometry property ⋮ Super-resolution from noisy data ⋮ Greedy subspace pursuit for joint sparse recovery ⋮ Discrete uncertainty principles and sparse signal processing ⋮ Locally adaptive estimation of evolutionary wavelet spectra ⋮ Isotonic regression meets Lasso ⋮ Maximum pairwise Bayes factors for covariance structure testing ⋮ A tight degree 4 sum-of-squares lower bound for the Sherrington-Kirkpatrick Hamiltonian ⋮ Adaptive minimax testing for circular convolution ⋮ Euclidean distance between Haar orthogonal and Gaussian matrices ⋮ Optimal bounds for aggregation of affine estimators ⋮ Adaptive estimation of high-dimensional signal-to-noise ratios ⋮ Optimal estimation of variance in nonparametric regression with random design ⋮ Minimax estimation of the integral of a power of a density ⋮ Data-driven neighborhood selection of a Gaussian field ⋮ Deviation optimal learning using greedy \(Q\)-aggregation ⋮ A global homogeneity test for high-dimensional linear regression ⋮ Greedy variance estimation for the LASSO ⋮ High-dimensional Gaussian model selection on a Gaussian design ⋮ Variable selection for partially linear models via Bayesian subset modeling with diffusing prior ⋮ Noise level estimation in high-dimensional linear models ⋮ Testing for high-dimensional network parameters in auto-regressive models ⋮ Non-parametric adaptive estimation of order 1 Sobol indices in stochastic models, with an application to epidemiology ⋮ A simple adaptive estimator of the integrated square of a density ⋮ Nonparametric estimation for i.i.d. Gaussian continuous time moving average models ⋮
Eigenvectors of random matrices: A survey ⋮ Tuning-Free Heterogeneity Pursuit in Massive Networks ⋮ Estimating the intensity of a random measure by histogram type estimators ⋮ Learning non-parametric basis independent models from point queries via low-rank methods ⋮ Tight conditions for consistency of variable selection in the context of high dimensionality ⋮ Stein 1956: Efficient nonparametric testing and estimation ⋮ Second-order Stein: SURE for SURE and other applications in high-dimensional inference ⋮ Optimality of spectral clustering in the Gaussian mixture model ⋮ Prediction bounds for higher order total variation regularized least squares ⋮ On the regularization effect of stochastic gradient descent applied to least-squares ⋮ Empirical Bayesian test of the smoothness ⋮ Gaussian model selection with an unknown variance ⋮ Asymptotic analysis for extreme eigenvalues of principal minors of random matrices ⋮ The all-or-nothing phenomenon in sparse linear regression ⋮ Asymptotic equivalence and adaptive estimation for robust nonparametric regression ⋮ Consistency of a range of penalised cost approaches for detecting multiple changepoints ⋮ Two-stage algorithm for estimation of nonlinear functions of state vector in linear Gaussian continuous dynamical systems ⋮ Marginals of a spherical spin Glass model with correlated disorder ⋮ On the robustness of minimum norm interpolators and regularized empirical risk minimizers ⋮ Stabilize deep ResNet with a sharp scaling factor \(\tau\) ⋮ AdaBoost and robust one-bit compressed sensing ⋮ Optimal detection of the feature matching map in presence of noise and outliers ⋮ Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in \(O(\sqrt{n})\) iterations ⋮ Functional convergence of sequential \(U\)-processes with size-dependent kernels ⋮ On two continuum armed bandit problems in high dimensions ⋮ Nonquadratic estimators of a quadratic functional
Cites Work
- Minimax quadratic estimation of a quadratic functional
- Geometrizing rates of convergence. II
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Risk bounds for model selection via penalization
- Model selection for regression on a fixed design
- Minimax estimation via wavelet shrinkage
- Efficient estimation of integral functionals of a density
- On optimal adaptive estimation of a quadratic functional
- Estimation of integral functionals of a density
- Wavelet compression and nonlinear \(n\)-widths
- Wavelet methods to estimate an integrated quadratic functional: Adaptivity and asymptotic law
- Sample functions of the Gaussian process
- Compression of Wavelet Decompositions
- [https://portal.mardi4nfdi.de/wiki/Publication:4743580 Approximation dans les espaces métriques et théorie de l'estimation]
- Chi-square oracle inequalities