From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
Publication: 869967
DOI: 10.1214/009053606000000704
zbMath: 1106.62005
arXiv: math/0702653
OpenAlex: W2086333522
MaRDI QID: Q869967
Publication date: 12 March 2007
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0702653
Classification (MSC):
- Density estimation (62G07)
- Bayesian inference (62F15)
- Bayesian problems; characterization of Bayes procedures (62C10)
- Statistical aspects of information-theoretic topics (62B10)
Related Items
- Posterior concentration and fast convergence rates for generalized Bayesian learning
- Model-free posterior inference on the area under the receiver operating characteristic curve
- Generalized Bayes Quantification Learning under Dataset Shift
- A comparison of learning rate selection methods in generalized Bayesian inference
- Joint production in stochastic non-parametric envelopment of data with firm-specific directions
- Adaptive variational Bayes: optimality, computation and applications
- Adaptive variable selection for sequential prediction in multivariate dynamic models
- Dynamics of Bayesian updating with dependent data and misspecified models
- Adaptive Bayesian density estimation with location-scale mixtures
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- Mirror averaging with sparsity priors
- Model misspecification, Bayesian versus credibility estimation, and Gibbs posteriors
- Quasi-Bayesian analysis of nonparametric instrumental variables models
- New estimates for Csiszár divergence and Zipf-Mandelbrot entropy via Jensen-Mercer's inequality
- Asymptotically minimax empirical Bayes estimation of a sparse normal mean vector
- Bayesian fractional posteriors
- Bayesian variable selection for high dimensional generalized linear models: convergence rates of the fitted densities
- Fast adaptive estimation of log-additive exponential models in Kullback-Leibler divergence
- Learning by mirror averaging
- Gibbs posterior for variable selection in high-dimensional classification and data mining
- Approximate models and robust decisions
- Contextuality of misspecification and data-dependent losses
- Generalized mirror averaging and \(D\)-convex aggregation
- Linear and convex aggregation of density estimators
- Aggregation by exponential weighting, sharp PAC-Bayesian bounds and sparsity
- Sparse recovery in convex hulls via entropy penalization
- Gibbs posterior inference on multivariate quantiles
- Gibbs posterior inference on value-at-risk
- On general Bayesian inference using loss functions
- Minimum description length revisited
- Predicting Panel Data Binary Choice with the Gibbs Posterior
- Gibbs posterior convergence and the thermodynamic formalism
Uses Software
Cites Work
- Asymptotic methods in statistical decision theory
- Information-theoretic determination of minimax rates of convergence
- Convergence rates of posterior distributions
- Rates of convergence of posterior distributions
- Weak convergence and empirical processes. With applications to statistics
- The consistency of posterior distributions in nonparametric problems
- Convergence of estimates under dimensionality restrictions
- On Bayesian Consistency
- PAC-Bayesian Generalisation Error Bounds for Gaussian Process Classification
- Minimum complexity density estimation
- DOI: 10.1162/1532443041424300