Bin width selection in multivariate histograms by the combinatorial method
From MaRDI portal
Publication: 882926
DOI: 10.1007/BF02603004
zbMath: 1110.62049
OpenAlex: W2023504949
MaRDI QID: Q882926
Publication date: 25 May 2007
Published in: Test
Full work available at URL: https://doi.org/10.1007/bf02603004
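The paper's title refers to selecting a histogram bin width by the combinatorial (minimum-distance) method, in which candidate estimates are compared pairwise over their Yatracos sets. As an illustration only, here is a one-dimensional sketch under stated assumptions: the data are split into a training half (used to fit candidate histograms) and a testing half (used to compare them); the function names `histogram_density` and `select_bin_width`, the data split, and the grid-based integration are our own choices, not the paper's construction, which treats the multivariate case.

```python
import numpy as np

def histogram_density(data, h):
    """Piecewise-constant histogram density with bin width h on [min, max]."""
    lo, hi = data.min(), data.max()
    nbins = max(1, int(np.ceil((hi - lo) / h)))
    counts, edges = np.histogram(data, bins=nbins, range=(lo, hi))
    dens = counts / (len(data) * np.diff(edges))  # normalized so it integrates to 1
    def f(x):
        x = np.asarray(x, float)
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, nbins - 1)
        inside = (x >= lo) & (x <= hi)
        return np.where(inside, dens[idx], 0.0)
    return f

def select_bin_width(data, widths, split=0.5, seed=None):
    """Sketch of combinatorial bin-width selection (1-D illustration).

    For each pair of candidates (f_i, f_j), the Yatracos set is
    A_ij = {x : f_i(x) > f_j(x)}.  Candidate i is scored by
    max_j | integral of f_i over A_ij  -  empirical measure of A_ij |,
    and the candidate with the smallest score is returned.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(data, float)
    perm = rng.permutation(len(data))
    m = int(split * len(data))
    train, test = data[perm[:m]], data[perm[m:]]
    fs = [histogram_density(train, h) for h in widths]

    # Fine grid for numerically integrating f_i over the Yatracos sets.
    grid = np.linspace(data.min(), data.max(), 2000)
    dx = grid[1] - grid[0]
    vals = np.array([f(grid) for f in fs])
    test_vals = np.array([f(test) for f in fs])

    scores = []
    for i in range(len(widths)):
        worst = 0.0
        for j in range(len(widths)):
            if i == j:
                continue
            A = vals[i] > vals[j]                       # Yatracos set on the grid
            est = vals[i][A].sum() * dx                 # integral of f_i over A_ij
            emp = np.mean(test_vals[i] > test_vals[j])  # empirical measure of A_ij
            worst = max(worst, abs(est - emp))
        scores.append(worst)
    return widths[int(np.argmin(scores))]
```

A typical use would be `select_bin_width(sample, [0.1, 0.3, 0.5, 1.0])` on a univariate sample; the empirical measure of each Yatracos set is computed exactly by testing each held-out point, while the integral of the candidate density over that set is approximated on the grid.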
Related Items
- Combining regular and irregular histograms by penalized likelihood
- Probability density function estimation with the frequency polygon transform
- Minimum distance histograms with universal performance guarantees
- Selection rules based on divergences
Cites Work
- Hellinger distance and Akaike's information criterion for the histogram
- Maximum entropy histograms
- The \(L_2\)-optimal cell width for the histogram
- Asymptotically optimal cells for a histogram
- Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
- An optimal variable cell histogram based on the sample spacings
- Risk bounds for model selection via penalization
- A universally acceptable smoothing factor for kernel density estimates
- Nonasymptotic universal smoothing factors, kernel complexity and Yatracos classes
- Consistency of data-driven histogram methods for density estimation and classification
- Almost sure \(L_1\)-norm convergence for data-based histogram density estimates
- On optimal and data-based histograms
- Akaike's information criterion and the histogram
- Large sample properties of maximum entropy histograms
- An optimal variable cell histogram
- On stochastic complexity and nonparametric density estimation
- Uniform consistency of a histogram density estimator and modal estimation
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Almost Sure \(L_r\)-Norm Convergence for Data-Based Histogram Density Estimates