On Relevant Features Selection Based on Information Theory
Publication: 6069447
DOI: 10.1137/s0040585x97t991520
OpenAlex: W4388464908
MaRDI QID: Q6069447
No author found.
Publication date: 14 November 2023
Published in: Theory of Probability & Its Applications
Full work available at URL: https://doi.org/10.1137/s0040585x97t991520
Keywords: mutual information; feature selection; interaction information; epistasis effect; sequential selection of features
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- On relationships between the Pearson and the distance correlation coefficients
- Advances in feature selection for data and pattern recognition
- Feature selection based on statistical estimation of mutual information
- Statistical estimation of mutual information for mixed model
- Asymptotic distributions of empirical interaction information
- Can high-order dependencies improve mutual information based feature selection?
- Statistical Implications of Turing's Formula
- The unreasonable effectiveness of mathematics in the natural sciences. Richard Courant lecture in mathematical sciences delivered at New York University, May 11, 1959
- Linear Sampling Estimates of Sums
- Multiple mutual informations and multiple interactions in frequency data
- Foundations of Modern Probability
- Statistical estimation of conditional Shannon entropy
- Variable Selection with Error Control: Another Look at Stability Selection