Massively parallel feature selection: an approach based on variance preservation
From MaRDI portal
Publication: 374165
DOI: 10.1007/s10994-013-5373-4
zbMath: 1273.68310
OpenAlex: W1994252348
MaRDI QID: Q374165
Publication date: 22 October 2013
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-013-5373-4
Learning and adaptive systems in artificial intelligence (68T05)
Pattern recognition, speech recognition (68T10)
Related Items (4)
- Consensus-based modeling using distributed feature construction with ILP
- A greedy feature selection algorithm for big data of high dimensionality
- Particle swarm stepwise (PaSS) algorithm for information criteria-based variable selections
- A distributed feature selection scheme with partial information sharing
Cites Work
- Consistency of Bayesian procedures for variable selection
- Estimating the dimension of a model
- Theoretical and empirical analysis of ReliefF and RReliefF
- Principal component analysis
- Least angle regression (with discussion)
- Grid computing for parallel bioinspired algorithms
- Parallelizing feature selection
- Solving feature subset selection problem by a parallel scatter search
- Further analysis of the data by Akaike's information criterion and the finite corrections
- An asymptotic property of model selection criteria
- DOI: 10.1162/153244303322753616
- DOI: 10.1162/153244303322753670
- DOI: 10.1162/153244303322753751
- Learning the parts of objects by non-negative matrix factorization
- A Direct Formulation for Sparse PCA Using Semidefinite Programming
- A new look at the statistical model identification