Exploiting hidden structure in selecting dimensions that distinguish vectors
DOI: 10.1016/j.jcss.2015.11.011 · zbMATH: 1333.68143 · arXiv: 1512.01150 · OpenAlex: W2191737685 · MaRDI QID: Q899584
Manuel Sorge, Rolf Niedermeier, Vincent Froese, René van Bevern
Publication date: 30 December 2015
Published in: Journal of Computer and System Sciences
Full work available at URL: https://arxiv.org/abs/1512.01150
Keywords: dimension reduction; NP-hardness; machine learning; fixed-parameter tractability; W-hardness; \(\Delta\)-systems; combinatorial feature selection; combinatorics of binary matrices; minimal reduct problem
MSC classification: Analysis of algorithms and problem complexity (68Q25); Computational difficulty of problems (lower bounds, completeness, difficulty of approximation, etc.) (68Q17)
Cites Work
- Fundamentals of parameterized complexity
- Towards optimal and expressive kernelization for \(d\)-hitting set
- On distance-3 matchings and induced matchings
- Parameterized complexity of finding regular induced subgraphs
- Selection of relevant features and examples in machine learning
- Structure preserving reductions among convex optimization problems
- The \(k\)-feature set problem is \(W[2]\)-complete
- On explaining integer vectors by few homogeneous segments
- Parameterized complexity theory
- Tight lower bounds for certain parameterized NP-hard problems
- A Parameterized Complexity Analysis of Combinatorial Feature Selection Problems
- Extremal Combinatorics
- 10.1162/153244303322753616
- 10.1162/153244303322753670
- Kernelization Lower Bounds Through Colors and IDs
- Parameterized Algorithms
- Digraphs