On multivariate randomized classification trees: \(l_0\)-based sparsity, VC dimension and decomposition methods
From MaRDI portal
Publication: 6109286
DOI: 10.1016/j.cor.2022.106058 arXiv: 2112.05239 MaRDI QID: Q6109286
No author found.
Publication date: 4 July 2023
Published in: Computers \& Operations Research
Full work available at URL: https://arxiv.org/abs/2112.05239
Related Items (1)
Cites Work
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Mathematical optimization in classification and regression trees
- The Vapnik-Chervonenkis dimension of decision trees with bounded rank
- Constructing optimal binary decision trees is NP-complete
- Creating a marketing strategy in healthcare industry: a holistic data analytic approach
- Parallel decomposition methods for linearly constrained problems subject to simple bound with application to the SVMs training
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Column generation based heuristic for learning classification trees
- Sparsity in optimal randomized classification trees
- A convergent decomposition algorithm for support vector machines
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- Optimal randomized classification trees
- On sparse optimal regression trees
- Learnability and the Vapnik-Chervonenkis dimension
- Feature selection combining linear support vector machines and concave optimization
- DOI: 10.1162/153244303322753751
- On the convergence of a modified version of SVM\(^{light}\) algorithm
- Optimal classification trees