Compressed labeling on distilled labelsets for multi-label learning
DOI: 10.1007/s10994-011-5276-1 · zbMath: 1243.68259 · OpenAlex: W2038857347 · MaRDI QID: Q439035
Tianyi Zhou, Xindong Wu, Dacheng Tao
Publication date: 31 July 2012
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-011-5276-1
Keywords: support vector machines; random projection; compressed sensing; binary matrix decomposition; distilled labelsets; hypothesis test of distribution; KL divergence; label compression; labelset selection; multi-label prediction
MSC classification: Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Manifold elastic net: a unified framework for sparse dimension reduction
- On label dependence and loss minimization in multi-label classification
- ML-KNN: A lazy learning approach to multi-label learning
- Multilabel classification via calibrated label ranking
- BoosTexter: A boosting-based system for text categorization
- Least angle regression. (With discussion)
- Combining instance-based learning and logistic regression for multilabel classification
- Label ranking by learning pairwise preferences
- Extensions of Lipschitz mappings into a Hilbert space
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Tighter bounds for random projections of manifolds
- Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming
- 10.1162/153244303322533188
- Algorithmic Learning Theory
- Learning Theory
- On Information and Sufficiency
- Compressed sensing