Information-Theoretic Semi-Supervised Metric Learning via Entropy Regularization
From MaRDI portal
Publication:5383793
DOI: 10.1162/NECO_a_00614
zbMath: 1415.68173
arXiv: 1206.4614
OpenAlex: W2157911873
Wikidata: Q44744384
Scholia: Q44744384
MaRDI QID: Q5383793
Bo Dai, Gang Niu, Makoto Yamada, Masashi Sugiyama
Publication date: 20 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1206.4614
Related Items
- Ordinal margin metric learning and its extension for cross-distribution image data
- Spacetime discounted value of network connectivity
- Classification from Triplet Comparison Data
- Information-Theoretic Semi-Supervised Metric Learning via Entropy Regularization
- Cross-Domain Metric and Multiple Kernel Learning Based on Information Theory
Uses Software
Cites Work
- Non-linear metric learning using pairwise similarity and dissimilarity constraints and the geometrical structure of data
- Convex multi-task feature learning
- Sufficient dimension reduction and graphics in regression
- Semi-supervised local Fisher discriminant analysis for dimensionality reduction
- Kernel dimension reduction in regression
- Information Theory and Statistical Mechanics
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Metric Learning
- Maximum Entropy Distribution Estimation with Generalized Regularization
- Information-Theoretic Semi-Supervised Metric Learning via Entropy Regularization
- Model Selection and Estimation in Regression with Grouped Variables