Periodic step-size adaptation in second-order gradient descent for single-pass on-line structured learning
DOI: 10.1007/s10994-009-5142-6 · zbMath: 1470.68116 · OpenAlex: W2102547221 · MaRDI QID: Q1959528
Chun-Nan Hsu, Han-Shen Huang, Yu-Ming Chang, Yuh-Jye Lee
Publication date: 7 October 2010
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-009-5142-6
Keywords: online learning; stochastic gradient descent; convolutional neural networks; conditional random fields; sequence labeling
MSC classification:
- Random fields; image analysis (62M40)
- Learning and adaptive systems in artificial intelligence (68T05)
- Stochastic programming (90C15)
- Online algorithms; streaming algorithms (68W27)
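The keywords above point to step-size adaptation in single-pass stochastic gradient descent. As a purely illustrative sketch of that general idea (not the paper's own periodic step-size adaptation algorithm, which uses second-order information), here is SGD whose step size is adjusted once per fixed period; the function name, the multiplicative decay rule, and all parameter values are assumptions for illustration only:

```python
def sgd_periodic_step_size(grad, w0, data, eta0=0.1, period=100, decay=0.9):
    """Single-pass SGD with a step size adapted every `period` updates.

    Generic sketch: the paper's method instead derives the periodic
    adaptation from second-order (curvature) information.
    """
    w = w0
    eta = eta0
    for t, x in enumerate(data, start=1):
        w = w - eta * grad(w, x)   # one stochastic gradient step
        if t % period == 0:
            eta *= decay           # periodic step-size adaptation (assumed rule)
    return w
```

For instance, with the least-squares gradient `lambda w, x: w - x` and a stream of observations equal to 2.0, the iterate converges to 2.0 in a single pass over the data.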
Related Items (1)
Uses Software
Cites Work
- Pegasos: primal estimated sub-gradient solver for SVM
- Statistical analysis of learning dynamics
- On the global and componentwise rates of convergence of the EM algorithm
- On computing the largest fraction of missing information for the EM algorithm and the worst linear function for data augmentation
- Numerical Optimization
- Introduction to Stochastic Search and Optimization
- On‐line learning for very large data sets