ACCELERATING GENERALIZED ITERATIVE SCALING BASED ON STAGGERED AITKEN METHOD FOR ON-LINE CONDITIONAL RANDOM FIELDS
DOI: 10.1142/S0219691312500592 · zbMath: 1261.65064 · OpenAlex: W2080401122 · MaRDI QID: Q4910879
Seong-Whan Lee, Heung-Il Suk, Hee-Deok Yang
Publication date: 13 March 2013
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691312500592
Keywords: convergence; numerical examples; stochastic optimization methods; Aitken acceleration; generalized iterative scaling; sequence labeling; on-line conditional random field
MSC classifications: Random fields (60G60); Numerical mathematical programming methods (65K05); Stochastic programming (90C15)
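The keywords center on Aitken acceleration of an iterative-scaling optimizer. As background, the classic Aitken delta-squared scheme extrapolates a linearly convergent fixed-point iteration x = g(x) to speed up convergence. Below is a minimal sketch of that classic scheme for a scalar map; the function name is illustrative, and the paper's "staggered" variant applied to generalized iterative scaling for on-line CRFs is not reproduced here.

```python
import math

def aitken_accelerate(g, x0, eps=1e-12, max_iter=100):
    """Accelerate the fixed-point iteration x = g(x) with Aitken's
    delta-squared extrapolation (a Steffensen-style loop).

    Illustrative sketch only -- not the staggered variant from the paper.
    """
    x = x0
    for _ in range(max_iter):
        x1 = g(x)
        x2 = g(x1)
        denom = x2 - 2.0 * x1 + x  # second difference of the iterates
        if abs(denom) < eps:       # iterates have (numerically) converged
            return x2
        # Aitken extrapolation: subtract the estimated geometric error term
        x = x - (x1 - x) ** 2 / denom

    return x

# Example: the fixed point of g(x) = cos(x), near 0.739085...
root = aitken_accelerate(math.cos, 1.0)
```

The same idea underlies acceleration of slowly converging scaling updates: each extrapolation step reuses two plain iterations to estimate and cancel the dominant geometric error component.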
Cites Work
- On the limited memory BFGS method for large scale optimization
- Simultaneous spotting of signs and fingerspellings based on hierarchical conditional random fields and boostmap embeddings
- Convergence analysis of gradient descent stochastic algorithms
- Periodic step-size adaptation in second-order gradient descent for single-pass on-line structured learning
- Numerical Optimization
- Introduction to Stochastic Search and Optimization
- On‐line learning for very large data sets