scientific article - MaRDI portal

Publication: 2934047
zbMath: 1319.90051
MaRDI QID: Q2934047

Po-Wei Wang, Chih-Jen Lin

Publication date: 8 December 2014

Full work available at URL: http://jmlr.csail.mit.edu/papers/v15/wang14a.html

Title: unavailable (zbMATH Open Web Interface contents unavailable due to conflicting licenses)



Related Items

Efficient iterative method for SOAV minimization problem with linear equality and box constraints and its linear convergence
Linearly convergent away-step conditional gradient for non-strongly convex functions
Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
Incremental learning for \(\nu\)-support vector regression
Linear convergence of first order methods for non-strongly convex optimization
Randomness and permutations in coordinate descent methods
Methodology and first-order algorithms for solving nonsmooth and non-strongly convex bilevel optimization problems
Consistent algorithms for multiclass classification with an abstain option
An easily computable upper bound on the Hoffman constant for homogeneous inequality systems
Augmented Lagrangian optimization under fixed-point arithmetic
Stochastic block-coordinate gradient projection algorithms for submodular maximization
Multi-label Lagrangian support vector machine with random block coordinate descent method
The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth
RSG: Beating Subgradient Method without Smoothness and Strong Convexity
New characterizations of Hoffman constants for systems of linear constraints
Nonlinear optimization and support vector machines
Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM
New analysis of linear convergence of gradient-type methods via unifying error bound conditions
Restarting the accelerated coordinate descent method with a rough strong convexity estimate
Projection onto a Polyhedron that Exploits Sparsity
A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions
Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Accelerate stochastic subgradient method by leveraging local growth condition
Convergence results of a nested decentralized gradient method for non-strongly convex problems
Distributed block-diagonal approximation methods for regularized empirical risk minimization
Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity