Some worst-case datasets of deterministic first-order methods for solving binary logistic regression
From MaRDI portal
Publication:2028922
DOI: 10.3934/ipi.2020047
zbMath: 1469.90090
arXiv: 1908.04091
OpenAlex: W3047706078
MaRDI QID: Q2028922
Publication date: 3 June 2021
Published in: Inverse Problems and Imaging
Full work available at URL: https://arxiv.org/abs/1908.04091
nonlinear optimization; information-based complexity; lower complexity bound; first-order methods; binary logistic regression
Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
Cites Work
- On lower complexity bounds for large-scale smooth convex optimization
- The exact information-based complexity of smooth convex minimization
- Information-based complexity of linear operator equations
- Introductory lectures on convex optimization. A basic course.
- Self-concordant analysis for logistic regression
- Lower bounds for finding stationary points I
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization