Subsampled first-order optimization methods with applications in imaging
DOI: 10.1007/978-3-030-98661-2_78 · zbMATH Open: 1548.90392 · MaRDI QID: Q6606441
Stefania Bellavia, Benedetta Morini, Tommaso Bianconcini, Nataša Krejić
Publication date: 16 September 2024
Keywords: neural networks; image classification; stochastic gradient; first-order methods; convolutional neural networks; finite-sum minimization
MSC: Applications of mathematical programming (90C90); Nonconvex programming, global optimization (90C26); Learning and adaptive systems in artificial intelligence (68T05); Computing methodologies for image processing (68U10); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
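The keywords above point at the finite-sum setting min_x f(x) = (1/N) Σ_{i=1}^N f_i(x), in which a subsampled first-order method replaces the full gradient by an average over a random sample of the terms. The following is a minimal illustrative sketch of that idea, not code from the publication itself; the least-squares problem data, sample size, and fixed steplength are all placeholder choices:

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 1000, 5
A = rng.standard_normal((N, d))                     # one data row per component f_i
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(N)

def subsampled_grad(x, sample_size):
    """Unbiased estimate of grad f(x) for f(x) = (1/2N) * ||A x - b||^2,
    averaged over `sample_size` randomly chosen components."""
    idx = rng.choice(N, size=sample_size, replace=False)
    return A[idx].T @ (A[idx] @ x - b[idx]) / sample_size

x = np.zeros(d)
for _ in range(200):
    x -= 0.01 * subsampled_grad(x, sample_size=32)  # fixed steplength, for illustration
print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```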
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- A stochastic quasi-Newton method for large-scale optimization
- On stochastic gradient and subgradient methods with adaptive steplength sequences
- Minimizing finite sums with the stochastic average gradient
- Sample size selection in optimization methods for machine learning
- Efficient sample sizes in stochastic nonlinear programming
- On the behavior of the gradient norm in the steepest descent method
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Stochastic optimization using a trust-region method and random models
- Sub-sampled Newton methods
- New stochastic approximation algorithms with adaptive step sizes
- Line search methods with variable sample size for unconstrained optimization
- Inexact restoration with subsampled trust-region methods for finite-sum minimization
- Newton-type methods for non-convex optimization under inexact Hessian information
- Nonmonotone line search methods with variable sample size
- Minimization of functions having Lipschitz continuous first partial derivatives
- A gradient method for unconstrained optimization in noisy environment
- Inexact restoration approach for minimization with inexact evaluation of the objective function
- Hybrid deterministic-stochastic methods for data fitting
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Accelerated Stochastic Approximation
- Descent direction method with line search for unconstrained optimization in noisy environment
- Robust Stochastic Approximation Approach to Stochastic Programming
- Two-Point Step Size Gradient Methods
- Numerical Optimization
- Accelerated Stochastic Approximation
- A Scaled Stochastic Approximation Algorithm
- Introduction to Stochastic Search and Optimization
- Gradient Convergence in Gradient Methods with Errors
- Neural Networks and Deep Learning
- On the employment of inexact restoration for the minimization of functions whose evaluation is subject to errors
- Optimization Methods for Large-Scale Machine Learning
- Nonlinear stepsize control, trust regions and regularizations for unconstrained optimization
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- An investigation of Newton-Sketch and subsampled Newton methods
- A Stochastic Line Search Method with Expected Complexity Analysis
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- Learning representations by back-propagating errors
- An Introduction to Matrix Concentration Inequalities
- Stochastic Estimation of the Maximum of a Regression Function
- A Stochastic Approximation Method
- Exact and inexact subsampled Newton methods for optimization
- Subsampled inexact Newton methods for minimizing large sums of convex functions
- The elements of statistical learning. Data mining, inference, and prediction