An augmented Lagrangian method for training recurrent neural networks
DOI: 10.1137/23M1627614 · MaRDI QID: Q6663228
Chao Zhang, [[Person:6046821|Author name not available]], Xiaojun Chen
Publication date: 14 January 2025
Published in: SIAM Journal on Scientific Computing
Keywords: recurrent neural network, augmented Lagrangian method, block coordinate descent, nonsmooth nonconvex optimization
MSC classification: Artificial neural networks and deep learning (68T07) · Numerical mathematical programming methods (65K05) · Nonconvex programming, global optimization (90C26) · Nonlinear programming (90C30)
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- On Fréchet subdifferentials
- Multiplier rules under mixed assumptions of differentiability and Lipschitz continuity
- An Augmented Lagrangian Method for Non-Lipschitz Nonconvex Programming
- GARCH based artificial neural networks in forecasting conditional variance of stock returns
- Learning Deep Architectures for AI
- Semismooth and Semiconvex Functions in Constrained Optimization
- Linearly Constrained Nonsmooth Optimization for Training Autoencoders
- MultiComposite Nonconvex Optimization for Training Deep Neural Networks
- Computation of second-order directional stationary points for group sparse optimization
- A COMPARISON OF VAR AND NEURAL NETWORKS WITH GENETIC ALGORITHM IN FORECASTING PRICE OF OIL
- An Adaptive Lagrangian-Based Scheme for Nonconvex Composite Optimization