General five-step discrete-time Zhang neural network for time-varying nonlinear optimization
Publication: 2305651
DOI: 10.1007/s40840-019-00770-4
zbMath: 1440.90075
OpenAlex: W2942682251
MaRDI QID: Q2305651
Publication date: 11 March 2020
Published in: Bulletin of the Malaysian Mathematical Sciences Society. Second Series
Full work available at URL: https://doi.org/10.1007/s40840-019-00770-4
Related Items (3)
- Relationship between time-instant number and precision of ZeaD formulas with proofs
- Noise-tolerant continuous-time Zhang neural networks for time-varying Sylvester tensor equations
- A novel noise-tolerant Zhang neural network for time-varying Lyapunov equation
Cites Work
- Design and analysis of two discrete-time ZD algorithms for time-varying nonlinear minimization
- General four-step discrete-time zeroing and derivative dynamics applied to time-varying nonlinear optimization
- Presentation, error analysis and numerical experiments on a group of 1-step-ahead numerical differentiation formulas
- Discrete-time Zhang neural networks for time-varying nonlinear optimization
- Generalized Peaceman-Rachford splitting method for multiple-block separable convex programming with applications to robust PCA
- Stepsize domain confirmation and optimum of ZeaD formula for future optimization
- New five-step DTZD algorithm for future nonlinear minimization with quartic steady-state error pattern
- Taylor-type 1-step-ahead numerical differentiation rule for first-order derivative approximation and ZNN discretization
- Noise-Tolerant ZNN Models for Solving Time-Varying Zero-Finding Problems: A Control-Theoretic Approach