Pages that link to "Item:Q6038848"
From MaRDI portal
The following pages link to Stochastic gradient descent with noise of machine learning type. I: Discrete time analysis (Q6038848):
Displaying 13 items.
- On large batch training and sharp minima: a Fokker-Planck perspective (Q828491)
- Semigroups of stochastic gradient descent and online principal component analysis: properties and diffusion approximations (Q1990549)
- Convergence of stochastic gradient descent in deep neural network (Q2025203)
- Analysis of stochastic gradient descent in continuous time (Q2058762)
- Lower error bounds for the stochastic gradient descent optimization algorithm: sharp convergence rates for slowly and fast decaying learning rates (Q2303416)
- Strong error analysis for stochastic gradient descent optimization algorithms (Q4964091)
- One-dimensional system arising in stochastic gradient descent (Q5022277)
- The inverse variance–flatness relation in stochastic gradient descent is critical for finding flat minima (Q5073270)
- Stochastic Gradient Descent in Continuous Time: A Central Limit Theorem (Q5119415)
- Analysis of kinetic models for label switching and stochastic gradient descent (Q6106920)
- Nonlinear Gradient Mappings and Stochastic Optimization: A General Framework with Applications to Heavy-Tail Noise (Q6155875)
- Stochastic gradient descent with noise of machine learning type. II: Continuous time analysis (Q6188971)
- On the existence of minimizers in shallow residual ReLU neural network optimization landscapes (Q6652417)