Pages that link to "Item:Q5020041"
From MaRDI portal
The following pages link to Deep double descent: where bigger models and more data hurt* (Q5020041):
Displaying 20 items.
- On the influence of optimizers in deep learning-based side-channel analysis (Q832396)
- On the properties of bias-variance decomposition for kNN regression (Q2700517)
- Triple descent and the two kinds of overfitting: where and why do they appear?* (Q5020037)
- Sensitivity-Informed Provable Pruning of Neural Networks (Q5037562)
- Binary Classification of Gaussian Mixtures: Abundance of Support Vectors, Benign Overfitting, and Regularization (Q5065474)
- Overparameterization and Generalization Error: Weighted Trigonometric Interpolation (Q5088865)
- A Unifying Tutorial on Approximate Message Passing (Q5863992)
- Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation (Q5887828)
- Reliable extrapolation of deep neural operators informed by physics or sparse observations (Q6097626)
- Consistent Sparse Deep Learning: Theory and Computation (Q6110715)
- Semi-Supervised Node Classification via Semi-Global Graph Transformer Based on Homogeneity Augmentation (Q6135733)
- Stability of the scattering transform for deformations with minimal regularity (Q6139821)
- Fragility, robustness and antifragility in deep learning (Q6152676)
- On the robustness of sparse counterfactual explanations to adverse perturbations (Q6156854)
- The deep arbitrary polynomial chaos neural network or how deep artificial neural networks could benefit from data-driven homogeneous chaos theory (Q6488834)
- Working with machines in mathematics (Q6554710)
- A data-driven approach for optimal operational and financial commodity hedging (Q6586281)
- Tradeoff of generalization error in unsupervised learning (Q6607283)
- Redundant representations help generalization in wide neural networks (Q6611451)
- Dropout drops double descent (Q6670075)