Quantification on the generalization performance of deep neural network with Tychonoff separation axioms
DOI: 10.1016/j.ins.2022.06.065 · Wikidata: Q114666889 · Scholia: Q114666889 · MaRDI QID: Q6195433
No author found.
Publication date: 13 March 2024
Published in: Information Sciences
Keywords: performance measures; multi-class classification; deep neural network; generalization gap; locally indiscrete topology; Tychonoff separation axioms
MSC classification: Artificial neural networks and deep learning (68T07); Metric spaces, metrizability (54E35); Lower separation axioms (\(T_0\)–\(T_3\), etc.) (54D10); Higher separation axioms (completely regular, normal, perfectly or collectionwise normal, etc.) (54D15); Vector spaces, linear dependence, rank, lineability (15A03)
Cited works:
- Some properties of \((1, 2)^\ast\)-locally closed sets
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness
- Assessing the data complexity of imbalanced datasets
- RSMOTE: a self-adaptive robust SMOTE for imbalanced problems with label noise
- Application of innovative risk early warning mode under big data technology in Internet credit financial risk assessment
- On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
- Parameter estimation in systems exhibiting spatially complex solutions via persistent homology and machine learning
- Topological Approaches to Deep Learning