Approximation with random bases: pro et contra
From MaRDI portal
Publication: 2282874
DOI: 10.1016/j.ins.2015.09.021
zbMath: 1427.68361
arXiv: 1506.04631
OpenAlex: W1786513448
Wikidata: Q56050911 (Scholia: Q56050911)
MaRDI QID: Q2282874
Konstantin I. Sofeikov, I. Yu. Tyukin, Alexander N. Gorban, Danil Prokhorov
Publication date: 20 December 2019
Published in: Information Sciences
Full work available at URL: https://arxiv.org/abs/1506.04631
Artificial neural networks and deep learning (68T07) Approximation algorithms (68W25) Randomized algorithms (68W20)
Related Items (20)
- Probabilistic lower bounds for approximation by shallow perceptron networks
- Blessing of dimensionality at the edge and geometry of few-shot learning
- General stochastic separation theorems with optimal bounds
- Stochastic configuration networks with chaotic maps and hierarchical learning strategy
- Stochastic configuration networks for multi-dimensional integral evaluation
- A geometric view on the role of nonlinear feature maps in few-shot learning
- Sensitivity Analysis of the Neural Networks Randomized Learning
- Prediction of X-ray fluorescence copper grade using regularized stochastic configuration networks
- Correction of AI systems by linear discriminants: probabilistic foundations
- Randomized mixture models for probability density approximation and estimation
- One-trial correction of legacy AI systems and stochastic separation theorems
- Blessing of dimensionality: mathematical foundations of the statistical physics of data
- High-dimensional brain: a tool for encoding and rapid learning of memories by single neurons
- Randomized multi-scale kernels learning with sparsity constraint regularization for regression
- Stochastic separation theorems
- Insights into randomized algorithms for neural networks: practical issues and common pitfalls
- Prescribed performance based model-free adaptive sliding mode constrained control for a class of nonlinear systems
- Robust stochastic configuration networks with kernel density estimation for uncertain data regression
- New Method to Determine Topology of Low-Dimension Manifold Approximating Multidimensional Data Sets
- Correlations of random classifiers on large data sets
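The scheme studied in the paper can be illustrated with a minimal sketch: inner weights and biases of a single-hidden-layer network are drawn at random and frozen, so that only the outer coefficients remain to be fitted, which reduces training to a linear least-squares problem. All names, distributions, and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_basis(x, W, b):
    # Sigmoidal features with random, fixed inner parameters
    # (a hypothetical choice; the paper considers general random bases).
    return np.tanh(np.outer(x, W) + b)

# Target function to approximate on [0, 1].
f = lambda x: np.sin(2 * np.pi * x)

n_features = 200
W = rng.normal(scale=10.0, size=n_features)    # random inner weights, frozen
b = rng.uniform(-10.0, 10.0, size=n_features)  # random biases, frozen

x_train = np.linspace(0.0, 1.0, 400)
Phi = random_basis(x_train, W, b)

# Fit only the outer coefficients: a linear least-squares problem.
c, *_ = np.linalg.lstsq(Phi, f(x_train), rcond=None)

x_test = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(random_basis(x_test, W, b) @ c - f(x_test)))
print(f"max approximation error: {err:.3e}")
```

The "pro" of the title is visible here (no nonlinear optimization is needed); the "contra" is that the achievable error depends on the luck of the random draw and on how many random features are used.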
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Fast decorrelated neural network ensembles with random weights
- Approximation of continuous functions of several variables by an arbitrary nonlinear continuous function of one variable, linear functions, and their superpositions
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Isoperimetry of waists and concentration of maps
- On the applicability conditions for the algorithms of adaptive control in nonconvex problems
- Proportional concentration phenomena on the sphere
- Quasiorthogonal dimension of Euclidean spaces
- Is the \(k\)-NN classifier in high dimensions affected by the curse of dimensionality?
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Universal approximation bounds for superpositions of a sigmoidal function
- Adaptation in the presence of a general nonlinear parameterization: an error model approach
- Neuro-Fuzzy Control of Industrial Systems with Actuator Nonlinearities
- Gamma function asymptotics by an extension of the method of steepest descents
- Adaptive control with nonconvex parameterization
- Adaptation and Parameter Estimation in Systems With Unstable Target Dynamics and Nonlinear Parametrization
- Metric structures for Riemannian and non-Riemannian spaces. Transl. from the French by Sean Michael Bates. With appendices by M. Katz, P. Pansu, and S. Semmes. Edited by J. LaFontaine and P. Pansu
- Approximation by superpositions of a sigmoidal function