The complexity of model classes, and smoothing noisy data
From MaRDI portal
Publication:1274410
DOI: 10.1016/S0167-6911(98)00008-5
zbMath: 0909.93076
OpenAlex: W2178158849
MaRDI QID: Q1274410
Authors: Sanjeev R. Kulkarni, Peter L. Bartlett
Publication date: 12 January 1999
Published in: Systems &amp; Control Letters
Full work available at URL: https://doi.org/10.1016/s0167-6911(98)00008-5
Keywords: system identification; smoothing; computational learning theory; covering numbers; complexity of the model class
MSC classifications: Estimation and detection in stochastic control theory (93E10); Data smoothing in stochastic control theory (93E14)
Related Items (1)
Nonparametric nonlinear regression using polynomial and neural approximators: a numerical comparison
Cites Work
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Universal approximation bounds for superpositions of a sigmoidal function
- Efficient agnostic learning of neural networks with bounded fan-in
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- Rates of convergence of nearest neighbor estimation under arbitrary sampling
- Sample complexity for learning recurrent perceptron mappings
- A note on metric dimension and feedback in discrete time
- Convergence of stochastic processes