Learning Theory Estimates with Observations from General Stationary Stochastic Processes
DOI: 10.1162/NECO_a_00870 · zbMath: 1476.68229 · arXiv: 1605.02887 · OpenAlex: W2963651118 · Wikidata: Q50499957 · Scholia: Q50499957 · MaRDI QID: Q5380606
Yun-Long Feng, Johan A. K. Suykens, Hanyuan Hang, Ingo Steinwart
Publication date: 5 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1605.02887
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Nonparametric estimation (62G05)
- Stationary stochastic processes (60G10)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (3)
- Prediction of dynamical time series using kernel based regression and smooth splines
- Spectral algorithms for learning with dependent observations
- Estimation and asymptotic properties of a stationary univariate GARCH(\(p,q\)) process
Cites Work
- Two oracle inequalities for regularized boosting classifiers
- Model selection for weakly dependent time series forecasting
- Spectral estimation of the Lévy density in partially observed affine models
- Estimating conditional quantiles with the help of the pinball loss
- Regularization in kernel learning
- Regularized least square regression with dependent samples
- Learning from dependent observations
- Basic properties of strong mixing conditions. A survey and some open questions
- A note on application of integral operator in learning theory
- A tail inequality for suprema of unbounded empirical processes with applications to Markov chains
- Correlation theory of stationary and related random functions. Volume II: Supplementary notes and references
- Rates of convergence for empirical processes of stationary mixing sequences
- Limits to classification and regression estimation from ergodic processes
- A distribution-free theory of nonparametric regression
- Learning and generalisation. With applications to neural networks.
- Nonlinear time series. Nonparametric and parametric methods
- Concentration of measure inequalities for Markov chains and \(\Phi\)-mixing processes.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Optimal regression rates for SVMs using Gaussian kernels
- A Bernstein-type inequality for some mixing processes and dynamical systems with an application to learning
- Learning rates of regularized regression for exponentially strongly mixing sequence
- Fast learning from \(\alpha\)-mixing observations
- Mathematical Statistics and Stochastic Processes
- Bernstein inequality and moderate deviations under strong mixing conditions
- Least-squares regularized regression with dependent samples and \(q\)-penalty
- A central limit theorem and a strong mixing condition
- Some Limit Theorems for Random Functions. I
- Exponential inequalities and functional estimations for weak dependent data: applications to dynamical systems
- Support Vector Machines
- An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels
- Some Limit Theorems for Random Functions. II
- Bernstein-type large deviations inequalities for partial sums of strong mixing processes
- Minimum complexity regression estimation with weakly dependent observations
- DOI: 10.1162/1532443041424319
- Ergodic Mirror Descent
- Convergence of Distributions Generated by Stationary Stochastic Processes
- Some Limit Theorems for Stationary Processes
- Combinatorial methods in density estimation