Learning from dependent observations

From MaRDI portal
Publication: 958916

DOI: 10.1016/j.jmva.2008.04.001
zbMath: 1158.68040
arXiv: 0707.0303
OpenAlex: W2021440127
Wikidata: Q59196395
Scholia: Q59196395
MaRDI QID: Q958916

Clint Scovel, Don Hush, Ingo Steinwart

Publication date: 10 December 2008

Published in: Journal of Multivariate Analysis

Full work available at URL: https://arxiv.org/abs/0707.0303



Related Items

Error analysis of the moving least-squares method with non-identical sampling
Sampling and empirical risk minimization
Convergence rate for the moving least-squares learning with dependent sampling
Consistent identification of Wiener systems: a machine learning viewpoint
Prediction of dynamical time series using kernel based regression and smooth splines
An oracle inequality for regularized risk minimizers with strongly mixing observations
Exponential inequalities for nonstationary Markov chains
Adaptive group Lasso neural network models for functions of few variables and time-dependent data
Generalization bounds of ERM algorithm with Markov chain samples
Learning performance of Tikhonov regularization algorithm with geometrically beta-mixing observations
Improved estimation of relaxation time in nonreversible Markov chains
Universal regression with adversarial responses
High-dimensional VAR with low-rank transition
Learning from regularized regression algorithms with \(p\)-order Markov chain sampling
Learning from non-irreducible Markov chains
Generalization bounds of ERM algorithm with \(V\)-geometrically ergodic Markov chains
Statistical learning based on Markovian data maximal deviation inequalities and learning rates
On biased random walks, corrupted intervals, and learning under adversarial design
LEARNING GRADIENTS FROM NONIDENTICAL DATA
Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
Regularized least-squares regression: learning from a sequence
Fast learning from \(\alpha\)-mixing observations
The generalization performance of ERM algorithm with strongly mixing observations
Learning Theory Estimates with Observations from General Stationary Stochastic Processes
Classification with non-i.i.d. sampling
Compressed classification learning with Markov chain samples
Generalization performance of least-square regularized regression algorithm with Markov chain samples
Qualitative robustness of estimators on stochastic processes
Learning performance of regularized regression with multiscale kernels based on Markov observations
Prediction of time series by statistical learning: general losses and fast rates
Least-square regularized regression with non-iid sampling
System identification using kernel-based regularization: new insights on stability and consistency issues
Measuring the Capacity of Sets of Functions in the Analysis of ERM
GENERALIZATION BOUNDS OF REGULARIZATION ALGORITHMS DERIVED SIMULTANEOUSLY THROUGH HYPOTHESIS SPACE COMPLEXITY, ALGORITHMIC STABILITY AND DATA QUALITY
Non parametric learning approach to estimate conditional quantiles in the dependent functional data case
Consistency of support vector machines for forecasting the evolution of an unknown ergodic dynamical system from observations with unknown noise
Learning from uniformly ergodic Markov chains
Mixing time estimation in reversible Markov chains from a single sample path
Generalization performance of Gaussian kernels SVMC based on Markov sampling
Error analysis of the moving least-squares regression learning algorithm with β-mixing and non-identical sampling



Cites Work