Boosting With the L2 Loss
From MaRDI portal
Publication: 4468450
DOI: 10.1198/016214503000125
zbMath: 1041.62029
OpenAlex: W2088883866
Wikidata: Q56169181
Scholia: Q56169181
MaRDI QID: Q4468450
Publication date: 10 June 2004
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1198/016214503000125
MSC classifications:
- 62G08 Nonparametric regression and quantile regression
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
Related Items
- Gaining Outlier Resistance With Progressive Quantiles: Fast Algorithms and Theoretical Studies
- Identification of biomarker‐by‐treatment interactions in randomized clinical trials with survival outcomes and high‐dimensional spaces
- A new approach of subgroup identification for high-dimensional longitudinal data
- Conditional sparse boosting for high-dimensional instrumental variable estimation
- On stability issues in deriving multivariable regression models
- Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
- Boosting of Image Denoising Algorithms
- Variable selection – A review and recommendations for the practicing statistician
- Machine learning based on extended generalized linear model applied in mixture experiments
- Nonparametric Rotations for Sphere-Sphere Regression
- Ensemble of fast learning stochastic gradient boosting
- Stratified Cox models with time‐varying effects for national kidney transplant patients: A new blockwise steepest ascent method
- Fully corrective gradient boosting with squared hinge: fast learning rates and early stopping
- On the selection of predictors by using greedy algorithms and information theoretic criteria
- Boosting Distributional Copula Regression
- Accelerated Componentwise Gradient Boosting Using Efficient Data Representation and Momentum-Based Optimization
- A boosting first-hitting-time model for survival analysis in high-dimensional settings
- Bayesian variable selection and estimation in semiparametric joint models of multivariate longitudinal and survival data
- Probabilistic forecast reconciliation: properties, evaluation and score optimisation
- Infinitesimal gradient boosting
- Functional Additive Models on Manifolds of Planar Shapes and Forms
- Spectral Algorithms for Supervised Learning
- Estimation and inference of treatment effects with \(L_2\)-boosting in high-dimensional settings
- Unbiased Boosting Estimation for Censored Survival Data
- Privacy-preserving and lossless distributed estimation of high-dimensional generalized additive mixed models
- Data sharpening on unknown manifold
- Evolution of high-frequency systematic trading: a performance-driven gradient boosting model
- Generalized additive models with unknown link function including variable selection
- An overview of techniques for linking high‐dimensional molecular data to time‐to‐event endpoints by risk prediction models
- Boosting method for nonlinear transformation models with censored survival data
- Boosting with missing predictors
- Nonparametric Regression Based Image Analysis
- Stochastic boosting algorithms
- Theory of Classification: a Survey of Some Recent Advances
- Remembering Leo
- Discussion on "Two-Stage Procedures for High-Dimensional Data" by Makoto Aoshima and Kazuyoshi Yata
- The functional linear array model
- Subject-specific Bradley–Terry–Luce models with implicit variable selection
- Random gradient boosting for predicting conditional quantiles
- Dimension reduction boosting
- Variable Selection and Model Choice in Geoadditive Regression Models
- Novel Aggregate Deletion/Substitution/Addition Learning Algorithms for Recursive Partitioning
- Generalized Additive Models for Pair-Copula Constructions
- Modelling Price Paths in On-Line Auctions: Smoothing Sparse and Unevenly Sampled Curves by Using Semiparametric Mixed Models
- Detection of differential item functioning in Rasch models by boosting techniques
- General Sparse Boosting: Improving Feature Selection of L2Boosting by Correlation-Based Penalty Family
- CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration
- Optimization by Gradient Boosting
- High-Dimensional Data Classification
- Regularization: From Inverse Problems to Large-Scale Machine Learning
- Greedy algorithms for prediction
- Two-step sparse boosting for high-dimensional longitudinal data with varying coefficients
- Penalized likelihood and Bayesian function selection in regression models
- A review on instance ranking problems in statistical learning
- Variable selection in general multinomial logit models
- Group orthogonal greedy algorithm for change-point estimation of multivariate time series
- Population theory for boosting ensembles.
- On the Bayes-risk consistency of regularized boosting methods.
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Boosting for real and functional samples: an application to an environmental problem
- Boosting techniques for nonlinear time series models
- A unified framework of constrained regression
- Network-based naive Bayes model for social network
- Econometric estimation with high-dimensional moment equalities
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Mean and quantile boosting for partially linear additive models
- Improved nearest neighbor classifiers by weighting and selection of predictors
- Logitboost autoregressive networks
- A geometrical approach to iterative isotone regression
- An integrated approach of data envelopment analysis and boosted generalized linear mixed models for efficiency assessment
- An update on statistical boosting in biomedicine
- A multicriteria approach to find predictive and sparse models with stable feature selection for high-dimensional data
- \(L_{2}\) boosting in kernel regression
- Small area estimation of the homeless in Los Angeles: an application of cost-sensitive stochastic gradient boosting
- Component selection in additive quantile regression models
- Variable selection in functional additive regression models
- Variable selection for generalized linear mixed models by \(L_1\)-penalized estimation
- Sparse conjugate directions pursuit with application to fixed-size kernel models
- Accelerated gradient boosting
- Forecasting with many predictors: is boosting a viable alternative?
- Practical variable selection for generalized additive models
- Density estimation with minimization of \(U\)-divergence
- Boosting local quasi-likelihood estimators
- Boosting flexible functional regression models with a high number of functional historical effects
- Marginal integration for nonparametric causal inference
- Gradient boosting for distributional regression: faster tuning and improved variable selection via noncyclical updates
- Pathway-based kernel boosting for the analysis of genome-wide association studies
- Multi-output learning via spectral filtering
- Semiparametric regression during 2003–2007
- Forest Garrote
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies
- Smoothed residual stopping for statistical inverse problems via truncated SVD estimation
- Characterizing \(L_{2}\)Boosting
- Comment on: "Support vector machines with applications"
- Boosting algorithms: regularization, prediction and model fitting
- Comment on: Boosting algorithms: regularization, prediction and model fitting
- Adaptive kernel methods using the balancing principle
- Functional gradient ascent for probit regression
- Controlled sequential Monte Carlo
- Early stopping in \(L_{2}\)Boosting
- On the consistency of multi-label learning
- Invariance, causality and robustness
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- Boosting multi-features with prior knowledge for mini unmanned helicopter landmark detection
- Boosting iterative stochastic ensemble method for nonlinear calibration of subsurface flow models
- Regression with stagewise minimization on risk function
- Sparse HP filter: finding kinks in the COVID-19 contact rate
- CAM: causal additive models, high-dimensional order search and penalized regression
- Optimal rates for regularization of statistical inverse learning problems
- Improved outcome prediction across data sources through robust parameter tuning
- Analysis of a two-layer neural network via displacement convexity
- Response shrinkage estimators in binary regression
- Boosting additive models using component-wise P-splines
- Shrinkage and model selection with correlated variables via weighted fusion
- Boosting nonlinear additive autoregressive time series
- Variable selection and model choice in structured survival models
- Multinomial logit models with implicit variable selection
- Model-based boosting in R: a hands-on tutorial using the R package mboost
- Remembering Leo Breiman
- Remembrance of Leo Breiman
- Node harvest
- Sparse modeling of categorial explanatory variables
- Boosting for high-dimensional linear models
- Boosting with structural sparsity: a differential inclusion approach
- On the choice and influence of the number of boosting steps for high-dimensional linear Cox-models
- Block-based refitting in \(\ell_{12}\) sparse regularization
- Forecasting financial and macroeconomic variables using data reduction methods: new empirical evidence
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses
- High-dimensional classification using features annealed independence rules
- Robust boosting for regression problems
- Tree-structured modelling of categorical predictors in generalized additive regression
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost}
- Sparse recovery via differential inclusions
- Nonparametric estimation of the link function including variable selection
- Iterative bias reduction: a comparative study
- Transformation boosting machines
- Inference for \(L_2\)-boosting
- Boosted nonparametric hazards with time-dependent covariates
- Knot selection by boosting techniques
- Boosting ridge regression
- A stochastic approximation view of boosting
- On boosting kernel regression
- Order selection for possibly infinite-order non-stationary time series
- On Lasso refitting strategies
- Scalar on network regression via boosting
- High-dimensional additive modeling
- Boosting as a kernel-based method
- Adaptive step-length selection in gradient boosting for Gaussian location and scale models
- Boosting with early stopping: convergence and consistency
- A new accelerated proximal boosting machine with convergence rate \(O(1/t^2)\)