Slowly Varying Regression under Sparsity

arXiv: 2102.10773
MaRDI QID: Q6361236

Dimitris Bertsimas, Omar Skali Lami, Vassilis Digalakis, Michael Lingzhi Li

Publication date: 21 February 2021

Abstract: We consider the problem of parameter estimation in slowly varying regression models with sparsity constraints. We formulate the problem as a mixed-integer optimization problem and demonstrate that it can be reformulated exactly as a binary convex optimization problem through a novel exact relaxation. The relaxation utilizes a new equality on Moore-Penrose inverses that convexifies the non-convex objective function while coinciding with the original objective on all feasible binary points. This allows us to solve the problem significantly more efficiently, and to provable optimality, using a cutting-plane-type algorithm. We develop a highly optimized implementation of this algorithm, which substantially improves upon the asymptotic computational complexity of a straightforward implementation. We further develop a heuristic method that is guaranteed to produce a feasible solution and, as we empirically illustrate, generates high-quality warm-start solutions for the binary optimization problem. We show, on both synthetic and real-world datasets, that the resulting algorithm outperforms competing formulations in comparable times across a variety of metrics, including out-of-sample predictive performance, support recovery accuracy, and false positive rate. The algorithm enables us to train models with tens of thousands of parameters, is robust to noise, and effectively captures the underlying slowly changing support of the data-generating process.
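
For orientation, here is a minimal LaTeX sketch of the kind of problem the abstract describes; the symbols (X_t, y_t, beta_t, k, lambda, delta) and the exact form of the sparsity and slowness constraints are illustrative assumptions, not the paper's notation:

% Illustrative only: data (X_t, y_t) over periods t = 1,...,T,
% per-period coefficients beta_t, local sparsity level k,
% smoothness weight lambda, and support-change budget delta.
\begin{aligned}
\min_{\beta_1,\dots,\beta_T} \quad
  & \sum_{t=1}^{T} \lVert y_t - X_t \beta_t \rVert_2^2
    + \lambda \sum_{t=2}^{T} \lVert \beta_t - \beta_{t-1} \rVert_2^2 \\
\text{s.t.} \quad
  & \lVert \beta_t \rVert_0 \le k, \qquad t = 1,\dots,T, \\
  & \sum_{t=2}^{T} \bigl| \operatorname{supp}(\beta_t) \,\triangle\, \operatorname{supp}(\beta_{t-1}) \bigr| \le \delta .
\end{aligned}

Modeling each coefficient's membership in the support with a binary indicator variable turns a problem of this shape into a mixed-integer program; the exact relaxation described in the abstract would then operate over those binary variables.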

Has companion code repository: https://github.com/vvdigalakis/ssvregression