Theseus: A Library for Differentiable Nonlinear Optimization
Publication: 6405472
arXiv: 2207.09442
MaRDI QID: Q6405472
Austin Wang, Daniel DeTone, Joseph Ortiz, Luis Pineda, Maurizio Monge, Mustafa Mukadam, Paloma Sodhi, Stuart Anderson, Brandon Amos, Taosha Fan, Jing Dong, Ricky T. Q. Chen, Shobha Venkataraman
Publication date: 19 July 2022
Abstract: We present Theseus, an efficient application-agnostic open source library for differentiable nonlinear least squares (DNLS) optimization built on PyTorch, providing a common framework for end-to-end structured learning in robotics and vision. Existing DNLS implementations are application specific and do not always incorporate many ingredients important for efficiency. Theseus is application-agnostic, as we illustrate with several example applications that are built using the same underlying differentiable components, such as second-order optimizers, standard cost functions, and Lie groups. For efficiency, Theseus incorporates support for sparse solvers, automatic vectorization, batching, GPU acceleration, and gradient computation with implicit differentiation and direct loss minimization. We perform extensive performance evaluation across a set of applications, demonstrating significant efficiency gains and better scalability when these features are incorporated. Project page: https://sites.google.com/view/theseus-ai
Has companion code repository: https://github.com/facebookresearch/theseus
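The abstract centers on differentiable nonlinear least squares, whose core computational step is a second-order (Gauss-Newton-style) update. As a minimal illustration of that problem class, and not of Theseus's own API, the following sketch fits one parameter of an exponential model with plain Gauss-Newton iterations; the function and variable names here are hypothetical, chosen only for this example.

```python
import numpy as np

def gauss_newton(residual_fn, jac_fn, theta0, iters=20):
    """Minimize 0.5 * ||r(theta)||^2 with Gauss-Newton steps.

    Illustrative only: Theseus adds batching, sparse solvers,
    GPU support, and differentiable (implicit) gradients on top
    of this basic iteration.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        r = residual_fn(theta)   # residual vector, shape (m,)
        J = jac_fn(theta)        # Jacobian, shape (m, n)
        # Solve the normal equations J^T J * delta = -J^T r
        delta = np.linalg.solve(J.T @ J, -J.T @ r)
        theta = theta + delta
    return theta

# Toy problem: recover a in y = exp(a * x) from noiseless data (true a = 0.5)
x = np.linspace(0.0, 2.0, 5)
y = np.exp(0.5 * x)

residual = lambda th: np.exp(th[0] * x) - y
jacobian = lambda th: (x * np.exp(th[0] * x)).reshape(-1, 1)

a_hat = gauss_newton(residual, jacobian, theta0=[0.0])
```

In an end-to-end learning setting, such an inner optimizer sits inside a network's forward pass, and the library's implicit differentiation lets gradients of an outer loss flow through the optimum without unrolling every iteration.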