Theseus: A Library for Differentiable Nonlinear Optimization
Luis Pineda · Taosha Fan · Maurizio Monge · Shobha Venkataraman · Paloma Sodhi · Ricky T. Q. Chen · Joseph Ortiz · Daniel DeTone · Austin Wang · Stuart Anderson · Jing Dong · Brandon Amos · Mustafa Mukadam

Tue Nov 29 09:00 AM -- 11:00 AM (PST) @ Hall J #915

We present Theseus, an efficient application-agnostic open source library for differentiable nonlinear least squares (DNLS) optimization built on PyTorch, providing a common framework for end-to-end structured learning in robotics and vision. Existing DNLS implementations are application specific and do not always incorporate many ingredients important for efficiency. Theseus is application-agnostic, as we illustrate with several example applications that are built using the same underlying differentiable components, such as second-order optimizers, standard cost functions, and Lie groups. For efficiency, Theseus incorporates support for sparse solvers, automatic vectorization, batching, GPU acceleration, and gradient computation with implicit differentiation and direct loss minimization. We perform extensive performance evaluations across a set of applications, demonstrating significant efficiency gains and better scalability when these features are incorporated. Project page: https://sites.google.com/view/theseus-ai/
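To illustrate the implicit-differentiation idea mentioned in the abstract, the sketch below differentiates through a nonlinear least squares solve in pure Python. This is not Theseus's API; it is a minimal toy, assuming a single scalar residual r(x, theta) = x^2 - theta, solved with Gauss-Newton and differentiated via the implicit function theorem (at the optimum the objective's gradient g(x, theta) is zero, so dx*/dtheta = -(dg/dx)^{-1} dg/dtheta).

```python
import math

def gauss_newton(theta, x0=1.0, iters=25):
    # Minimize 0.5 * r(x)^2 with residual r(x) = x^2 - theta.
    # The minimizer is x* = sqrt(theta).
    x = x0
    for _ in range(iters):
        r = x * x - theta      # residual
        J = 2.0 * x            # Jacobian dr/dx
        x = x - r / J          # Gauss-Newton step: -(J^T J)^{-1} J^T r
    return x

def implicit_grad(x_star, theta):
    # Gradient of the objective: g(x, theta) = r * dr/dx = (x^2 - theta) * 2x.
    # Implicit function theorem at g(x*, theta) = 0:
    #   dx*/dtheta = -(dg/dx)^{-1} * dg/dtheta
    r = x_star * x_star - theta
    dg_dx = 4.0 * x_star * x_star + 2.0 * r   # d/dx [(x^2 - theta) * 2x]
    dg_dtheta = -2.0 * x_star                 # d/dtheta [(x^2 - theta) * 2x]
    return -dg_dtheta / dg_dx

theta = 4.0
x_star = gauss_newton(theta)         # approx sqrt(4) = 2.0
grad = implicit_grad(x_star, theta)  # approx 1 / (2 * sqrt(theta)) = 0.25
```

Differentiating this way avoids unrolling the solver's iterations through autograd: only the optimality condition at the solution is needed, which is what makes implicit differentiation memory-efficient for second-order optimizers.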

Author Information

Luis Pineda (Facebook AI Research)
Taosha Fan (Northwestern University)
Maurizio Monge
Shobha Venkataraman (Facebook)
Paloma Sodhi (ASAPP, Inc.)
Ricky T. Q. Chen (FAIR Labs, Meta AI)
Joseph Ortiz (Imperial College London)
Daniel DeTone (Meta)
Austin Wang (Carnegie Mellon University)
Stuart Anderson (Facebook)
Jing Dong (Georgia Institute of Technology)
Brandon Amos (Facebook AI Research)
Mustafa Mukadam (Meta AI / FAIR)