Poster

Computing Higher Order Derivatives of Matrix and Tensor Expressions

Sören Laue · Matthias Mitterreiter · Joachim Giesen

Room 210 #28

Keywords: [ Convex Optimization ] [ Optimization ] [ Non-Convex Optimization ] [ AutoML ]


Abstract:

Optimization is an integral part of most machine learning systems, and most numerical optimization schemes rely on the computation of derivatives. Therefore, frameworks for computing derivatives are an active area of machine learning research. Surprisingly, no existing framework has so far been capable of computing higher order matrix and tensor derivatives directly. Here, we close this fundamental gap and present an algorithmic framework for computing matrix and tensor derivatives that extends seamlessly to higher order derivatives. The framework can be used for symbolic as well as for forward and reverse mode automatic differentiation. Experiments show a speedup between one and four orders of magnitude over state-of-the-art frameworks when evaluating higher order derivatives.
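To make the notion of a higher order matrix derivative concrete, the following minimal sketch computes the Hessian of the quadratic form f(x) = xᵀAx, whose closed form is the constant matrix A + Aᵀ. This uses JAX's general-purpose automatic differentiation, not the authors' framework; the matrix A and the evaluation point are illustrative choices.

```python
import numpy as np
import jax
import jax.numpy as jnp

# Illustrative matrix (any square matrix works here).
A = jnp.array([[1.0, 2.0],
               [3.0, 4.0]])

def f(x):
    # Quadratic form f(x) = x^T A x.
    return x @ A @ x

# jax.hessian composes forward- and reverse-mode AD to get a second
# order derivative; for a quadratic form the result is A + A^T at any x.
hess = jax.hessian(f)(jnp.ones(2))

print(np.allclose(np.asarray(hess), np.asarray(A + A.T)))  # True
```

The closed form A + Aᵀ gives a ground truth to check the automatically computed second derivative against, which is the kind of higher order evaluation the abstract's speedup experiments benchmark.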