Demonstration
Matrix Calculus -- The Power of Symbolic Differentiation
Sören Laue · Matthias Mitterreiter · Joachim Giesen

Wed Dec 06 07:00 PM -- 10:30 PM (PST) @ Pacific Ballroom Concourse #D7
Event URL: http://www.matrixcalculus.org

Numerical optimization is a workhorse of machine learning that often requires the derivation and computation of gradients and Hessians. For learning problems that are modeled by some loss or likelihood function, the gradients and Hessians are typically derived manually, which is a time-consuming and error-prone process. Computing gradients (and Hessians) is also an integral part of deep learning frameworks, which mostly employ automatic differentiation, also known as algorithmic differentiation (typically in reverse mode). At www.MatrixCalculus.org we provide a tool for symbolically computing gradients and Hessians that can be used in the classical setting of loss and likelihood functions, for constrained optimization, and also for deep learning.
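
As a small illustration (not part of the demonstration itself), consider the least-squares loss f(x) = ||Ax - b||^2. A symbolic matrix-calculus tool of the kind demonstrated returns the closed-form gradient 2 A^T (Ax - b) and Hessian 2 A^T A. The NumPy sketch below, using made-up random data, checks that gradient against finite differences:

# Minimal sketch (assumed example, not the authors' code): verify the
# symbolic gradient of f(x) = ||Ax - b||^2 numerically.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)
x = rng.standard_normal(3)

def f(x):
    # the loss f(x) = ||Ax - b||^2
    return np.sum((A @ x - b) ** 2)

grad = 2 * A.T @ (A @ x - b)   # symbolic gradient
hess = 2 * A.T @ A             # symbolic Hessian (constant in x)

# Central finite-difference approximation of the gradient.
eps = 1e-6
fd_grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                    for e in np.eye(3)])
assert np.allclose(grad, fd_grad, atol=1e-4)
print("symbolic gradient matches finite differences")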

Author Information

Sören Laue (Universitaet Jena)
Matthias Mitterreiter (Friedrich Schiller University Jena)
Joachim Giesen (Friedrich-Schiller-Universität Jena)
