
Matrix Calculus

Computing vector and matrix derivatives
Screenshot of the Matrix Calculus website
Image: Julien Klaus

Matrix Calculus is an online tool for computing derivatives of linear algebra expressions. The derivatives are computed in vectorized form, that is, as compound expressions for vectors and matrices that avoid the need for indices. Vectorized linear algebra expressions can be readily mapped to highly optimized linear algebra libraries like Eigen and NumPy, and thus be evaluated efficiently.
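To illustrate what a vectorized derivative looks like in practice, here is a small sketch (the example function is our own, not taken from the tool): the derivative of f(x) = x'Ax is, in vectorized form, (A + A')x, a compound expression without indices that NumPy evaluates in a single call.

```python
import numpy as np

# Hypothetical example: the gradient of f(x) = x' A x is, in vectorized
# form, (A + A') x -- a compound matrix expression with no indices, so it
# maps directly onto a single NumPy operation.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

grad = (A + A.T) @ x  # vectorized derivative, evaluated by NumPy

# Sanity check against central finite differences.
eps = 1e-6
fd = np.array([
    ((x + eps * e) @ A @ (x + eps * e)
     - (x - eps * e) @ A @ (x - eps * e)) / (2 * eps)
    for e in np.eye(4)
])
assert np.allclose(grad, fd, atol=1e-5)
```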

Our work on matrix calculus is motivated by our work on generic optimization, where an optimization problem is specified in a simple modelling language and a solver is generated from the specification at the click of a button. However, the stand-alone matrix calculus tool has proven its independent value. Since its launch in 2018, millions of derivatives have been computed by tens of thousands of users from all over the world.

Computing derivatives in vectorized form is a nontrivial task. For many years, vectorized derivatives have been tabulated in repositories like the Matrix Cookbook, but no generic algorithm to compute them was known. We have solved this problem by translating vectorized linear algebra expressions into an index form, computing the derivative in index form, and then translating back into vectorized form. An introduction to our approach can be found in a Research Highlight based on the underlying publications.
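The three-step detour through index form can be sketched on a small example of our own choosing, f(x) = ||Ax - b||^2. Written with explicit summation indices, f = sum_i (sum_j A[i,j] x[j] - b[i])^2, differentiating gives df/dx[k] = 2 sum_i (sum_j A[i,j] x[j] - b[i]) A[i,k], and translating back yields the vectorized expression 2 A'(Ax - b). The snippet below checks that the index-form derivative (written with einsum) agrees with its vectorized counterpart:

```python
import numpy as np

# Index form of f(x) = ||A x - b||^2:
#   f = sum_i (sum_j A[i,j] x[j] - b[i])^2
# Derivative in index form:
#   df/dx[k] = 2 * sum_i (sum_j A[i,j] x[j] - b[i]) * A[i,k]
# Back-translated vectorized form: 2 A' (A x - b)
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)
x = rng.standard_normal(3)

# Derivative with explicit summation indices (einsum mirrors the index form).
residual = np.einsum('ij,j->i', A, x) - b
grad_index = 2 * np.einsum('ik,i->k', A, residual)

# The same derivative after translating back into vectorized form.
grad_vec = 2 * A.T @ (A @ x - b)

assert np.allclose(grad_index, grad_vec)
```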

Social media: Matrix calculus on Hacker News and on Andrew Gelman's blog.

We use our matrix calculus for computing convexity certificates for a fairly general class of expressions that covers much of classical machine learning. Details can be found in the following publication:

  •  J. Klaus, N. Merk, K. Wiedom, S. Laue, and J. Giesen. Convexity Certificates from Hessians. In Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS), 2022.
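As a rough numeric illustration of the principle behind such certificates (not the symbolic method of the paper): a twice-differentiable function is convex iff its Hessian is positive semidefinite everywhere. For the least-squares objective f(x) = ||Ax - b||^2 the Hessian is the constant matrix 2 A'A, so a single eigenvalue check certifies convexity:

```python
import numpy as np

# Illustration only, not the paper's algorithm: f(x) = ||A x - b||^2 has the
# constant Hessian 2 A' A, so checking that its eigenvalues are nonnegative
# certifies that f is convex.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 4))

H = 2 * A.T @ A                    # Hessian of ||A x - b||^2
eigvals = np.linalg.eigvalsh(H)    # eigensolver for symmetric matrices
is_convex = bool(np.all(eigvals >= -1e-10))
assert is_convex
```

The tolerance -1e-10 guards against harmless negative round-off in the eigensolver.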