25th July 2021

Diagonal of Squared Jacobian

Assume \(f\) is an \(m\)-vector-valued function of \(n\) variables, i.e., \(f:U\subseteq\mathbb{R}^n\to\mathbb{R}^m\). The Jacobian is given by

$$ J = \begin{pmatrix} f_{x_1}^{(1)} & \cdots & f_{x_n}^{(1)} \\ \vdots & \ddots & \vdots \\ f_{x_1}^{(m)} & \cdots & f_{x_n}^{(m)} \\ \end{pmatrix} $$

We use

$$ f_{x_i}^{(j)} = {\partial f^{(j)}\over\partial x_i}. $$

The diagonal elements \(D_{ii}\), \(i=1,\ldots,n\), of \(D := J^T J \in\mathbb{R}^{n\times n}\) are

$$ D_{ii} = \left(f_{x_i}^{(1)}\right)^2 + \left(f_{x_i}^{(2)}\right)^2 + \cdots + \left(f_{x_i}^{(m)}\right)^2 . $$

One diagonal element can be computed with \(m\) multiplications and \(m-1\) additions. All \(n\) diagonal elements can therefore be computed with \(\mathcal{O}(nm)\) multiplications and \(\mathcal{O}(nm)\) additions. That is the same effort as one matrix-vector multiplication with \(J\).
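As a sketch in NumPy (the matrix \(J\) here is an arbitrary random example, chosen only to illustrate the formula): the diagonal of \(J^T J\) is just the vector of squared column norms of \(J\), so it never requires forming \(J^T J\) explicitly.

```python
import numpy as np

# Example Jacobian J of a map f: R^n -> R^m; the entries are arbitrary.
m, n = 4, 3
rng = np.random.default_rng(0)
J = rng.standard_normal((m, n))

# D_ii = sum over j of (f_{x_i}^{(j)})^2, i.e., the squared norm of
# column i of J. Computed with O(n*m) multiplications and additions.
diag_direct = (J * J).sum(axis=0)

# Reference computation: form J^T J explicitly and read off the diagonal.
diag_ref = np.diag(J.T @ J)

print(np.allclose(diag_direct, diag_ref))  # True
```

The direct formula avoids the \(\mathcal{O}(n^2 m)\) cost of forming the full product \(J^T J\).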

Open questions:

  1. How can the diagonal elements be obtained in a matrix-free setting, i.e., when only products \(Jv\) are available?
  2. When projecting with \((1,0,\ldots,0)\), \((0,1,\ldots,0)\), etc., do we incur too much effort?
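The projection idea of the second question can be sketched as follows; here `matvec` is a hypothetical black-box operator returning \(Jv\) (e.g., a forward-mode AD Jacobian-vector product), and the dense `J` exists only to verify the result. Probing with the unit vectors extracts the columns of \(J\), whose squared norms are the diagonal entries of \(J^T J\).

```python
import numpy as np

# Stand-in for a matrix-free setting: we pretend only matvec(v) = J @ v
# is available, not J itself. J here is an arbitrary example matrix.
m, n = 4, 3
rng = np.random.default_rng(1)
J = rng.standard_normal((m, n))

def matvec(v):
    """Black-box Jacobian-vector product J v (e.g., forward-mode AD)."""
    return J @ v

# Probe with the unit vectors e_1, ..., e_n: J e_i is the i-th column
# of J, and its squared norm is the diagonal entry D_ii of J^T J.
diag = np.empty(n)
for i in range(n):
    e = np.zeros(n)
    e[i] = 1.0
    col = matvec(e)       # i-th column of J, one matvec per unit vector
    diag[i] = col @ col   # ||J e_i||^2 = D_ii

print(np.allclose(diag, np.diag(J.T @ J)))  # True
```

This needs \(n\) matvecs; in a genuinely matrix-free setting each matvec may cost roughly one function (or AD) evaluation, so the total effort of the projection approach is about \(n\) such evaluations, which is exactly the cost the second question worries about.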



Categories: Math
Author: Elmar Klausmeier