Matrix algebra notes

Capital letters, like $A$, indicate matrices.

Transpose of a matrix

Transposing a matrix $A$, with elements $[A]_{ij} = a_{ij}$, is the operation that switches the row and column positions of each element:

$$[A^t]_{ij} = a_{ji}$$
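A minimal numpy sketch of this element-wise definition (the names here are illustrative, not from the notes):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

# Element-wise definition: [A^t]_{ij} = a_{ji}
At = np.empty((A.shape[1], A.shape[0]), dtype=A.dtype)
for i in range(At.shape[0]):
    for j in range(At.shape[1]):
        At[i, j] = A[j, i]

print(np.array_equal(At, A.T))  # True: matches numpy's built-in transpose
```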

Properties

  • Transpose of the transpose: $(A^t)^t = A$

  • Transpose of the sum: $(A + B)^t = A^t + B^t$

  • Transpose of the product: $(AB)^t = B^t A^t$

Proofs

The first one follows directly from the definition.

The second one is straightforward, since the elements of $(A+B)^t$ are the sums of the corresponding elements of $A^t$ and $B^t$.

The third one follows from the fact that $[AB]_{ij} = \sum_k a_{ik} b_{kj}$, so that $[(AB)^t]_{ij} = [AB]_{ji} = \sum_k a_{jk} b_{ki}$, and $[B^t A^t]_{ij} = \sum_k b^t_{ik} a^t_{kj} = \sum_k b_{ki} a_{jk}$, so the two expressions agree.
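A quick numeric sanity check of the three properties (random matrices chosen here just for illustration; a third matrix $C$ is used so the product is defined):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))

print(np.allclose(A.T.T, A))              # (A^t)^t = A
print(np.allclose((A + B).T, A.T + B.T))  # (A + B)^t = A^t + B^t
print(np.allclose((A @ C).T, C.T @ A.T))  # (AC)^t = C^t A^t
```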

Special types of matrices

  • IDEMPOTENT: $M^2 = M$ (see the sketch below)
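As a concrete instance (my example, not from the notes), the orthogonal projection onto the span of a vector is idempotent:

```python
import numpy as np

# Projection onto span(v): P = v v^T / (v^T v). Projecting twice
# changes nothing, so P is idempotent.
v = np.array([[1.0], [2.0]])
P = (v @ v.T) / (v.T @ v)

print(np.allclose(P @ P, P))  # True: M^2 = M
```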

Matrix convolution

Given two matrices $A$ and $B$ (typically a kernel and an image, as this operation is used in computer vision), their convolution is obtained by multiplying locationally corresponding entries and summing:

$$\mathcal{C} = \sum_{i} \sum_{j} B_{ij}\, A_{n-i,\,n-j}$$

This procedure is loosely related to mathematical convolution.
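A direct sketch of one output entry of this sum, using a hypothetical helper conv_entry; it assumes the shifted indices stay inside $A$:

```python
import numpy as np

def conv_entry(A, B, n, m):
    """One entry: sum_ij B[i, j] * A[n - i, m - j].
    Hypothetical helper; assumes n - i and m - j stay in range."""
    acc = 0.0
    for i in range(B.shape[0]):
        for j in range(B.shape[1]):
            acc += B[i, j] * A[n - i, m - j]
    return acc

A = np.arange(16.0).reshape(4, 4)      # "image"
B = np.array([[1.0, 0.0],
              [0.0, -1.0]])            # "kernel"
print(conv_entry(A, B, 2, 2))          # 1*A[2,2] - 1*A[1,1] = 10 - 5 = 5
```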

Frobenius norm of a matrix

Given a matrix $M$, its Frobenius norm is the square root of the sum of the squares of its elements:

$$||M|| = \sqrt{\sum_{i,j} M_{ij}^2}$$
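A minimal check of this definition against numpy (np.linalg.norm computes the Frobenius norm by default for a 2-D array):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

frob = np.sqrt((M ** 2).sum())              # sqrt of sum of squared entries
print(np.isclose(frob, np.linalg.norm(M)))  # True: matches numpy's default
```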