Do you have 1 hour? Do you want to LEARN Linear Algebra for Machine Learning for FREE? Read the Linear Algebra series NOW. Try to LEARN the subjects presented in that series, and WRITE DOWN notes.

*This post is being continuously updated.
Last update: 4th of March, 2019*


  1. Every time you do matrix multiplication, you can view it as a transformation that moves space: squishing or stretching it, and thereby moving every vector in it.
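This view can be sketched in NumPy: applying a matrix to a vector moves that vector. The matrix and vector below are example values chosen for illustration.

```python
import numpy as np

# A 2x2 matrix viewed as a transformation of 2D space.
# This example matrix stretches the x-axis by 2 and shears y into x.
A = np.array([[2, 1],
              [0, 1]])

v = np.array([1, 1])  # original vector
moved = A @ v         # where the vector lands after space is transformed

print(moved)          # [3 1]
```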


| Term | Notation (short explanation) |
| --- | --- |
| vector | $\vec{v}$, denoted by a lowercase letter with an arrow above |
| scalar | any real number, e.g. $2$, $1/3$ or $\pi$ |
| matrix | $A$, denoted by a capital letter; an $m \times n$ matrix |
| $m \times n$ | $m$ rows (horizontal) times $n$ columns (vertical) |
| basis vectors | $\hat{i}$, $\hat{j}$, $\hat{k}$, denoted by the letters i, j and k with a hat above |
| mapping (transformation) | $T: \mathbb{R}^{m} \rightarrow \mathbb{R}^{n}$, a transformation from $m$ dimensions to $n$ dimensions |
| determinant | a scalar; the factor by which a transformation scales areas (2D) or volumes (3D) |
| cross product | a vector perpendicular to the plane of two vectors in 3D, whose length is the area of the parallelogram they span |
| dot product | a scalar; measures where one vector lands when projected onto another |


Adding two vectors $\vec{u} = (x_1, y_1)$ and $\vec{v} = (x_2, y_2)$:

$$ \vec{u} + \vec{v} = \begin{bmatrix} x_1 + x_2\\ y_1 + y_2 \end{bmatrix} $$

This can also be written without matrices, generalized to $n$ dimensions:

$$ \vec{u} + \vec{v} = (u_1+v_1,u_2+v_2, ..., u_n+v_n) $$
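In NumPy, vector addition is elementwise, matching the formula above. The vectors here are example values:

```python
import numpy as np

u = np.array([2, 3])
v = np.array([1, -1])

# Elementwise addition: (u1+v1, u2+v2, ..., un+vn)
s = u + v
print(s)  # [3 2]
```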


Scalar multiplication:

$$ \vec{v} = 3 \begin{bmatrix} 2\\ 1 \end{bmatrix} = \begin{bmatrix} 3(2)\\ 3(1) \end{bmatrix} = \begin{bmatrix} 6\\ 3 \end{bmatrix} $$

That is, multiplying a scalar (a constant real number) into each coordinate of the vector.
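The same worked example in NumPy, where the scalar multiplies into every coordinate:

```python
import numpy as np

v = np.array([2, 1])
scaled = 3 * v   # multiply the scalar into each coordinate
print(scaled)    # [6 3]
```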

Matrix multiplication (2x2):

$$ \begin{bmatrix} a & b\\ c & d \end{bmatrix} \begin{bmatrix} x\\ y \end{bmatrix} = x \begin{bmatrix} a\\ c \end{bmatrix} + y \begin{bmatrix} b\\ d \end{bmatrix} = \begin{bmatrix} ax + by\\ cx + dy \end{bmatrix} $$
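The middle step above says a matrix-vector product is a weighted sum of the matrix's columns. A sketch in NumPy, with example values for the matrix and vector, verifies both sides agree:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])    # columns [a, c] and [b, d]
xy = np.array([5, 6])     # the vector [x, y]

# A @ xy is the same as x times the first column plus y times the second
by_columns = xy[0] * A[:, 0] + xy[1] * A[:, 1]

print(A @ xy)       # [17 39]
print(by_columns)   # [17 39]
```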

Determinant (2-dimensional):

$$ \det\left(\begin{bmatrix} a & b \\c & d \end{bmatrix}\right) = ad-bc$$
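Checking the $ad - bc$ formula against NumPy's determinant, with an example matrix:

```python
import numpy as np

A = np.array([[3, 2],
              [1, 4]])

# ad - bc = 3*4 - 2*1 = 10
d = np.linalg.det(A)
print(round(d))  # 10
```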

Cross Product:

$$\begin{bmatrix}v_1 \\v_2 \\v_3\end{bmatrix}\times\begin{bmatrix}w_1 \\w_2 \\w_3\end{bmatrix}=\det\left(\begin{bmatrix}\hat{i} & v_1 & w_1 \\\hat{j} & v_2 & w_2 \\\hat{k} & v_3 & w_3\end{bmatrix}\right)=\hat{i}(v_2w_3-v_3w_2)+\hat{j}(v_3w_1-v_1w_3)+\hat{k}(v_1w_2-v_2w_1)$$
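A quick NumPy check of the formula, using the x- and y-axis unit vectors as example inputs; their cross product should be the z-axis unit vector, perpendicular to both:

```python
import numpy as np

v = np.array([1, 0, 0])  # x-axis unit vector
w = np.array([0, 1, 0])  # y-axis unit vector

# The result is perpendicular to the plane spanned by v and w
c = np.cross(v, w)
print(c)  # [0 0 1]
```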

Dot Product:
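The formula is not filled in yet above; the standard definition, consistent with the table entry (a scalar, where one vector lands when projected onto another), is:

$$\vec{v}\cdot\vec{w} = v_1w_1 + v_2w_2 + ... + v_nw_n = \|\vec{v}\| \, \|\vec{w}\| \cos\theta$$

A sketch in NumPy with example vectors:

```python
import numpy as np

v = np.array([1, 2, 3])
w = np.array([4, 5, 6])

# Sum of elementwise products: 1*4 + 2*5 + 3*6
dp = np.dot(v, w)
print(dp)  # 32
```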