Do you have one hour? Do you want to LEARN linear algebra for machine learning for FREE? Read the Linear Algebra series NOW, work through the subjects presented in that series, and WRITE DOWN notes.

**This post is being continuously updated. Last update: 4th of March, 2019.**

## Intuition

- Every time you do a matrix multiplication, you can view it as a transformation of space: squishing or stretching the space, which in turn moves vectors around.

## Notation

Term | Notation (short explanation)
---|---
vector | $\vec{v}$, denoted by a lowercase letter with an arrow above
scalar | any real number, e.g. $2$, $1/3$ or $\pi$
matrix | $A$, denoted by a capital letter; an $m\times n$ matrix
$m\times n$ | $m$ rows (horizontal) by $n$ columns (vertical)
basis vectors | $\hat{i}$, $\hat{j}$, $\hat{k}$, denoted by the letters i, j and k with a hat above
mapping (transformation) | $T: \mathbb{R}^{m} \rightarrow \mathbb{R}^{n}$, a transformation from $m$ dimensions to $n$ dimensions
determinant | a scalar; the factor by which a transformation scales areas (2D) or volumes (3D)
cross product | a vector perpendicular to the plane spanned by two vectors in 3D; its length equals the area of the parallelogram they span
dot product | a scalar; the length of the projection of one vector onto another, scaled by the other vector's length

## Formulas

**Adding vectors:**
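The formula itself appears to be missing here; in standard notation, two 2-dimensional vectors add component-wise:

$$\vec{a} + \vec{b} = \begin{bmatrix} a_1 \\ a_2 \end{bmatrix} + \begin{bmatrix} b_1 \\ b_2 \end{bmatrix} = \begin{bmatrix} a_1 + b_1 \\ a_2 + b_2 \end{bmatrix}$$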

Can also be written without matrices and all the way to *m* dimensions:
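The general form seems to have been omitted; in coordinate notation, for vectors with $m$ components:

$$\vec{a} + \vec{b} = (a_1 + b_1,\; a_2 + b_2,\; \dots,\; a_m + b_m)$$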

**Scalars:**

That is, multiplying a scalar (some constant, a real number) into vector coordinates.
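The formula appears to be missing; scalar multiplication in standard notation, shown here for a 2-dimensional vector:

$$c\,\vec{v} = c \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \begin{bmatrix} c\,v_1 \\ c\,v_2 \end{bmatrix}$$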

**Matrix multiplication (2x2):**
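The formula itself seems to be missing; the standard $2\times 2$ matrix product is:

$$\begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} e & f \\ g & h \end{bmatrix} = \begin{bmatrix} ae + bg & af + bh \\ ce + dg & cf + dh \end{bmatrix}$$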

**Determinant (2-dimensional):**
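The formula appears to be missing; the standard determinant of a $2\times 2$ matrix is:

$$\det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc$$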

**Cross Product:**
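The formula seems to have been omitted; in standard notation, the cross product of two 3D vectors can be written as a symbolic determinant expanded along the basis vectors:

$$\vec{a} \times \vec{b} = \begin{vmatrix} \hat{i} & \hat{j} & \hat{k} \\ a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \end{vmatrix} = (a_2 b_3 - a_3 b_2)\,\hat{i} - (a_1 b_3 - a_3 b_1)\,\hat{j} + (a_1 b_2 - a_2 b_1)\,\hat{k}$$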

**Dot Product:**
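The formula appears to be missing; in standard notation, the dot product of two $m$-dimensional vectors is:

$$\vec{a} \cdot \vec{b} = a_1 b_1 + a_2 b_2 + \cdots + a_m b_m = \|\vec{a}\|\,\|\vec{b}\|\cos\theta$$

where $\theta$ is the angle between the two vectors, which connects the algebraic sum-of-products form to the projection interpretation given in the notation table.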