Do you have 1 hour? Do you want to LEARN Linear Algebra for Machine Learning for FREE? Read the Linear Algebra series NOW. Try to LEARN the subjects covered in that series, and WRITE DOWN notes.

*This post is being continuously updated.
Last update: 5th of February, 2019*


Intuition

  1. Every time you do a matrix multiplication, you can view it as a transformation: the matrix moves every point in space.
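A minimal NumPy sketch of this idea: the columns of a matrix tell you where the basis vectors land after the transformation, so multiplying by the matrix moves every vector accordingly. The rotation matrix below is just an illustrative choice, not from the original post.

```python
import numpy as np

# A 2x2 matrix read as a transformation of the plane:
# its columns say where the basis vectors i-hat and j-hat land.
A = np.array([[0, -1],
              [1,  0]])  # 90-degree counter-clockwise rotation

i_hat = np.array([1, 0])
j_hat = np.array([0, 1])

print(A @ i_hat)  # [0 1]  -- i-hat lands on the first column of A
print(A @ j_hat)  # [-1 0] -- j-hat lands on the second column of A
```

Any vector is a combination of the basis vectors, so knowing where the columns send $\hat{i}$ and $\hat{j}$ determines the whole transformation.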

Notation

| Term | Notation (denotation, short explanation) |
| --- | --- |
| vector | $\vec{v}$, denoted by small letters with an arrow above |
| scalar | any real number, e.g. $2$, $1/3$ or $\pi$ |
| matrix | $A$, denoted by capital letters; an $m\times n$ matrix |
| $m\times n$ | $m$ rows (horizontal) times $n$ columns (vertical) |
| basis vectors | $\hat{i}$, $\hat{j}$, $\hat{k}$, denoted by the letters i, j and k with a hat above |
| mapping (transformation) | $T: \mathbb{R}^{m} \rightarrow \mathbb{R}^{n}$, a transformation from $m$ dimensions to $n$ dimensions |
| determinant | scalar, the factor by which a transformation scales areas (2D) or volumes (3D) |
| cross product | vector, perpendicular to the plane of two vectors in 3D; its length equals the area of the parallelogram they span |
| dot product | scalar, measuring how far one vector lands (projects) on another |
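The last three entries can be checked numerically. A small sketch with NumPy, using made-up example vectors (not from the original post):

```python
import numpy as np

u = np.array([3.0, 0.0, 0.0])
v = np.array([1.0, 2.0, 0.0])

# dot product: a scalar, how far v lands (projects) on u, times u's length
print(np.dot(u, v))    # 3.0

# cross product: a vector perpendicular to the plane spanned by u and v;
# its length (here 6) is the area of the parallelogram they span
print(np.cross(u, v))  # [0. 0. 6.]

# determinant: a scalar, the factor by which the matrix scales areas
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
print(np.linalg.det(A))  # ~6.0
```

Note how the determinant of $A$ matches the cross-product length: both measure the area of the parallelogram spanned by the two column (or input) vectors.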

Formulas