Linear Independence#




Definition of Linear Independence#

Linear Independence

A set of vectors

\(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\)

is linearly independent if the vector equation

\(x_1\mathbf{v_1} + x_2\mathbf{v_2} + \dots + x_p\mathbf{v_p} = \mathbf{0}\)

admits only the trivial solution \(\mathbf{x = 0}\).


On the other hand, a set of vectors

\(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\)

is linearly dependent if the vector equation

\(x_1\mathbf{v_1} + x_2\mathbf{v_2} + \dots + x_p\mathbf{v_p} = \mathbf{0}\)

admits nontrivial solutions.

Let there be a set of vectors \(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\).

If the vector equation \(x_1\mathbf{v_1} + x_2\mathbf{v_2} + \dots + x_p\mathbf{v_p} = \mathbf{0}\) admits nontrivial solutions, then there exist \(r_1, r_2, \dots, r_p \in \mathbb{R}\), not all of which are \(0\) (say, \(\textcolor{#0096FF}{r_k \ne 0}\)), such that

\( \begin{aligned} r_1\mathbf{v_1} && + && \dots && + && r_{k - 1}\mathbf{v_{k - 1}} && + && \textcolor{#0096FF}{r_k}\mathbf{v_k} && + && r_{k + 1}\mathbf{v_{k + 1}} && + && \dots && + && r_p\mathbf{v_p} && = && \mathbf{0} \\ \frac{r_1}{r_k} \mathbf{v_1} && + && \dots && + && \frac{r_{k - 1}}{r_k} \mathbf{v_{k - 1}} && + && \textcolor{#0096FF}{1} \mathbf{v_k} && + && \frac{r_{k + 1}}{r_k} \mathbf{v_{k + 1}} && + && \dots && + && \frac{r_p}{r_k}\mathbf{v_p} && = && \mathbf{0} \\ -\frac{r_1}{r_k} \mathbf{v_1} && - && \dots && - && \frac{r_{k - 1}}{r_k} \mathbf{v_{k - 1}} && && && - && \frac{r_{k + 1}}{r_k} \mathbf{v_{k + 1}} && - && \dots && - && \frac{r_p}{r_k}\mathbf{v_p} && = && \mathbf{v_k} \\ \end {aligned} \)

Therefore, \(\mathbf{v_k} \in \text{span}\{ \mathbf{v_1, \dots, v_{k - 1}, v_{k + 1}, \dots, v_p} \}\) and so

If a set of vectors \(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\) is linearly dependent, then (at least) one of its vectors \(\mathbf{v_k}\) can be written as a linear combination of the remaining vectors \(\mathbf{v_1, \dots, v_{k - 1}, v_{k + 1}, \dots, v_p}\).

Conversely, let there be a set of vectors \(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\) and suppose that one of its vectors \(\mathbf{v_k}\) is a linear combination of the remaining vectors, with coefficients \(c_1, \dots, c_{k - 1}, c_{k + 1}, \dots, c_p \in \mathbb{R}\)

\( \begin{aligned} c_1\mathbf{v_1} && + && \dots && + && c_{k - 1}\mathbf{v_{k - 1}} && && && + && c_{k + 1}\mathbf{v_{k + 1}} && + && \dots && + && c_p\mathbf{v_p} && = && \mathbf{v_k} \\ c_1\mathbf{v_1} && + && \dots && + && c_{k - 1}\mathbf{v_{k - 1}} && + && (-1)\mathbf{v_k} && + && c_{k + 1}\mathbf{v_{k + 1}} && + && \dots && + && c_p\mathbf{v_p} && = && \mathbf{0} \\ \end {aligned} \)

The \(p\)-tuple \((c_1, \dots, c_{k - 1}, -1, c_{k + 1}, \dots, c_p)\) is a nontrivial solution of the vector equation \(x_1\mathbf{v_1} + x_2\mathbf{v_2} + \dots + x_p\mathbf{v_p} = \mathbf{0}\), and so

the set of vectors \(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\) is linearly dependent.

In other words, if the vector \(\mathbf{v_k} \in \text{span}\{ \mathbf{v_1, \dots, v_{k - 1}, v_{k + 1}, \dots, v_p} \}\) then the set of vectors \(\{ \mathbf{v_1, v_2, \dots, v_p} \}\) is linearly dependent.


Theorem on Linear Independence#

Theorem

A set of vectors in \(\mathbb{R}^n\) is linearly dependent iff at least one vector of the set can be written as a linear combination of the other vectors of the set.

A set of vectors in \(\mathbb{R}^n\) is linearly independent iff no vector of the set can be written as a linear combination of the other vectors of the set.


In practice#

Independence#

A set of vectors

\(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\)

is linearly independent if the vector equation

\(x_1 \mathbf{v_1} + x_2 \mathbf{v_2} + \dots + x_p \mathbf{v_p} = \mathbf{0}\)

admits only the trivial solution \(\mathbf{x = 0}\).

This occurs when the homogeneous linear system

\( \underset{\mathbf{A}}{ \begin{bmatrix} \vert & \vert & & \vert \\ \vert & \vert & & \vert \\ \mathbf{v_1} & \mathbf{v_2} & \dots & \mathbf{v_p} \\ \vert & \vert & & \vert \\ \vert & \vert & & \vert \\ \end {bmatrix} } \underset{\mathbf{x}}{ \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_p \\ \end {bmatrix} } = \underset{\mathbf{0}}{ \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ \end {bmatrix} } \)

has no free variable (that is, when every column of its coefficient matrix contains a pivot position).

The columns of a matrix \(\mathbf{A}\) are linearly independent iff the matrix equation \(\mathbf{Ax = 0}\) admits only the trivial solution \(\mathbf{x = 0}\).

The columns of a matrix \(\mathbf{A}\) are linearly independent iff the matrix \(\mathbf{A}\) is row-equivalent to a row echelon matrix with a pivot in each column.
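
In a computational setting this pivot check can be carried out by row reducing the matrix whose columns are the given vectors. A minimal sketch using SymPy (the particular matrix below is an arbitrary illustration, not one of the examples on this page):

```python
from sympy import Matrix

# Columns of A are the vectors v1, v2 (an arbitrary illustrative choice)
A = Matrix([[1, 0],
            [0, 1],
            [2, 3]])

# rref() returns the reduced row echelon form and the tuple of pivot column indices
R, pivot_cols = A.rref()

# The columns are linearly independent exactly when every column is a pivot column
print(R)
print(len(pivot_cols) == A.cols)   # True: a pivot in each of the 2 columns
```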

Dependence#

A set of vectors

\(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\)

is linearly dependent if the vector equation

\(x_1 \mathbf{v_1} + x_2 \mathbf{v_2} + \dots + x_p \mathbf{v_p} = \mathbf{0}\)

admits nontrivial solutions.

This occurs when the homogeneous linear system

\( \underset{\mathbf{A}}{ \begin{bmatrix} \vert & \vert & & \vert \\ \vert & \vert & & \vert \\ \mathbf{v_1} & \mathbf{v_2} & \dots & \mathbf{v_p} \\ \vert & \vert & & \vert \\ \vert & \vert & & \vert \\ \end {bmatrix} } \underset{\mathbf{x}}{ \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_p \\ \end {bmatrix} } = \underset{\mathbf{0}}{ \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ \end {bmatrix} } \)

has at least one free variable (that is, when at least one column of its coefficient matrix does not contain a pivot position).

The columns of a matrix \(\mathbf{A}\) are linearly dependent iff the matrix equation \(\mathbf{Ax = 0}\) admits nontrivial solutions.

The columns of a matrix \(\mathbf{A}\) are linearly dependent iff the matrix \(\mathbf{A}\) is row-equivalent to a row echelon matrix with at least one pivot-free column.
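
The same row reduction detects dependence: a pivot-free column signals a free variable, and the null space of \(\mathbf{A}\) exhibits the nontrivial solutions. A minimal SymPy sketch (again with an arbitrary illustrative matrix, here chosen so that \(\mathbf{v_3} = \mathbf{v_1} + \mathbf{v_2}\)):

```python
from sympy import Matrix

# Columns of A are v1, v2, v3 with v3 = v1 + v2 (an arbitrary illustrative choice)
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [2, 3, 5]])

R, pivot_cols = A.rref()
print(pivot_cols)        # (0, 1): the third column is pivot-free

# nullspace() returns a basis of the solutions of A x = 0;
# a nonempty basis means nontrivial solutions exist
for x in A.nullspace():
    print(x.T)           # Matrix([[-1, -1, 1]]): -v1 - v2 + v3 = 0
```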


Examples#

Example#

PROBLEM STATEMENT

Is the following set of vectors linearly independent?

\( \left \{ \begin{bmatrix*}[r] 1 \\ 3 \\ 5 \\ \end {bmatrix*}, \begin{bmatrix*}[r] 2 \\ 5 \\ 9 \\ \end {bmatrix*}, \begin{bmatrix*}[r] -3 \\ 9 \\ 3 \\ \end {bmatrix*} \right\} \)

ROW REDUCE

\( \begin{bmatrix*}[r] \boxed{1} & 2 & -3 \\ 3 & 5 & 9 \\ 5 & 9 & 3 \\ \end {bmatrix*} \underset{r_2 \leftarrow\,\,\, r_2 + (-3)r_1}{\rightarrow} \begin{bmatrix*}[r] \boxed{1} & 2 & -3 \\ 0 & -1 & 18 \\ 5 & 9 & 3 \\ \end {bmatrix*} \underset{r_3 \leftarrow\,\,\, r_3 + (-5)r_1}{\rightarrow} \begin{bmatrix*}[r] \boxed{1} & 2 & -3 \\ 0 & \boxed{-1} & 18 \\ 0 & -1 & 18 \\ \end {bmatrix*} \)

\( \begin{bmatrix*}[r] \boxed{1} & 2 & -3 \\ 0 & \boxed{-1} & 18 \\ 0 & -1 & 18 \\ \end {bmatrix*} \underset{r_3 \leftarrow\,\,\, r_3 + (-1)r_2}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{1} & 2 & -3 \\ 0 & \boxed{-1} & 18 \\ 0 & 0 & 0 \\ \end {bmatrix*} }_{\textbf{row echelon}} \underset{r_2 \leftarrow\,\,\, (-1)r_2}{\rightarrow} \begin{bmatrix*}[r] \boxed{1} & 2 & -3 \\ 0 & \boxed{1} & -18 \\ 0 & 0 & 0 \\ \end {bmatrix*} \)

The set of vectors is linearly dependent because there is at least one pivot-free column in the row echelon matrix.

\( \begin{bmatrix*}[r] \boxed{1} & 2 & -3 \\ 0 & \boxed{1} & -18 \\ 0 & 0 & 0 \\ \end {bmatrix*} \underset{r_1 \leftarrow\,\,\, r_1 + (-2)r_2}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{1} & 0 & 33 \\ 0 & \boxed{1} & -18 \\ 0 & 0 & 0 \\ \end {bmatrix*} }_{\textbf{reduced}} \)

\( \begin{aligned} x_1 + 33x_3 &= 0 && \iff && x_1 = -33x_3 \\ x_2 - 18x_3 &= 0 && \iff && x_2 = 18x_3 \\ & && && x_3 \,\,\,\text{free} \\ \end {aligned} \)

\( \mathbf{x} = \begin{bmatrix*}[r] -33t \\ 18t \\ t \\ \end{bmatrix*} = t \begin{bmatrix*}[r] -33 \\ 18 \\ 1 \\ \end{bmatrix*} \)

\( \underset{\mathbf{A}}{ \begin{bmatrix*}[r] 1 & 2 & -3 \\ 3 & 5 & 9 \\ 5 & 9 & 3 \\ \end {bmatrix*} } \underset{\mathbf{x}}{ \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \end{bmatrix} } = \underset{\mathbf{0}}{ \begin{bmatrix} 0 \\ 0 \\ 0 \\ \end{bmatrix} } = x_1 \begin{bmatrix*}[r] 1 \\ 3 \\ 5 \\ \end{bmatrix*} + x_2 \begin{bmatrix*}[r] 2 \\ 5 \\ 9 \\ \end{bmatrix*} + x_3 \begin{bmatrix*}[r] -3 \\ 9 \\ 3 \\ \end{bmatrix*} = -33 \begin{bmatrix*}[r] 1 \\ 3 \\ 5 \\ \end{bmatrix*} + 18 \begin{bmatrix*}[r] 2 \\ 5 \\ 9 \\ \end{bmatrix*} + 1 \begin{bmatrix*}[r] -3 \\ 9 \\ 3 \\ \end{bmatrix*} \iff \underbrace{ \begin{bmatrix*}[r] -3 \\ 9 \\ 3 \\ \end{bmatrix*} = 33 \begin{bmatrix*}[r] 1 \\ 3 \\ 5 \\ \end{bmatrix*} - 18 \begin{bmatrix*}[r] 2 \\ 5 \\ 9 \\ \end{bmatrix*} }_{\mathbf{v_3} = 33\mathbf{v_1} - 18\mathbf{v_2}} \)

\(\mathbf{v_3} = 33\mathbf{v_1} - 18\mathbf{v_2}\)
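
This conclusion can be double-checked numerically; a small sketch, assuming NumPy is available:

```python
import numpy as np

v1 = np.array([1, 3, 5])
v2 = np.array([2, 5, 9])
v3 = np.array([-3, 9, 3])

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))            # 2 < 3 columns, so the set is dependent
print(np.allclose(33 * v1 - 18 * v2, v3))  # True: v3 = 33 v1 - 18 v2
```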


Example#

PROBLEM STATEMENT

Is the following set of vectors linearly independent?

\( \left \{ \begin{bmatrix*}[r] 0 \\ 2 \\ 2 \\ \end {bmatrix*}, \begin{bmatrix*}[r] 0 \\ 0 \\ -8 \\ \end {bmatrix*}, \begin{bmatrix*}[r] -1 \\ 3 \\ 1 \\ \end {bmatrix*} \right\} \)

ROW REDUCE

\( \begin{bmatrix*}[r] \boxed{0} & 0 & -1 \\ 2 & 0 & 3 \\ 2 & -8 & 1 \\ \end {bmatrix*} \underset{r_1 \leftrightarrow\,\,\, r_3}{\rightarrow} \begin{bmatrix*}[r] \boxed{2} & -8 & 1 \\ 2 & 0 & 3 \\ 0 & 0 & -1 \\ \end {bmatrix*} \underset{r_2 \leftarrow\,\,\, r_2 + (-1)r_1}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{2} & -8 & 1 \\ 0 & \boxed{8} & 2 \\ 0 & 0 & \boxed{-1} \\ \end {bmatrix*} }_{\textbf{row echelon}} \)

The set of vectors is linearly independent because there is no pivot-free column in the row echelon matrix.

\( \begin{bmatrix*}[r] \boxed{2} & -8 & 1 \\ 0 & \boxed{8} & 2 \\ 0 & 0 & \boxed{-1} \\ \end {bmatrix*} \underset{r_3 \leftarrow\,\,\, (-1)r_3}{\rightarrow} \begin{bmatrix*}[r] \boxed{2} & -8 & 1 \\ 0 & \boxed{8} & 2 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} \underset{r_2 \leftarrow\,\,\, r_2 + (-2)r_3}{\rightarrow} \begin{bmatrix*}[r] \boxed{2} & -8 & 1 \\ 0 & \boxed{8} & 0 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} \)

\( \begin{bmatrix*}[r] \boxed{2} & -8 & 1 \\ 0 & \boxed{8} & 0 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} \underset{r_1 \leftarrow\,\,\, r_1 + (-1)r_3}{\rightarrow} \begin{bmatrix*}[r] \boxed{2} & -8 & 0 \\ 0 & \boxed{8} & 0 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} \underset{r_2 \leftarrow\,\,\, \frac{1}{8} r_2}{\rightarrow} \begin{bmatrix*}[r] \boxed{2} & -8 & 0 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} \)

\( \begin{bmatrix*}[r] \boxed{2} & -8 & 0 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} \underset{r_1 \leftarrow\,\,\, r_1 + 8r_2}{\rightarrow} \begin{bmatrix*}[r] \boxed{2} & 0 & 0 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} \underset{r_1 \leftarrow\,\,\, \frac{1}{2} r_1}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{1} & 0 & 0 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & \boxed{1} \\ \end {bmatrix*} }_{\textbf{reduced}} \)
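
The same check in SymPy confirms the conclusion (a sketch, assuming SymPy is available):

```python
from sympy import Matrix

# Columns of A are the three given vectors
A = Matrix([[0,  0, -1],
            [2,  0,  3],
            [2, -8,  1]])

R, pivot_cols = A.rref()
print(R)              # the 3x3 identity matrix
print(pivot_cols)     # (0, 1, 2): a pivot in every column, so the set is independent
print(A.nullspace())  # []: A x = 0 admits only the trivial solution
```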


Example#

PROBLEM STATEMENT

For which values of \(h\) is the following set of vectors linearly dependent?

\( \left \{ \begin{bmatrix*}[r] 1 \\ -3 \\ -5 \\ \end {bmatrix*}, \begin{bmatrix*}[r] -3 \\ 9 \\ 15 \\ \end {bmatrix*}, \begin{bmatrix*}[r] 2 \\ -5 \\ h \\ \end {bmatrix*} \right\} \)

ROW REDUCE

\( \begin{bmatrix*}[r] \boxed{1} & -3 & 2 \\ -3 & 9 & -5 \\ -5 & 15 & h \\ \end {bmatrix*} \underset{r_2 \leftarrow\,\,\, r_2 + 3r_1}{\rightarrow} \begin{bmatrix*}[r] \boxed{1} & -3 & 2 \\ 0 & 0 & 1 \\ -5 & 15 & h \\ \end {bmatrix*} \underset{r_3 \leftarrow\,\,\, r_3 + 5r_1}{\rightarrow} \begin{bmatrix*}[r] \boxed{1} & -3 & 2 \\ 0 & 0 & \boxed{1} \\ 0 & 0 & h + 10 \\ \end {bmatrix*} \)

\( \begin{bmatrix*}[r] \boxed{1} & -3 & 2 \\ 0 & 0 & \boxed{1} \\ 0 & 0 & h + 10 \\ \end {bmatrix*} \underset{r_3 \leftarrow\,\,\, r_3 + (-h - 10)r_2}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{1} & -3 & 2 \\ 0 & 0 & \boxed{1} \\ 0 & 0 & 0 \\ \end {bmatrix*} }_{\textbf{row echelon}} \underset{r_1 \leftarrow\,\,\, r_1 + (-2)r_2}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{1} & -3 & 0 \\ 0 & 0 & \boxed{1} \\ 0 & 0 & 0 \\ \end {bmatrix*} }_{\textbf{reduced}} \)

The set of vectors is linearly dependent for any value of \(h\) because there is at least one pivot-free column in the row echelon matrix.

\( \begin{aligned} x_1 - 3x_2 &= 0 && \iff && x_1 = 3x_2 \\ & && && x_2 \,\,\,\text{free} \\ & && && x_3 = 0 \\ \end {aligned} \)

\( \mathbf{x} = \begin{bmatrix*}[r] 3t \\ t \\ 0 \\ \end{bmatrix*} = t \begin{bmatrix*}[r] 3 \\ 1 \\ 0 \\ \end{bmatrix*} \)

\( \underset{\mathbf{A}}{ \begin{bmatrix*}[r] 1 & -3 & 2 \\ -3 & 9 & -5 \\ -5 & 15 & h \\ \end {bmatrix*} } \underset{\mathbf{x}}{ \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \end{bmatrix} } = \underset{\mathbf{0}}{ \begin{bmatrix} 0 \\ 0 \\ 0 \\ \end{bmatrix} } = x_1 \begin{bmatrix*}[r] 1 \\ -3 \\ -5 \\ \end{bmatrix*} + x_2 \begin{bmatrix*}[r] -3 \\ 9 \\ 15 \\ \end{bmatrix*} + x_3 \begin{bmatrix*}[r] 2 \\ -5 \\ h \\ \end{bmatrix*} = 3 \begin{bmatrix*}[r] 1 \\ -3 \\ -5 \\ \end{bmatrix*} + 1 \begin{bmatrix*}[r] -3 \\ 9 \\ 15 \\ \end{bmatrix*} + 0 \begin{bmatrix*}[r] 2 \\ -5 \\ h \\ \end{bmatrix*} \iff \underbrace{ \begin{bmatrix*}[r] -3 \\ 9 \\ 15 \\ \end{bmatrix*} = -3 \begin{bmatrix*}[r] 1 \\ -3 \\ -5 \\ \end{bmatrix*} + 0 \begin{bmatrix*}[r] 2 \\ -5 \\ h \\ \end{bmatrix*} }_{\mathbf{v_2} = (-3)\mathbf{v_1} + 0\mathbf{v_3}} \)

\(\mathbf{v_2} = (-3)\mathbf{v_1} + 0\mathbf{v_3}\)
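
Because the row reduction never has to divide by an expression involving \(h\), the same computation can be done symbolically; a SymPy sketch:

```python
from sympy import Matrix, symbols

h = symbols('h')

# Columns of A are the three given vectors, with a symbolic third entry
A = Matrix([[ 1, -3,  2],
            [-3,  9, -5],
            [-5, 15,  h]])

R, pivot_cols = A.rref()
print(pivot_cols)     # (0, 2): the second column is pivot-free for every value of h
print(A.nullspace())  # one basis vector [3, 1, 0], i.e. 3 v1 + v2 = 0, so v2 = -3 v1
```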


Example#

PROBLEM STATEMENT

For which values of \(h\) is the following set of vectors linearly dependent?

\( \left \{ \begin{bmatrix*}[r] 3 \\ -6 \\ 1 \\ \end {bmatrix*}, \begin{bmatrix*}[r] -6 \\ 4 \\ -3 \\ \end {bmatrix*}, \begin{bmatrix*}[r] 9 \\ h \\ 3 \\ \end {bmatrix*} \right\} \)

ROW REDUCE

\( \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ -6 & 4 & h \\ 1 & -3 & 3 \\ \end {bmatrix*} \underset{r_2 \leftarrow\,\,\, r_2 + 2r_1}{\rightarrow} \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ 0 & -8 & h + 18 \\ 1 & -3 & 3 \\ \end {bmatrix*} \underset{r_3 \leftarrow\,\,\, r_3 + (-\frac{1}{3})r_1}{\rightarrow} \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ 0 & \boxed{-8} & h + 18 \\ 0 & -1 & 0 \\ \end {bmatrix*} \)

\( \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ 0 & \boxed{-8} & h + 18 \\ 0 & -1 & 0 \\ \end {bmatrix*} \underset{r_2 \leftrightarrow\,\,\, r_3}{\rightarrow} \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ 0 & \boxed{-1} & 0 \\ 0 & -8 & h + 18 \\ \end {bmatrix*} \underset{r_3 \leftarrow\,\,\, r_3 + (-8)r_2}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ 0 & \boxed{-1} & 0 \\ 0 & 0 & \textcolor{yellow}{\boxed{h + 18}} \\ \end {bmatrix*} }_{\textbf{row echelon}} \)

\( \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ 0 & \boxed{-1} & 0 \\ 0 & 0 & \textcolor{yellow}{\boxed{h + 18}} \\ \end {bmatrix*} \underset{r_2 \leftarrow\,\,\, (-1)r_2}{\rightarrow} \begin{bmatrix*}[r] \boxed{3} & -6 & 9 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & \textcolor{yellow}{\boxed{h + 18}} \\ \end {bmatrix*} \underset{r_1 \leftarrow\,\,\, \frac{1}{3} r_1}{\rightarrow} \begin{bmatrix*}[r] \boxed{1} & -2 & 3 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & \textcolor{yellow}{\boxed{h + 18}} \\ \end {bmatrix*} \)

\( \begin{cases} h + 18 \ne 0 && \iff && h \ne -18 && \text{no pivot-free column and so linearly independent} \\ h + 18 = 0 && \iff && h = -18 && \text{ a pivot-free column and so linearly dependent} \\ \end {cases} \)

\( \begin{bmatrix*}[r] \boxed{1} & -2 & 3 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & 0 \\ \end {bmatrix*} \underset{r_1 \leftarrow\,\,\, r_1 + 2r_2}{\rightarrow} \underbrace{ \begin{bmatrix*}[r] \boxed{1} & 0 & 3 \\ 0 & \boxed{1} & 0 \\ 0 & 0 & 0 \\ \end {bmatrix*} }_{\textbf{reduced}} \)

\( \begin{aligned} x_1 + 3x_3 &= 0 && \iff && x_1 = -3x_3 \\ & && && x_2 = 0 \\ & && && x_3 \,\,\,\text{free} \\ \end {aligned} \)

\( \mathbf{x} = \begin{bmatrix*}[r] -3t \\ 0 \\ t \\ \end{bmatrix*} = t \begin{bmatrix*}[r] -3 \\ 0 \\ 1 \\ \end{bmatrix*} \)

\( \underset{\mathbf{A} \, (h = -18)}{ \begin{bmatrix*}[r] 3 & -6 & 9 \\ -6 & 4 & -18 \\ 1 & -3 & 3 \\ \end {bmatrix*} } \underset{\mathbf{x}}{ \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \end{bmatrix} } = \underset{\mathbf{0}}{ \begin{bmatrix} 0 \\ 0 \\ 0 \\ \end{bmatrix} } = x_1 \begin{bmatrix*}[r] 3 \\ -6 \\ 1 \\ \end{bmatrix*} + x_2 \begin{bmatrix*}[r] -6 \\ 4 \\ -3 \\ \end{bmatrix*} + x_3 \begin{bmatrix*}[r] 9 \\ -18 \\ 3 \\ \end{bmatrix*} = -3 \begin{bmatrix*}[r] 3 \\ -6 \\ 1 \\ \end{bmatrix*} + 0 \begin{bmatrix*}[r] -6 \\ 4 \\ -3 \\ \end{bmatrix*} + 1 \begin{bmatrix*}[r] 9 \\ -18 \\ 3 \\ \end{bmatrix*} \iff \underbrace{ \begin{bmatrix*}[r] 9 \\ -18 \\ 3 \\ \end{bmatrix*} = 3 \begin{bmatrix*}[r] 3 \\ -6 \\ 1 \\ \end{bmatrix*} + 0 \begin{bmatrix*}[r] -6 \\ 4 \\ -3 \\ \end{bmatrix*} }_{\mathbf{v_3} = 3\mathbf{v_1} + 0\mathbf{v_2}} \)

\(\mathbf{v_3} = 3\mathbf{v_1} + 0\mathbf{v_2}\)
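
As a cross-check, for a square matrix the columns are linearly dependent exactly when the determinant is zero (a criterion not used above); a SymPy sketch:

```python
from sympy import Matrix, symbols, solve

h = symbols('h')

A = Matrix([[ 3, -6, 9],
            [-6,  4, h],
            [ 1, -3, 3]])

# The columns of the square matrix A are dependent exactly when det(A) = 0
print(solve(A.det(), h))            # [-18]

# Substituting h = -18 exhibits the dependence relation v3 = 3 v1 + 0 v2
print(A.subs(h, -18).nullspace())   # one basis vector [-3, 0, 1]
```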


Theorem#

Theorem

A set of vectors \(\{ \mathbf{v_1, v_2, \dots, v_p} \} \subset \mathbb{R}^n\) that contains more vectors than there are entries per vector

\(p > n\)

is necessarily linearly dependent.


Example#

PROBLEM STATEMENT

Is the following set of vectors linearly independent?

\( \left \{ \begin{bmatrix*}[r] 2 \\ 7 \\ 6 \\ \end {bmatrix*}, \begin{bmatrix*}[r] 3 \\ 0 \\ 8 \\ \end {bmatrix*}, \begin{bmatrix*}[r] 4 \\ 2 \\ 5 \\ \end {bmatrix*}, \begin{bmatrix*}[r] 5 \\ 1 \\ 9 \\ \end {bmatrix*} \right\} \)

The linear system

\( \begin{bmatrix} 2 & 3 & 4 & 5 \\ 7 & 0 & 2 & 1 \\ 6 & 8 & 5 & 9 \\ \end {bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ \end{bmatrix} \)

has \(3\) equations and \(4\) unknowns.

Since there are more unknowns than equations, the system must have a free variable, and so the set of vectors is linearly dependent.
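
A sketch of the same reasoning in SymPy, showing that the \(3 \times 4\) coefficient matrix cannot have a pivot in every column:

```python
from sympy import Matrix

# Four vectors in R^3 as the columns of a 3 x 4 matrix
A = Matrix([[2, 3, 4, 5],
            [7, 0, 2, 1],
            [6, 8, 5, 9]])

R, pivot_cols = A.rref()
print(len(pivot_cols))   # 3 pivots for 4 columns: at least one column is pivot-free
print(A.nullspace())     # a nonzero vector solving A x = 0, so the set is dependent
```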


A set of vectors containing the zero vector is necessarily linearly dependent#

Let there be a set of vectors \(\{ \mathbf{v_1, v_2, \dots, v_p} \}\) and let one of its vectors be the zero vector, \(\textcolor{#0096FF}{\mathbf{v_k = 0}}\), where \(1 \le k \le p\).

\( \begin{aligned} x_1 \mathbf{v_1} && + && \dots && + && x_{k-1} \mathbf{v_{k-1}} && + && x_k \textcolor{#0096FF}{\mathbf{v_k}} && + && x_{k+1} \mathbf{v_{k+1}} && + && \dots && + && x_p \mathbf{v_p} && = && \mathbf{0} \\ x_1 \mathbf{v_1} && + && \dots && + && x_{k-1} \mathbf{v_{k-1}} && + && x_k \textcolor{#0096FF}{\mathbf{0}} && + && x_{k+1} \mathbf{v_{k+1}} && + && \dots && + && x_p \mathbf{v_p} && = && \mathbf{0} \\ 0 \mathbf{v_1} && + && \dots && + && 0 \mathbf{v_{k-1}} && + && \textcolor{red}{1} \cdot \textcolor{#0096FF}{\mathbf{0}} && + && 0 \mathbf{v_{k+1}} && + && \dots && + && 0 \mathbf{v_p} && = && \mathbf{0} && \text{nontrivial solution}\\ \end {aligned} \)

The set of vectors \(\{ \mathbf{v_1, \dots, v_{k-1}, \textcolor{#0096FF}{\mathbf{0}}, v_{k+1}, \dots, v_p} \}\) is linearly dependent.
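
A quick numerical illustration of this fact (the particular nonzero vectors are an arbitrary choice):

```python
from sympy import Matrix

# Columns: v1, the zero vector, v3 (v1 and v3 chosen so that neither is a multiple of the other)
A = Matrix([[1, 0, 4],
            [2, 0, 5],
            [3, 0, 6]])

print(A.nullspace())   # contains [0, 1, 0]: coefficient 1 on the zero vector, a nontrivial solution
```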


When is a set containing a single vector linearly independent?#

Let there be a set containing a single vector, \(\{ \mathbf{v} \}\).

If \(\mathbf{v = 0}\), then \(x = 1\) is a nontrivial solution of \(x \mathbf{v = 0}\) and so the set is linearly dependent.

If \(\mathbf{v \ne 0}\), then one of the coordinates \(v_k\) of \(\mathbf{v}\) is nonzero and the vector equation \(x \mathbf{v = 0}\) can be rewritten as

\( x \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix} \implies x \cdot v_k = 0 \implies x = 0 \)

The set \(\{ \mathbf{v} \}\) is linearly independent.


When is a set containing two vectors linearly independent?#

A set of two vectors \(\{ \mathbf{v, w} \}\) is linearly dependent iff one of the two vectors is a multiple of the other.

A set of two vectors \(\{ \mathbf{v, w} \}\) is linearly independent iff neither of the two vectors is a multiple of the other.

Example#

Is the following set of vectors linearly independent?

\( \left \{ \underset{\mathbf{v}}{ \begin{bmatrix} 2 \\ 1 \\ \end{bmatrix} }, \underset{\mathbf{w}}{ \begin{bmatrix} 4 \\ 2 \\ \end{bmatrix} } \right\} \)

\( \mathbf{w} = \begin{bmatrix} 4 \\ 2 \\ \end{bmatrix} = 2 \begin{bmatrix} 2 \\ 1 \\ \end{bmatrix} = 2 \mathbf{v} \ \)

The set of vectors is linearly dependent because the vector \(\mathbf{w}\) is a multiple of the vector \(\mathbf{v}\).

Example#

Is the following set of vectors linearly independent?

\( \left \{ \underset{\mathbf{v}}{ \begin{bmatrix} 2 \\ 1 \\ \end{bmatrix} }, \underset{\mathbf{w}}{ \begin{bmatrix} 2 \\ 3 \\ \end{bmatrix} } \right\} \)

The set of vectors is linearly independent because neither vector is a multiple of the other.
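
For pairs of vectors in \(\mathbb{R}^2\), the multiple test can also be phrased as a \(2 \times 2\) determinant being nonzero; a small NumPy sketch (the helper name is made up for illustration):

```python
import numpy as np

def is_independent_pair(v, w):
    """Two vectors in R^2 are linearly independent iff det([v w]) is nonzero."""
    return not np.isclose(np.linalg.det(np.column_stack([v, w])), 0.0)

print(is_independent_pair(np.array([2, 1]), np.array([4, 2])))  # False: w = 2 v
print(is_independent_pair(np.array([2, 1]), np.array([2, 3])))  # True: neither is a multiple of the other
```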