
Section 2.3 Linear Dependence

In this section we define the concepts of linear dependence and linear independence of a set of vectors in \(\R^n\), with several examples.

Definition 2.3.1. Linear Dependence.

A set of vectors \(\{v_1,v_2,\ldots, v_k\}\subset \R^n\) is said to be linearly dependent if there exist scalars \(\alpha_1,\alpha_2,\ldots \alpha_k\) not all zero such that \(\alpha_1 v_1+\alpha_2 v_2+\cdots+\alpha_k v_k=0\text{.}\)
Note that the set \(\{0\}\) in \(\R^n\) is linearly dependent, since \(1\cdot 0=0\text{.}\) More generally, if a set \(\{v_1,\ldots, v_k\}\) contains the zero vector, then it is linearly dependent. (Why?)
What does it mean to say that two vectors \(u,v\in \R^n\) are linearly dependent? It means there exist scalars \(\alpha\) and \(\beta\text{,}\) not both zero, such that \(\alpha u+\beta v=0\text{.}\) Without loss of generality, suppose \(\alpha\neq 0\text{;}\) then \(u=-\frac{\beta}{\alpha}v\text{.}\) Similarly, if \(\beta\neq 0\text{,}\) then \(v=-\frac{\alpha}{\beta}u\text{.}\) Thus if \(u,v\in \R^n\) are linearly dependent, then one is a scalar multiple of the other. Geometrically, \(u\) and \(v\) lie along the same line through the origin in \(\R^n\text{.}\)
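The observation above can be checked numerically: two vectors are linearly dependent exactly when stacking them into a matrix gives rank at most 1. The following is a minimal sketch using NumPy; the vectors and the helper name `dependent_pair` are illustrative choices, not part of the text.

```python
import numpy as np

def dependent_pair(u, v, tol=1e-12):
    # Two vectors in R^n are linearly dependent iff one is a scalar
    # multiple of the other, i.e. the 2 x n matrix [u; v] has rank <= 1.
    return np.linalg.matrix_rank(np.vstack([u, v]), tol=tol) <= 1

u = np.array([1.0, 2.0, 3.0])
v = np.array([2.0, 4.0, 6.0])   # v = 2u, so {u, v} is dependent
w = np.array([1.0, 0.0, 0.0])   # not a multiple of u

print(dependent_pair(u, v))     # True
print(dependent_pair(u, w))     # False
```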

Example 2.3.2.

Let \(x=\begin{pmatrix}x_1\\x_2\end{pmatrix}\text{,}\) \(y=\begin{pmatrix}y_1\\y_2\end{pmatrix}\) and \(z=\begin{pmatrix}z_1\\z_2\end{pmatrix}\) be three vectors in \(\R^2\text{.}\) We claim that \(x,y,z\) are linearly dependent; in fact, any three vectors in \(\R^2\) are linearly dependent. Let \(\alpha_1,\alpha_2,\alpha_3\) be scalars such that \(\alpha_1 x+\alpha_2 y+\alpha_3 z=0\text{.}\) We need to solve these equations for \(\alpha_1,\alpha_2,\alpha_3\text{.}\) These equations can be written as
\begin{equation*} \begin{pmatrix} x_1 \amp y_1\amp z_1\\x_2 \amp y_2\amp z_2\end{pmatrix} \begin{pmatrix}\alpha_1\\\alpha_2\\\alpha_3\end{pmatrix}= \begin{pmatrix} 0\\0\end{pmatrix}\text{.} \end{equation*}
The above equations can be written as \(A\alpha=0\) which is a system of 2 linear equations in 3 variables. Hence it has a non-zero solution by Theorem 1.6.3. In particular, there exist scalars \(\alpha_1,\alpha_2,\alpha_3\) not all zero such that \(\alpha_1 x+\alpha_2 y+\alpha_3 z=0\text{.}\) Hence \(x,y,z\) are linearly dependent. Can you generalize this?
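A non-zero solution of \(A\alpha=0\) can be computed explicitly: the last right-singular vector of \(A\) spans the null space when \(A\) is a \(2\times 3\) matrix of rank 2. Below is a hedged sketch with NumPy; the three sample vectors are hypothetical choices for illustration.

```python
import numpy as np

# Any three vectors in R^2 give a 2 x 3 system A @ alpha = 0 with more
# unknowns than equations, so a non-zero solution always exists.
x = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
z = np.array([1.0, 1.0])
A = np.column_stack([x, y, z])   # 2 x 3 matrix with x, y, z as columns

# The last row of Vt from the SVD is a unit vector in the null space of A.
_, _, Vt = np.linalg.svd(A)
alpha = Vt[-1]                   # non-zero, and A @ alpha is (numerically) 0
print(alpha, A @ alpha)
```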


Definition 2.3.4. Linear Independence.

A set of vectors \(\{v_1,v_2,\ldots, v_k\}\) is said to be linearly independent if it is not linearly dependent. That is, for any scalars \(\alpha_1,\alpha_2,\ldots, \alpha_k\text{,}\) if \(\alpha_1 v_1+\alpha_2 v_2+\cdots+\alpha_k v_k=0\) then \(\alpha_1=\alpha_2=\cdots=\alpha_k=0\text{.}\)
Suppose a set of vectors \(\{v_1,v_2,\ldots, v_k\}\) is linearly dependent in \(\R^n\text{.}\) Let \(\alpha_1, \ldots, \alpha_k\) be scalars, not all zero, such that
\begin{equation*} \alpha_1v_1+\cdots +\alpha_k v_k=0. \end{equation*}
If \(\alpha_i\neq 0\text{,}\) then we can write \(v_i\) in terms of the other vectors as follows:
\begin{equation*} v_i = -\frac{1}{\alpha_i}\sum_{j\neq i} \alpha_j v_j. \end{equation*}
Conversely, assume that one of the vectors in \(\{v_1,v_2,\ldots, v_k\}\text{,}\) say \(v_i\text{,}\) can be written as a linear combination \(v_i = \sum_{j\neq i} \alpha_j v_j\) of the others. Setting \(\alpha_i=-1\text{,}\) we get \(\sum_{j} \alpha_j v_j=0\) with \(\alpha_i=-1\neq 0\text{.}\) Thus we have the following result.
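The rearrangement above can be made concrete with a small numerical check. In this sketch the dependence \(v_3 = v_1 + 2v_2\) and the resulting coefficients \((1, 2, -1)\) are hypothetical data chosen for illustration.

```python
import numpy as np

# A dependence relation: v1 + 2*v2 - v3 = 0, i.e. alpha = (1, 2, -1).
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = v1 + 2 * v2
alpha = np.array([1.0, 2.0, -1.0])       # coefficients, alpha[2] != 0

# Solve for v_i = -(1/alpha_i) * sum_{j != i} alpha_j v_j with i = 2.
vs = [v1, v2, v3]
i = 2
vi = -(1 / alpha[i]) * sum(alpha[j] * vs[j] for j in range(3) if j != i)
print(vi)                                 # recovers v3
```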

Remark 2.3.6.

Let \(v,v_1,v_2,\ldots, v_k\) be vectors in \(\R^n\) such that \(v=\alpha_1 v_1+\alpha_2 v_2+\cdots+\alpha_k v_k\text{.}\) Then we have
\begin{equation*} \begin{bmatrix} v_1 \amp v_2 \amp \cdots \amp v_k\end{bmatrix} \begin{bmatrix} \alpha_1\\\vdots\\\alpha_k\end{bmatrix}=v. \end{equation*}
Thus if we want to find \(\alpha_1,\ldots,\alpha_k\) such that \(v=\alpha_1 v_1+\alpha_2 v_2+\cdots+\alpha_k v_k\text{,}\) it amounts to solving the system \(A\alpha =v\text{,}\) where \(A\) is the matrix whose columns are \(v_1,v_2,\ldots, v_k\) and \(\alpha =\begin{bmatrix} \alpha_1\\\vdots\\\alpha_k\end{bmatrix}\text{.}\)
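Solving \(A\alpha=v\) for the coefficients can be sketched with NumPy's least-squares solver, which handles non-square \(A\). The vectors \(v_1, v_2\) and \(v\) below are hypothetical; here \(v = 2v_1 + 3v_2\) by construction.

```python
import numpy as np

# Write v as a combination of v1 and v2 by solving A @ alpha = v,
# where A has v1 and v2 as its columns.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v  = np.array([2.0, 3.0, 5.0])           # v = 2*v1 + 3*v2

A = np.column_stack([v1, v2])            # 3 x 2 matrix
alpha, residual, rank, _ = np.linalg.lstsq(A, v, rcond=None)
print(alpha)                              # approximately [2., 3.]
```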

Remark 2.3.7.

  1. A set of vectors \(v_1,v_2,\ldots, v_k\) in \(\R^n\) is linearly dependent iff the matrix \(A=\begin{bmatrix} v_1 \amp v_2 \amp \cdots \amp v_k\end{bmatrix}\) is of rank strictly less than \(k\text{.}\)
  2. A set of vectors \(v_1,v_2,\ldots, v_k\) in \(\R^n\) is linearly independent iff the matrix \(A=\begin{bmatrix} v_1 \amp v_2 \amp \cdots \amp v_k\end{bmatrix}\) is of rank \(k\text{.}\)
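The rank criterion in Remark 2.3.7 translates directly into a numerical test. The following is a minimal sketch; the helper name `is_independent` and the sample vectors are illustrative assumptions, not part of the text.

```python
import numpy as np

def is_independent(*vectors):
    # Vectors are linearly independent iff the matrix with them as
    # columns has rank equal to the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
print(is_independent(e1, e2))             # True
print(is_independent(e1, e2, e1 + e2))    # False: e1 + e2 depends on e1, e2
```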