Section 2.3 Linear Dependence
In this section we define the concepts of linear dependence and linear independence of a set of vectors in \(\R^n\) and illustrate them with several examples.
Definition 2.3.1. Linear Dependence.
A set of vectors \(\{v_1,v_2,\ldots, v_k\}\subset \R^n\) is said to be linearly dependent if there exist scalars \(\alpha_1,\alpha_2,\ldots \alpha_k\) not all zero such that \(\alpha_1 v_1+\alpha_2 v_2+\cdots+\alpha_k v_k=0\text{.}\)
Note that the set \(\{0\}\) in \(\R^n\) is linearly dependent as we have \(1\cdot 0=0\text{.}\) More generally, if a set \(\{v_1,\ldots, v_k\}\) contains the zero vector, then it is linearly dependent. (why?)
What does it mean to say that two vectors \(u,v\in \R^n\) are linearly dependent? It means that there exist scalars, say \(\alpha\) and \(\beta\text{,}\) not both zero, such that \(\alpha u+\beta v=0\text{.}\) Without loss of generality, suppose \(\alpha\neq 0\text{;}\) then \(u=-\frac{\beta}{\alpha}v\text{.}\) Similarly, if \(\beta\neq 0\text{,}\) then \(v=-\frac{\alpha}{\beta}u\text{.}\) Thus if \(u,v\in \R^n\) are linearly dependent, then one is a scalar multiple of the other. Geometrically, \(u\) and \(v\) lie along the same line through the origin in \(\R^n\text{.}\)
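As an aside, this scalar-multiple criterion is easy to test on a computer: the pair \(\{u,v\}\) is linearly dependent exactly when the matrix with columns \(u\) and \(v\) has rank at most 1. The following is a minimal NumPy sketch; the helper is_dependent_pair and the sample vectors are chosen only for illustration.

import numpy as np

def is_dependent_pair(u, v, tol=1e-12):
    # u, v are linearly dependent exactly when the matrix with
    # columns u and v has rank at most 1.
    A = np.column_stack([u, v])
    return np.linalg.matrix_rank(A, tol) <= 1

u = np.array([2.0, 4.0, -6.0])
v = np.array([1.0, 2.0, -3.0])   # u = 2v, so {u, v} is dependent
w = np.array([1.0, 0.0, 0.0])
print(is_dependent_pair(u, v))   # True
print(is_dependent_pair(u, w))   # False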
Example 2.3.2.
Let \(x=\begin{pmatrix}x_1\\x_2\end{pmatrix}\text{,}\) \(y=\begin{pmatrix}y_1\\y_2\end{pmatrix}\) and \(z=\begin{pmatrix}z_1\\z_2\end{pmatrix}\) be three vectors in \(\R^2\text{.}\) We claim that \(x,y,z\) are linearly dependent; in other words, any three vectors in \(\R^2\) are linearly dependent. Let \(\alpha_1,\alpha_2,\alpha_3\) be scalars such that \(\alpha_1 x+\alpha_2 y+\alpha_3 z=0\text{,}\) and write \(\alpha=\begin{pmatrix}\alpha_1\\\alpha_2\\\alpha_3\end{pmatrix}\text{.}\) We need to solve these equations for \(\alpha_1,\alpha_2,\alpha_3\text{.}\) They can be written as
\begin{equation*}
\begin{pmatrix} x_1 \amp y_1\amp z_1\\x_2 \amp y_2\amp z_2\end{pmatrix} \begin{pmatrix}\alpha_1\\\alpha_2\\\alpha_3\end{pmatrix}=
\begin{pmatrix} 0\\0\end{pmatrix}\text{.}
\end{equation*}
The above equations can be written as \(A\alpha=0\text{,}\) where \(A\) is the \(2\times 3\) matrix whose columns are \(x,y,z\text{.}\) This is a homogeneous system of 2 linear equations in 3 variables, and a homogeneous system with more variables than equations always has a non-zero solution. In particular, there exist scalars \(\alpha_1,\alpha_2,\alpha_3\text{,}\) not all zero, such that \(\alpha_1 x+\alpha_2 y+\alpha_3 z=0\text{.}\) Hence \(x,y,z\) are linearly dependent. Can you generalize this?
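To see such a dependence concretely, one can also compute a non-zero solution of \(A\alpha=0\) numerically. Below is a minimal sketch using SciPy's null_space; the particular vectors \(x,y,z\) are an arbitrary choice made only for illustration.

import numpy as np
from scipy.linalg import null_space

# An arbitrary choice of three vectors in R^2, placed as the columns of A.
x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
z = np.array([0.0, 4.0])
A = np.column_stack([x, y, z])       # 2 x 3

# A homogeneous system with more variables than equations has a
# non-trivial solution; null_space returns a basis of the solution set.
alpha = null_space(A)[:, 0]
print(alpha)            # a non-zero vector (alpha_1, alpha_2, alpha_3)
print(A @ alpha)        # approximately (0, 0)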
Theorem 2.3.3.
Any \(n+1\) vectors in \(\R^n\) are linearly dependent.
Definition 2.3.4. Linear Independence.
A set of vectors \(\{v_1,v_2,\ldots, v_k\}\) is said to be linearly independent if it is not linearly dependent. That is, the only scalars \(\alpha_1,\alpha_2,\ldots, \alpha_k\) satisfying \(\alpha_1 v_1+\alpha_2 v_2+\cdots+\alpha_k v_k=0\) are \(\alpha_1=\alpha_2=\cdots=\alpha_k=0\text{.}\)
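Equivalently, a set \(\{v_1,\ldots,v_k\}\subset \R^n\) is linearly independent exactly when the \(n\times k\) matrix whose columns are \(v_1,\ldots,v_k\) has rank \(k\text{.}\) A minimal NumPy sketch of this rank test follows; the helper name and sample vectors are ours, for illustration only.

import numpy as np

def is_independent(vectors, tol=1e-12):
    # Independent exactly when the matrix whose columns are the
    # given vectors has full column rank, i.e. rank equal to k.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A, tol) == A.shape[1]

e1, e2, e3 = np.eye(3)                     # standard basis of R^3
print(is_independent([e1, e2, e3]))        # True
print(is_independent([e1, e2, e1 + e2]))   # False: e1 + e2 is a combination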
Problem 2.3.5.
A set of vectors \(\{v_1,v_2,\ldots, v_k\}\) is linearly dependent if and only if one of the vectors in the set is a linear combination of the remaining vectors. That is, there exist \(j\in \{1,\ldots,k\}\) and scalars \(\alpha_i\) such that \(v_j=\sum_{i\neq j} \alpha_i v_i\text{.}\)
Problem 2.3.8.
Check whether the following sets of vectors are linearly independent or dependent. A numerical check is sketched after the list.
(i) \(\{(1,0,1,2), (0,1,1,2),(1,1,1,3)\}\)
(ii) \(\{(1,0,3),(1,2,4),(1,4,5)\}\text{.}\)
(iii) \(\{(1, 0, -2, 5), (2, 1, 0, -1), (1, 1, 2, 1)\}\text{.}\)
(iv) \(\{(1, 1, 0, 0), (1, 0, 1, 0), (0, 0, 1, 1), (0, 1, 0, 1)\}\text{.}\)
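These sets are meant to be worked out by hand, for instance by row reduction; the rank test sketched above can be used afterwards to verify the answers numerically, as in the following sketch.

import numpy as np

sets = {
    "(i)":   [(1, 0, 1, 2), (0, 1, 1, 2), (1, 1, 1, 3)],
    "(ii)":  [(1, 0, 3), (1, 2, 4), (1, 4, 5)],
    "(iii)": [(1, 0, -2, 5), (2, 1, 0, -1), (1, 1, 2, 1)],
    "(iv)":  [(1, 1, 0, 0), (1, 0, 1, 0), (0, 0, 1, 1), (0, 1, 0, 1)],
}

for label, vecs in sets.items():
    # Columns of A are the vectors of the set; full column rank means independent.
    A = np.column_stack([np.array(v, dtype=float) for v in vecs])
    independent = np.linalg.matrix_rank(A) == A.shape[1]
    print(label, "independent" if independent else "dependent")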