
Section 4.5 Basis and dimension

Subsection 4.5.1 Basis of a Vector Space

We can define a basis of a vector space in a way similar to a basis of a subspace of \(\R^n\text{.}\)

Definition 4.5.1. Basis of a vector space.

Let \(V\) be a vector space over \(\R\text{.}\) A set of vectors \(\beta=\{v_1,v_2,\ldots,v_n\}\subset V\) is called a basis of \(V\) if every vector \(v\in V\) can be expressed uniquely as a linear combination of \(v_1,v_2,\ldots,v_n\text{.}\)
Thus \(\beta\) is a basis of \(V\) if (i) \(L(\beta)=V\text{,}\) that is, every vector \(v\in V\) can be expressed as a linear combination of \(v_1,v_2,\ldots,v_n\text{;}\)
(ii) if \(v=\alpha_1v_1+\alpha_2v_2+\cdots +\alpha_nv_n\) and \(v=\beta_1v_1+\beta_2v_2+\cdots +\beta_nv_n\text{,}\) then \(\alpha_1=\beta_1, \alpha_2=\beta_2,\ldots,\alpha_n=\beta_n\text{.}\)
We have already seen several examples of bases in \(\R^n\) and some subspaces of \(\R^n\text{.}\)

Example 4.5.2.

Let \(V={\cal P}_n(\R)\text{.}\) The set \(\{1,x,x^2,\ldots, x^n\}\) is a basis of \(V\text{,}\) called the standard basis.

Example 4.5.3.

\(\{1,i\}\) is a basis of \(\mathbb{C}\) as a vector space over \(\R\text{.}\)

Example 4.5.4.

\begin{equation*} S=\left\{ \begin{bmatrix}1 \amp 0 \\0 \amp 0 \end{bmatrix} , \begin{bmatrix}0 \amp 1 \\0 \amp 0 \end{bmatrix} , \begin{bmatrix}0 \amp 0 \\1 \amp 0 \end{bmatrix} , \begin{bmatrix}0 \amp 0 \\0 \amp 1 \end{bmatrix} \right\}\text{.} \end{equation*}
is a basis of \(M_2(\R)\text{,}\) called the standard basis.

Example 4.5.5.

Any set of \(n\) linearly independent vectors in \(\R^n\) forms a basis of \(\R^n\text{.}\)

Definition 4.5.8. Finite Dimensional Vector Space.

A vector space \(V\) is called finite dimensional if there exists a finite subset \(S\) of \(V\) such that \(L(S)=V\text{.}\)
A vector space which is not finite dimensional is called infinite dimensional.

Definition 4.5.9.

We say a vector space \(V\) is of dimension \(n\) if it has a basis \(\beta\) consisting of \(n\) elements.

Checkpoint 4.5.10.

What is the dimension of \(V=\{0\}\text{,}\) the zero space?

Example 4.5.11.

(i) \(\R^n\) is an \(n\)-dimensional vector space over \(\R\text{.}\)
(ii) \(M_n(\R)\text{,}\) the set of all \(n\times n\) matrices over \(\R\text{,}\) is an \(n^2\)-dimensional vector space over \(\R\text{.}\)
(iii) \({\cal P}_n(\R)\text{,}\) the set of all polynomials of degree less than or equal to \(n\) over \(\R\text{,}\) is an \((n+1)\)-dimensional vector space over \(\R\text{.}\)

Example 4.5.12.

Let \(W\) be the set of all \(3\times 3\) real symmetric matrices. The set
\begin{align*} \beta=\left\{ \begin{bmatrix}1 \amp 0 \amp 0 \\0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \end{bmatrix}, \begin{bmatrix}0 \amp 0 \amp 0 \\0 \amp 1 \amp 0 \\ 0 \amp 0 \amp 0 \end{bmatrix}, \begin{bmatrix}0 \amp 0 \amp 0 \\0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 1 \end{bmatrix}, \right.\\ \left. \begin{bmatrix}0 \amp 1 \amp 0 \\1 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \end{bmatrix}, \begin{bmatrix}0 \amp 0 \amp 1 \\0 \amp 0 \amp 0 \\ 1 \amp 0 \amp 0 \end{bmatrix}, \begin{bmatrix}0 \amp 0 \amp 0 \\0 \amp 0 \amp 1 \\ 0 \amp 1 \amp 0 \end{bmatrix} \right\} \end{align*}
is a basis of \(W\text{.}\) That is, \(W\) is a 6-dimensional vector space over \(\R\text{.}\) What are the dimensions of the space of \(n\times n\) real symmetric matrices and of the space of \(n\times n\) real skew-symmetric matrices?
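The claim can also be checked computationally. The following SymPy sketch (not part of the original text; the sample matrix is illustrative) verifies that the six matrices in \(\beta\) are linearly independent and expresses a symmetric matrix in terms of them.

```python
import sympy as sp

# The six matrices of beta from Example 4.5.12: E(i, j) has 1 in
# positions (i, j) and (j, i), and 0 elsewhere.
def E(i, j):
    return sp.Matrix(3, 3, lambda r, c: 1 if (r, c) in {(i, j), (j, i)} else 0)

beta = [E(0, 0), E(1, 1), E(2, 2), E(0, 1), E(0, 2), E(1, 2)]

# Flatten each matrix into a 9-entry column; rank 6 means beta is
# linearly independent, so dim W = 6.
A = sp.Matrix.hstack(*[M.vec() for M in beta])
print(A.rank())  # 6

# A symmetric matrix is recovered from its diagonal and
# strictly-upper-triangular entries, so beta spans W.
S = sp.Matrix([[1, 2, 3], [2, 4, 5], [3, 5, 6]])
coeffs = [1, 4, 6, 2, 3, 5]
assert sum((c*M for c, M in zip(coeffs, beta)), sp.zeros(3, 3)) == S
```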

Checkpoint 4.5.13.

Let \(W\) be the set of all \(3\times 3\) real skew-symmetric matrices. Find a basis and hence the dimension of \(W\text{.}\)

Subsection 4.5.2 How to find a basis of a finite dimensional vector space?

First let us look at the following results.

Checkpoint 4.5.14.

Let \(\{v_1,\ldots, v_k\}\) be a linearly independent set of vectors in a vector space \(V\text{.}\) Suppose \(v\notin \mathrm{span}(\{v_1,\ldots, v_k\})\text{.}\) Then \(\{v,v_1,\ldots, v_k\}\) is linearly independent.

Checkpoint 4.5.15.

Let \(V\) be a finite dimensional vector space over \(\R\text{.}\) Then any linearly independent set \(S =\{v_1,\ldots, v_k\}\) can be extended to a basis of \(V\text{.}\) More precisely, there exist vectors \(u_1,\ldots, u_{n-k}\text{,}\) where \(n=\dim(V)\text{,}\) such that \(\beta=\{v_1,\ldots, v_k,u_1,\ldots, u_{n-k}\}\) is a basis of \(V\text{.}\)
These two exercises give a way to find a basis of a finite dimensional vector space starting with a nonzero vector in \(V\text{.}\)

Example 4.5.16.

Complete the set \(S=\{v_1=(1, 2, 1, 0), v_2=(2, 2, 1, 0)\}\) to a basis of \(\R^4\text{.}\) One way of achieving this is to find \(v_3\notin L(S)\text{.}\) Then choose \(v_4\notin L(\{v_3\}\cup S)\text{.}\) Then, in view of Checkpoint 4.5.14, \(\beta=\{v_1,v_2,v_3,v_4\}\) is linearly independent. Since \(\dim(\R^4)=4\text{,}\) \(\beta\) is a basis of \(\R^4\text{.}\)
Another way to achieve this is to look at the standard basis vectors \(e_i\) not in \(L(S)\text{.}\) In particular, we may take \(v_3,v_4\in\{e_1,e_2,e_3,e_4\}\text{.}\) To find them, we can apply RREF to the matrix \(\begin{bmatrix}v_1\amp v_2 \amp e_1 \amp e_2 \amp e_3 \amp e_4 \end{bmatrix}\) and choose the columns corresponding to the pivots. We have
\begin{equation*} \left[\begin{array}{rrrrrr} 1 \amp 2 \amp 1 \amp 0 \amp 0 \amp 0 \\ 2 \amp 2 \amp 0 \amp 1 \amp 0 \amp 0 \\ 1 \amp 1 \amp 0 \amp 0 \amp 1 \amp 0 \\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \amp 1 \end{array} \right]\xrightarrow{RREF} \left[ \begin{array}{rrrrrr} 1 \amp 0 \amp -1 \amp 0 \amp 2 \amp 0 \\ 0 \amp 1 \amp 1 \amp 0 \amp -1 \amp 0 \\ 0 \amp 0 \amp 0 \amp 1 \amp -2 \amp 0 \\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \amp 1 \end{array} \right]\text{.} \end{equation*}
Clearly the pivot columns are 1, 2, 4, 6, which correspond to the vectors \(v_1,v_2,e_2, e_4\text{.}\) Thus \(\{v_1,v_2,e_2,e_4\}\) is a basis of \(\R^4\) extending \(S\text{.}\)
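This pivot computation is easy to reproduce. The following SymPy sketch (not from the original text) builds the matrix \(\begin{bmatrix}v_1\amp v_2 \amp e_1 \amp e_2 \amp e_3 \amp e_4 \end{bmatrix}\) and reads off the pivot columns.

```python
import sympy as sp

v1 = [1, 2, 1, 0]
v2 = [2, 2, 1, 0]

# Columns: v1, v2, then the standard basis e1, e2, e3, e4 of R^4.
A = sp.Matrix([v1, v2, *sp.eye(4).tolist()]).T
R, pivots = A.rref()
print(pivots)  # (0, 1, 3, 5): columns v1, v2, e2, e4 form a basis
```

SymPy indexes columns from 0, so the pivot tuple `(0, 1, 3, 5)` matches the pivot columns 1, 2, 4, 6 in the text.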

Checkpoint 4.5.17.

Let \(V\) be a finite dimensional vector space over \(\R\text{.}\) Suppose \(S\) is a finite set such that \(L(S)=V\text{.}\) Then there exists a subset \(S'\subset S\) such that \(S'\) is a basis of \(V\text{.}\)

Example 4.5.18.

Consider \(v_1,\ldots, v_8\) in \(\R^5\text{,}\) where
\begin{equation*} \begin{split} v_1=(2, -3, 4, -5, -2), v_2=(-6, 9, -12, 15, -6), v_3=(3, -2, 7, -9, 1),\\ v_4=(2, -8, 2, -2, 6), v_5=(-1, 1, 2, 1, -3), v_6=(0, -3, -18, 9, 12), \\ v_7=(1, 0, -2, 3, -2), v_8=(2, -1, 1, -9, 7) \end{split} \end{equation*}
We wish to find a subset of \(\{v_1,\ldots, v_8\}\) which is a basis of \(\R^5\text{.}\) We can achieve this by applying RREF to the column matrix \(\begin{bmatrix}v_1\amp v_2\amp \cdots \amp v_8 \end{bmatrix}\text{.}\) Thus
\begin{align*} \left[\begin{array}{rrrrrrrr} 2 \amp -6 \amp 3 \amp 2 \amp -1 \amp 0 \amp 1 \amp 2 \\ -3 \amp 9 \amp -2 \amp -8 \amp 1 \amp -3 \amp 0 \amp -1 \\ 4 \amp -12 \amp 7 \amp 2 \amp 2 \amp -18 \amp -2 \amp 1 \\ -5 \amp 15 \amp -9 \amp -2 \amp 1 \amp 9 \amp 3 \amp -9 \\ -2 \amp -6 \amp 1 \amp 6 \amp -3 \amp 12 \amp -2 \amp 7 \end{array}\right]\\ \xrightarrow{RREF} \left[\begin{array}{rrrrrrrr} 1 \amp 0 \amp 0 \amp 0 \amp 0 \amp 0 \amp 0 \amp 0 \\ 0 \amp 1 \amp 0 \amp -\frac{4}{3} \amp 0 \amp -\frac{1}{3} \amp 0 \amp \frac{1}{3} \\ 0 \amp 0 \amp 1 \amp -2 \amp 0 \amp -2 \amp 0 \amp 1 \\ 0 \amp 0 \amp 0 \amp 0 \amp 1 \amp -4 \amp 0 \amp -2 \\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \amp 0 \amp 1 \amp -1 \end{array} \right] \end{align*}
Clearly the pivot columns are 1, 2, 3, 5, 7. Hence \(\{v_1,v_2,v_3,v_5,v_7\}\) is a basis of \(\R^5\text{.}\)
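The same computation can be carried out in SymPy; the following sketch (not from the original text) extracts the pivot columns of \(\begin{bmatrix}v_1\amp v_2\amp \cdots \amp v_8 \end{bmatrix}\) directly.

```python
import sympy as sp

vs = [(2, -3, 4, -5, -2), (-6, 9, -12, 15, -6), (3, -2, 7, -9, 1),
      (2, -8, 2, -2, 6), (-1, 1, 2, 1, -3), (0, -3, -18, 9, 12),
      (1, 0, -2, 3, -2), (2, -1, 1, -9, 7)]

A = sp.Matrix(vs).T        # 5 x 8 matrix with v1, ..., v8 as columns
R, pivots = A.rref()
print(pivots)              # (0, 1, 2, 4, 6), i.e. v1, v2, v3, v5, v7

basis = [vs[i] for i in pivots]  # the extracted basis of R^5
```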

Definition 4.5.19.

Let \(V\) be a vector space. A linearly independent set of vectors \(S\) in \(V\) is called a maximal linearly independent set if \(S\cup \{v\}\) is linearly dependent for every vector \(v\in V\text{.}\)

Example 4.5.20.

(i) Any set \(S\) of two linearly independent vectors in \(\R^2\) is a maximal linearly independent set.
(ii) Any set \(S\) of three linearly independent vectors in \(\R^3\) is a maximal linearly independent set.

Definition 4.5.21.

Let \(V\) be a vector space. A set of vectors \(S\) of \(V\) is called a minimal set of generators if (i) \(L(S)=V\) and (ii) for any \(u\in S\text{,}\) \(L(S\setminus \{u\})\neq V\text{.}\)

Example 4.5.22.

(i) Any set \(S\) of two linearly independent vectors in \(\R^2\) is a minimal set of generators.
(ii) Any set \(S\) of three linearly independent vectors in \(\R^3\) is a minimal set of generators.
In the following theorem we mention equivalent conditions for a set to be a basis of a finite dimensional vector space.

Subsection 4.5.3 Lagrange Interpolation

Consider the vector space \({\cal P}_n(\R)\text{.}\) Fix \(n+1\) distinct real numbers \(c_0,c_1,\ldots, c_n\text{.}\) Define polynomials
\begin{equation} \ell_i(x)= \frac{(x-c_0)\cdots (x-c_{i-1})(x-c_{i+1})\cdots(x-c_n)}{(c_i-c_0)\cdots (c_i-c_{i-1})(c_i-c_{i+1})\cdots(c_i-c_n)}\tag{4.5.1} \end{equation}
for \(i=0,1,\ldots, n\text{.}\) The above equation can be written as
\begin{equation} \ell_i(x)=\prod_{j=0,j\neq i}^{n}\frac{x-c_j}{c_i-c_j}.\tag{4.5.2} \end{equation}
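For instance, with \(n=2\) and the illustrative nodes \(c_0=0\text{,}\) \(c_1=1\text{,}\) \(c_2=2\text{,}\) the formula gives
\begin{equation*} \ell_0(x)=\frac{(x-1)(x-2)}{2},\quad \ell_1(x)=-x(x-2),\quad \ell_2(x)=\frac{x(x-1)}{2}\text{,} \end{equation*}
and one checks directly that \(\ell_i(c_j)\) equals \(1\) when \(j=i\) and \(0\) otherwise.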
It is easy to see that \(\ell_i(c_j)=1\) if \(j=i\) and \(0\) otherwise. We claim that \(\{\ell_i\}_{i=0}^n\) is a linearly independent subset of \({\cal P}_n(\R)\text{.}\) Suppose
\begin{equation} \alpha_0\ell_0 + \alpha_1\ell_1+\cdots+\alpha_n\ell_n=\sum\alpha_i\ell_i=0\text{.}\tag{4.5.3} \end{equation}
Here the right hand side is the zero polynomial. Evaluating at \(c_j\) gives \(\sum\alpha_i\ell_i(c_j)=0\) for all \(j=0,\ldots, n\text{.}\) Since \(\sum\alpha_i\ell_i(c_j)=\alpha_j\text{,}\) it follows that \(\alpha_j=0\) for all \(j=0,\ldots, n\text{.}\) Hence the claim.
Since \({\cal P}_n(\R)\) is an \((n+1)\)-dimensional vector space, the set \(\{\ell_i\}_{i=0}^n\) is a basis. Hence every polynomial of degree at most \(n\) can be expressed uniquely as a linear combination of the \(\ell_i\text{.}\) Suppose \(g\) is the polynomial passing through the points \(\{(x_i,y_i)\}_{i=0}^n\) (that is, \(g(x_i)=y_i\)), where \(x_0,\ldots,x_n\) are \(n+1\) distinct real numbers. This unique polynomial is given by
\begin{equation} g(x)=\sum_{i=0}^n \ell_i(x)y_i\tag{4.5.4} \end{equation}
and is called the Lagrange interpolation polynomial passing through \(\{(x_i,y_i)\}_{i=0}^n\text{.}\)
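The construction in (4.5.2) and (4.5.4) translates directly into code. The following SymPy sketch (the data points are hypothetical, chosen for illustration) builds the \(\ell_i\) and the interpolating polynomial.

```python
import sympy as sp

x = sp.symbols('x')

def lagrange_basis(cs, i):
    # ell_i(x) = prod over j != i of (x - c_j)/(c_i - c_j), as in (4.5.2)
    return sp.Mul(*[(x - c)/(cs[i] - c) for j, c in enumerate(cs) if j != i])

def lagrange_interpolate(points):
    # g(x) = sum over i of y_i * ell_i(x), as in (4.5.4)
    xs = [p[0] for p in points]
    return sp.expand(sum(y*lagrange_basis(xs, i) for i, (_, y) in enumerate(points)))

# Hypothetical data: the three points (0, 1), (1, 2), (2, 5).
g = lagrange_interpolate([(0, 1), (1, 2), (2, 5)])
print(g)  # x**2 + 1
```

Through three points with distinct abscissas there is exactly one polynomial of degree at most 2, and the routine recovers it.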

Subsection 4.5.4 Dimension Formula

Problem 4.5.24.

Let \(V\) be a finite dimensional vector space over \(\R\text{.}\) Let \(W_1\) and \(W_2\) be subspaces of \(V\text{.}\) Define
\begin{equation*} W_1+W_2:= \{x+y:x\in W_1,y\in W_2\}\text{.} \end{equation*}
It is easy to check that \(W_1+W_2\) is a subspace of \(V\text{.}\) Moreover
\begin{gather} \dim{(W_1+W_2)}=\dim{(W_1)}+\dim{(W_2)}-\dim{(W_1\cap W_2)}\tag{4.5.5} \end{gather}

Example 4.5.25.

Let \(V=\R^3\text{.}\) Consider subspaces \(W_1=\{(x_1,x_2,x_3):x_1+x_2+x_3=0\}\) and \(W_2=\{(x_1,x_2,x_3):x_1+x_2-x_3=0\}\text{.}\) Clearly \(W_1\) and \(W_2\) are subspaces of \(V\) each of dimension 2. What is \(W_1\cap W_2\text{?}\) It is the line of intersection of the two planes, \(x_1+x_2+x_3=0\) and \(x_1+x_2-x_3=0\text{.}\) Thus \(\dim{(W_1\cap W_2)}=1\text{.}\) It is easy to see that
\begin{equation*} W_1\cap W_2=\{\alpha(1,-1,0):\alpha\in\R\}\text{.} \end{equation*}
What is \(W_1+W_2\text{?}\) One can show directly that \(W_1+W_2=\R^3=V\text{.}\) Alternatively, by the dimension formula,
\begin{align*} \dim{(W_1+W_2)}=\amp \dim{(W_1)}+\dim{(W_2)}-\dim{(W_1\cap W_2)}\\ =\amp 2+2-1=3\text{.} \end{align*}
Since \(W_1+W_2\) is a 3-dimensional subspace of \(\R^3\text{,}\) it is in fact \(\R^3\text{.}\)
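The dimensions in this example can be verified computationally. The following SymPy sketch (not part of the original text) realizes \(W_1\) and \(W_2\) as null spaces and checks each term of the dimension formula (4.5.5).

```python
import sympy as sp

# W1 = null space of [1 1 1], W2 = null space of [1 1 -1] in R^3.
A1 = sp.Matrix([[1, 1, 1]])
A2 = sp.Matrix([[1, 1, -1]])

dim_W1 = len(A1.nullspace())
dim_W2 = len(A2.nullspace())

# W1 ∩ W2 is the null space of the stacked system of both equations.
dim_cap = len(sp.Matrix.vstack(A1, A2).nullspace())

# W1 + W2 is spanned by the two null-space bases placed side by side.
B = sp.Matrix.hstack(*(A1.nullspace() + A2.nullspace()))
dim_sum = B.rank()

print(dim_W1, dim_W2, dim_cap, dim_sum)  # 2 2 1 3
assert dim_sum == dim_W1 + dim_W2 - dim_cap
```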