
Section 6.1 Orthogonality

In this chapter we deal with orthogonality of vectors and its various properties.
Recall that if \(S=\{v_1,\ldots, v_k\}\) is a linearly independent subset of \(\R^n\) and \(v_{k+1}\notin {\rm span}(S)\text{,}\) then \(S\cup \{v_{k+1}\}=\{v_1,\ldots, v_k,v_{k+1}\}\) is a linearly independent subset of \(\R^n\text{.}\) (Why?)

Definition 6.1.1.

A set of nonzero vectors \(\{v_1,\ldots, v_k\}\) in \(\R^n\) is called orthogonal if
\begin{equation*} v_i\cdot v_j = 0 \text{ if } i\neq j \end{equation*}
where \(u \cdot v \) denotes the standard dot product in \(\mathbb{R}^n\text{.}\) More precisely, for \(x=(x_1,\ldots,x_n),y=(y_1,\ldots, y_n)\in \R^n\text{,}\) we have
\begin{equation*} x\cdot y = x_1y_1+\cdots+x_ny_n=\sum x_i y_i. \end{equation*}
If \(x=(x_1,\ldots, x_n) \in \R^n\text{,}\) then \(\norm{x}^2: =x_1^2+\cdots+x_n^2\text{.}\)
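The dot product and norm above are straightforward to compute. Here is a minimal plain-Python sketch (the helper names `dot` and `norm` are our own, not from any library):

```python
from math import sqrt

def dot(x, y):
    """Standard dot product on R^n: sum of x_i * y_i."""
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    """Euclidean norm: sqrt(x . x)."""
    return sqrt(dot(x, x))

# (2,1,1) and (-1,1,1) are orthogonal: 2*(-1) + 1*1 + 1*1 = 0.
print(dot((2, 1, 1), (-1, 1, 1)))   # 0
print(norm((3, 4)))                  # 5.0
```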

Checkpoint 6.1.2.

Show that an orthogonal set of vectors is linearly independent.
Hint.
Let \(\{u_1,\ldots, u_n\}\) be an orthogonal set. Let
\begin{equation*} \alpha_1 u_1+\cdots+\alpha_n u_n=0. \end{equation*}
Taking the dot product with \(u_i\text{,}\) we get \(\alpha_i \norm{u_i}^2=0\text{.}\) Since \(u_i\neq 0\text{,}\) \(\norm{u_i}\neq 0\text{.}\) Hence \(\alpha_i=0\text{.}\)

Checkpoint 6.1.3.

Let \(u\) be a vector in \(\R^n\) which is orthogonal to all vectors in \(\R^n\text{.}\) What can you say about \(u\text{?}\)
Hint.
\(u=0\text{.}\) For such a vector \(u=(u_1,\ldots, u_n)\in \R^n\text{,}\) we have \(u\cdot u=\norm{u}^2=0\text{,}\) which implies \(u_i=0\) for each \(i\text{.}\)

Checkpoint 6.1.4.

Let \(\{u_1,\ldots, u_k\}\) be an orthogonal set of vectors in \(\R^n\text{.}\) Let \(u\in \R^n\) and define
\begin{equation} u_{k+1}:=u-\frac{u_1\cdot u}{\norm{u_1}^2}u_1-\frac{u_2\cdot u}{\norm{u_2}^2}u_2-\cdots -\frac{u_k\cdot u}{\norm{u_k}^2}u_k\text{,}\tag{6.1.1} \end{equation}
(i) Show that \(u_i\cdot u_{k+1}=0\) for all \(i=1,\ldots, k\text{.}\)
(ii) Show that if \(u\notin {\rm span}(\{u_1,\ldots, u_k\})\text{,}\) then \(u_{k+1}\neq 0\) and \(\{u_1,\ldots, u_k,u_{k+1}\}\) is an orthogonal set. What is \(u_{k+1}\) if \(u\in {\rm span}(\{u_1,\ldots, u_k\})\text{?}\)
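The construction (6.1.1) can be checked numerically. Below is a Python sketch (the function name `next_orthogonal` is our own) that subtracts from \(u\) its projection onto each \(u_i\text{,}\) using exact rational arithmetic so the zero dot products come out exactly zero:

```python
from fractions import Fraction

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def next_orthogonal(us, u):
    """Return u minus its projections onto each u_i, as in (6.1.1)."""
    w = list(u)
    for ui in us:
        c = Fraction(dot(ui, u), dot(ui, ui))   # (u_i . u) / ||u_i||^2
        w = [wj - c * uij for wj, uij in zip(w, ui)]
    return w

u1, u2 = (2, 1, 1), (-1, 1, 1)
u3 = next_orthogonal([u1, u2], (0, 0, 1))
print(u3)                          # (0, -1/2, 1/2), a multiple of (0,-1,1)
print(dot(u1, u3), dot(u2, u3))    # 0 0
```

Note that starting from \(u=(0,0,1)\) and the first two vectors of the orthogonal basis in Checkpoint 6.1.6, the construction recovers a multiple of the third.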

Definition 6.1.5.

A basis \(\beta =\{u_1,\ldots, u_n\}\) of \(\R^n\) is called an orthogonal basis if \(\beta\) is an orthogonal set in \(\R^n\text{.}\) If in addition \(\norm{u_i}=1\) for all \(i\text{,}\) then \(\beta\) is called an orthonormal basis.

Checkpoint 6.1.6.

  1. The standard basis \({\cal E} = \{e_1,\cdots, e_n\}\) is an orthonormal basis of \(\R^n\text{.}\)
  2. \(\beta =\left\{\left(\frac{1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right), \left(\frac{1}{\sqrt{2}},-\frac{1}{\sqrt{2}}\right)\right\}\) is an orthonormal basis of \(\R^2\text{.}\)
  3. \(\beta=\{(2,1,1), (-1,1,1),(0,-1,1)\}\) is an orthogonal basis of \(\R^3\text{.}\) However, it is not an orthonormal basis.
  4. \(\beta'=\left\{\left(\frac{2}{\sqrt{6}},\frac{1}{\sqrt{6}},\frac{1}{\sqrt{6}}\right), \left(\frac{-1}{\sqrt{3}},\frac{1}{\sqrt{3}},\frac{1}{\sqrt{3}}\right),\left(0,\frac{-1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)\right\}\) is an orthonormal basis of \(\R^3\text{.}\)
What is an advantage of having an orthonormal basis?
Let \(\beta =\{u_1,\ldots, u_n\}\) be an orthonormal basis of \(\R^n\) and \(x\in \R^n\text{.}\) Then there exist scalars \(\alpha_1,\ldots, \alpha_n\) such that \(x=\alpha_1u_1+\cdots +\alpha_nu_n\text{.}\) It is easy to check that \(\alpha_i = x\cdot u_i\) for all \(i\text{.}\) In particular, the scalars \(\alpha_i\) can be written explicitly in terms of \(x\) and the \(u_i\text{.}\) This is an advantage of having an orthonormal basis.
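We can illustrate this with the orthonormal basis of \(\R^2\) from Checkpoint 6.1.6: the coordinate of \(x\) along each basis vector is just a dot product, and summing the scaled basis vectors recovers \(x\text{.}\) A Python sketch (helper name `dot` is our own):

```python
from math import isclose, sqrt

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

s = 1 / sqrt(2)
beta = [(s, s), (s, -s)]     # orthonormal basis of R^2
x = (1, 2)

# With an orthonormal basis, the coordinate along u_i is simply x . u_i.
coords = [dot(x, u) for u in beta]
print(coords)                # [3/sqrt(2), -1/sqrt(2)], approximately

# Reconstruct x from its coordinates as a check.
recon = [sum(c * u[j] for c, u in zip(coords, beta)) for j in range(2)]
assert all(isclose(a, b) for a, b in zip(recon, x))
```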

Checkpoint 6.1.7.

(i) Find the coordinates of a vector \((1,2)\) with respect to an orthonormal basis \(\beta =\left\{\left(\frac{1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right), \left(\frac{1}{\sqrt{2}},-\frac{1}{\sqrt{2}}\right)\right\}\) of \(\R^2\text{.}\)
(ii) Find the coordinates of the vector \((2,5,7)\) with respect to an orthonormal basis \(\beta'=\left\{\left(\frac{2}{\sqrt{6}},\frac{1}{\sqrt{6}},\frac{1}{\sqrt{6}}\right), \left(\frac{-1}{\sqrt{3}},\frac{1}{\sqrt{3}},\frac{1}{\sqrt{3}}\right),\left(0,\frac{-1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)\right\}\) of \(\R^3\text{.}\)
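Answers to both parts can be checked numerically by computing \(x\cdot u_i\) for each basis vector and verifying that the scaled basis vectors sum back to \(x\text{;}\) here is a Python sketch for part (ii) (helper name `dot` is our own):

```python
from math import isclose, sqrt

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

beta = [(2/sqrt(6), 1/sqrt(6), 1/sqrt(6)),
        (-1/sqrt(3), 1/sqrt(3), 1/sqrt(3)),
        (0, -1/sqrt(2), 1/sqrt(2))]
x = (2, 5, 7)

coords = [dot(x, u) for u in beta]   # alpha_i = x . u_i

# Verify: x should equal sum_i alpha_i u_i.
recon = [sum(c * u[j] for c, u in zip(coords, beta)) for j in range(3)]
assert all(isclose(a, b) for a, b in zip(recon, x))
print(coords)
```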

Activity 6.1.1. Dot Product and Orthogonality in Sage.

Sage has a method .dot_product() to compute the dot product of two vectors, and u.norm() returns the norm of \(u\text{.}\)
Consider a set of vectors
\begin{equation*} u_1=(1,1,1,1), u_2=(1,2,-1,-2),u_3=(-7,5,1,1) \in \mathbb{R}^4 \end{equation*}
Let us check that \(u_1,u_2,u_3\) are orthogonal. Let \(u=(1,2,3,4)\in \mathbb{R}^4\text{.}\) Define
\begin{equation*} u_4:=u-\frac{u_1\cdot u}{\norm{u_1}^2}u_1-\frac{u_2\cdot u}{\norm{u_2}^2}u_2-\frac{u_3\cdot u}{\norm{u_3}^2}u_3. \end{equation*}
Let us verify that \(u_4\) is orthogonal to \(u_1,u_2,u_3\text{.}\)
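The same computation can be done outside Sage; the following plain-Python sketch mirrors what .dot_product() and .norm() would give, using exact rationals so the orthogonality checks are exact:

```python
from fractions import Fraction

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

u1 = (1, 1, 1, 1)
u2 = (1, 2, -1, -2)
u3 = (-7, 5, 1, 1)

# All pairwise dot products vanish, so {u1, u2, u3} is orthogonal.
print(dot(u1, u2), dot(u1, u3), dot(u2, u3))   # 0 0 0

u = (1, 2, 3, 4)
u4 = list(u)
for ui in (u1, u2, u3):
    c = Fraction(dot(ui, u), dot(ui, ui))      # (u_i . u) / ||u_i||^2
    u4 = [wj - c * uij for wj, uij in zip(u4, ui)]

# u4 is orthogonal to u1, u2, and u3.
print(dot(u1, u4), dot(u2, u4), dot(u3, u4))   # 0 0 0
```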

Checkpoint 6.1.8.

Can you think of how you would generate an orthogonal basis of \(\mathbb{R}^n\) starting with a nonzero vector in \(\mathbb{R}^n\text{?}\)
Hint.
We can start with any nonzero vector \(u_1\text{.}\) Now choose \(u\notin{\rm span}(u_1)\) and construct \(u_2\) as in (6.1.1). Next we find \(u\notin {\rm span}(\{u_1,u_2\})\) and define \(u_3\) as in (6.1.1). Continuing this way, we obtain an orthogonal basis of \(\R^n\text{.}\)
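One way to carry this out concretely is to feed the standard basis vectors through construction (6.1.1), keeping a result only when it is nonzero (which, by Checkpoint 6.1.4, happens exactly when the candidate lies outside the current span). A Python sketch, with our own function name `extend_to_orthogonal_basis`:

```python
from fractions import Fraction

def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))

def extend_to_orthogonal_basis(u1):
    """Grow an orthogonal basis of R^n starting from a nonzero u1,
    running the standard basis vectors through construction (6.1.1)."""
    n = len(u1)
    basis = [[Fraction(a) for a in u1]]
    for i in range(n):
        if len(basis) == n:
            break
        e = [Fraction(int(j == i)) for j in range(n)]   # standard basis vector e_i
        w = e
        for b in basis:
            c = dot(b, e) / dot(b, b)                   # (b . e) / ||b||^2
            w = [wj - c * bj for wj, bj in zip(w, b)]
        if any(w):                 # w = 0 exactly when e_i is already in the span
            basis.append(w)
    return basis

basis = extend_to_orthogonal_basis((1, 1, 1))
print(basis)    # three pairwise-orthogonal vectors spanning R^3
```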
In the next section, we deal with finding an orthonormal basis of \(\R^n\) or any subspace of \(\R^n\) from a basis.