Recall that if \(S=\{v_1,\ldots,
v_k\}\) is a linearly independent subset of \(\R^n\) and \(v_{k+1}\notin {\rm span}(S)\text{,}\) then \(S\cup \{v_{k+1}\}=\{v_1,\ldots,
v_k,v_{k+1}\}\) is a linearly independent subset of \(\R^n\text{.}\) (Why?)
where \(u \cdot v \) denotes the standard dot product in \(\mathbb{R}^n\text{.}\) More precisely, for \(x=(x_1,\ldots,x_n),y=(y_1,\ldots, y_n)\in \R^n\text{,}\) we have
\begin{equation*}
x\cdot y = x_1y_1+\cdots+x_ny_n=\sum x_i y_i.
\end{equation*}
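The dot product formula above is easy to compute directly. The following is a minimal sketch; the function name `dot` is our own choice, not from the text.

```python
import math

def dot(x, y):
    """Return x . y = x_1*y_1 + ... + x_n*y_n for vectors given as lists."""
    assert len(x) == len(y)
    return sum(xi * yi for xi, yi in zip(x, y))

# Example: (1, 2, 3) . (4, 5, 6) = 4 + 10 + 18 = 32
print(dot([1, 2, 3], [4, 5, 6]))  # 32

# The norm satisfies ||u||^2 = u . u, e.g. ||(3, 4)|| = 5:
print(math.sqrt(dot([3, 4], [3, 4])))  # 5.0
```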
Taking the dot product with \(u_i\text{,}\) we get \(\alpha_i \norm{u_i}^2=0\text{.}\) Since \(u_i\neq 0\text{,}\) \(\norm{u_i}\neq 0\text{.}\) Hence \(\alpha_i=0\text{.}\)
\(u=0\text{:}\) for such a vector \(u=(u_1,\ldots, u_n)\in \R^n\text{,}\) we have \(u\cdot u=\norm{u}^2=0\text{,}\) which implies \(u_i=0\) for each \(i\text{.}\)
(ii) If \(u\notin {\rm span}(\{u_1,\ldots, u_k\})\text{,}\) then \(u_{k+1}\neq 0\) and \(\{u_1,\ldots,
u_k,u_{k+1}\}\) is an orthogonal set. What is \(u_{k+1}\) if \(u\in {\rm span}(\{u_1,\ldots, u_k\})\text{?}\)
A basis \(\beta =\{u_1,\ldots,
u_n\}\) is called an orthogonal basis if \(\beta\) is an orthogonal set in \(\R^n\text{.}\) In addition, if \(\norm{u_i}=1\) for all \(i\text{,}\) then \(\beta\) is called an orthonormal basis.
\(\beta =\left\{\left(\frac{1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right), \left(\frac{1}{\sqrt{2}},-\frac{1}{\sqrt{2}}\right)\right\}\) is an orthonormal basis of \(\R^2\text{.}\)
\(\beta'=\left\{\left(\frac{2}{\sqrt{6}},\frac{1}{\sqrt{6}},\frac{1}{\sqrt{6}}\right), \left(\frac{-1}{\sqrt{3}},\frac{1}{\sqrt{3}},\frac{1}{\sqrt{3}}\right),\left(0,\frac{-1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)\right\}\) is an orthonormal basis of \(\R^3\text{.}\)
Let \(\beta =\{u_1,\ldots, u_n\}\) be an orthonormal basis of \(\R^n\) and \(x\in \R^n\text{.}\) Then there exist scalars \(\alpha_1,\ldots, \alpha_n\) such that \(x=\alpha_1u_1+\cdots +\alpha_nu_n\text{.}\) It is easy to check that \(\alpha_i = x\cdot u_i\) for all \(i\text{.}\) In particular, the scalars \(\alpha_i\) can be written explicitly in terms of \(x\) and the \(u_i\text{.}\) This is an advantage of having an orthonormal basis.
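The identity \(\alpha_i = x\cdot u_i\) can be checked numerically. Below is a hedged sketch using the orthonormal basis \(\beta\) of \(\R^2\) given earlier and a sample vector \(x=(3,1)\) of our own choosing: the coordinates are just dot products, and summing \(\alpha_i u_i\) recovers \(x\).

```python
import math

def dot(x, y):
    """Standard dot product of two vectors given as sequences."""
    return sum(a * b for a, b in zip(x, y))

s = 1 / math.sqrt(2)
beta = [(s, s), (s, -s)]   # the orthonormal basis of R^2 from the text
x = (3, 1)                 # a sample vector (our choice, for illustration)

# Coordinates with respect to beta: alpha_i = x . u_i
coords = [dot(x, u) for u in beta]

# Reconstruct x as alpha_1*u_1 + alpha_2*u_2 to confirm the identity
recon = [sum(c * u[j] for c, u in zip(coords, beta)) for j in range(2)]
print(coords)  # approximately [2*sqrt(2), sqrt(2)]
print(recon)   # approximately [3.0, 1.0]
```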
(i) Find the coordinates of a vector \((1,2)\) with respect to an orthonormal basis \(\beta =\left\{\left(\frac{1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right), \left(\frac{1}{\sqrt{2}},-\frac{1}{\sqrt{2}}\right)\right\}\) of \(\R^2\text{.}\)
(ii) Find the coordinates of the vector \((2,5,7)\) with respect to an orthonormal basis \(\beta'=\left\{\left(\frac{2}{\sqrt{6}},\frac{1}{\sqrt{6}},\frac{1}{\sqrt{6}}\right), \left(\frac{-1}{\sqrt{3}},\frac{1}{\sqrt{3}},\frac{1}{\sqrt{3}}\right),\left(0,\frac{-1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)\right\}\) of \(\R^3\text{.}\)
We can start with any nonzero vector \(u_1\text{.}\) Now choose \(u\notin{\rm span}(\{u_1\})\) and construct \(u_2\) as in (6.1.1). Next we find \(u\notin {\rm span}(\{u_1,u_2\})\) and define \(u_3\) as in (6.1.1). Continuing this way, we obtain an orthogonal basis of \(\R^n\text{.}\)
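The procedure just described can be sketched in code. This assumes that (6.1.1) is the usual Gram–Schmidt step \(u_{k+1} = u - \sum_{i=1}^{k} \frac{u\cdot u_i}{\norm{u_i}^2}\, u_i\text{;}\) the function name `orthogonalize` and the sample vectors are our own, for illustration only.

```python
def dot(x, y):
    """Standard dot product of two vectors given as sequences."""
    return sum(a * b for a, b in zip(x, y))

def orthogonalize(vectors):
    """Return an orthogonal list of vectors spanning span(vectors)."""
    basis = []
    for u in vectors:
        # Subtract the component of u along each vector found so far
        w = list(u)
        for b in basis:
            c = dot(u, b) / dot(b, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        # If w is (numerically) zero, u was already in the span; skip it.
        # This mirrors the question after (ii): u_{k+1} = 0 in that case.
        if any(abs(wi) > 1e-12 for wi in w):
            basis.append(w)
    return basis

vs = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]
ortho = orthogonalize(vs)
print(ortho)  # three pairwise-orthogonal vectors spanning R^3
```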