Section 7.2 Exercise Set
Problem set on orthogonality and inner products
- Let \(\{u_1,\ldots, u_k\}\) be an orthogonal set of vectors in \(\R^n\text{.}\) Let \(u\in \R^n\) and define\begin{equation*} u_{k+1}:=u-\frac{u_1\cdot u}{\norm{u_1}^2}u_1-\frac{u_2\cdot u}{\norm{u_2}^2}u_2-\cdots -\frac{u_k\cdot u}{\norm{u_k}^2}u_k\text{.} \end{equation*}Then (i) \(u_i\cdot u_{k+1}=0\) for all \(i=1,\ldots, k\text{;}\) (ii) if \(u\notin span(\{u_1,\ldots, u_k\})\text{,}\) then \(u_{k+1}\neq 0\) and \(\{u_1,\ldots, u_k,u_{k+1}\}\) is an orthogonal set.
- If \(\{u_1,\ldots, u_n\}\) is an orthogonal set of nonzero vectors, then it is linearly independent.
- Find the coordinates of the vector \((2,5,7)\) with respect to an orthonormal basis \(\beta'=\left\{\left(\frac{2}{\sqrt{6}},\frac{1}{\sqrt{6}},\frac{1}{\sqrt{6}}\right), \left(\frac{-1}{\sqrt{3}},\frac{1}{\sqrt{3}},\frac{1}{\sqrt{3}}\right),\left(0,\frac{-1}{\sqrt{2}},\frac{1}{\sqrt{2}}\right)\right\}\) of \(\R^3\text{.}\)
- Use the Gram-Schmidt orthogonalization process to find an orthonormal basis, say, \(\{q_1,q_2,q_3\}\) of \(\R^3\) starting with a basis \(\beta=\{(1,1,1),(-1,1,1),(-1,0,1)\}\text{.}\) Define \(Q = [q_1~q_2~q_3]\text{,}\) the column matrix whose columns are \(q_1, q_2, q_3\text{.}\) Show that \(Q^TQ=I\text{.}\)
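The Gram-Schmidt computation in this exercise can be checked numerically. The following is a short sketch using NumPy (an assumption, not part of the exercise): it orthonormalizes \(\beta=\{(1,1,1),(-1,1,1),(-1,0,1)\}\) and verifies \(Q^TQ=I\text{.}\)

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    qs = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in qs:
            w = w - np.dot(q, w) * q      # subtract the projection onto q
        qs.append(w / np.linalg.norm(w))  # normalize
    return np.column_stack(qs)

beta = [(1, 1, 1), (-1, 1, 1), (-1, 0, 1)]
Q = gram_schmidt(beta)
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: the columns are orthonormal
```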
- Use the Gram-Schmidt orthogonalization process to find an orthonormal basis, say, \(\{q_1,q_2,q_3\}\text{,}\) of the subspace \(W\subset \R^4\) with basis\begin{equation*} \beta = \{ v_1=(-1,1,1,0),v_2=(-1,0,1,0),v_3=(1,0,0,1)\}\text{.} \end{equation*}Define \(Q = [q_1~q_2~q_3]\text{,}\) the matrix whose columns are \(q_1, q_2, q_3\text{.}\) Show that \(Q^TQ=I\text{.}\) Suppose \(A=[v_1~v_2~v_3]\text{.}\) Find \(Q^TA\) and check that it is an upper triangular matrix with positive diagonal entries. If we write \(R=Q^TA\text{,}\) then \(A=QR\) is called the \(QR\)-factorization of \(A\text{.}\)
- The following are equivalent for an \(n\times n\) matrix \(A\text{.}\) (i) \(A\) is orthogonal (ii) \(\norm{Ax}=\norm{x}\) for all \(x\in \R^n\text{.}\) (iii) \(\norm{Ax-Ay}=\norm{x-y}\) for all \(x,y\in \R^n\text{.}\) (iv) \(Ax\cdot Ay = (Ay)^TAx=x\cdot y\) for all \(x,y\in \R^n\text{.}\) {Hint: a matrix \(P\) is orthogonal if it satisfies any one of the above conditions.}
- For the following matrices find an orthogonal matrix \(P\) such that \(P^{-1}AP\) is a diagonal matrix.\begin{equation*} \begin{pmatrix}2 \amp -1 \\-1 \amp 1 \end{pmatrix} , \begin{pmatrix}1 \amp 0 \amp -1\\0 \amp 1 \amp 2\\-1 \amp 2 \amp 5 \end{pmatrix} \end{equation*}
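Answers to this exercise can be verified numerically: for a symmetric matrix, NumPy's `eigh` (an assumed tool, not part of the exercise) returns an orthogonal matrix \(P\) of eigenvectors, so \(P^{-1}AP=P^TAP\) is diagonal. A sketch for the first matrix:

```python
import numpy as np

# Orthogonal diagonalization of a symmetric matrix: eigh returns
# the eigenvalues and an orthogonal matrix P whose columns are eigenvectors.
A = np.array([[2.0, -1.0], [-1.0, 1.0]])
eigvals, P = np.linalg.eigh(A)
D = P.T @ A @ P            # P^{-1} = P^T since P is orthogonal
print(np.allclose(D, np.diag(eigvals)))   # True
```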
- Find the QR-factorization of the following matrices:\begin{equation*} \begin{bmatrix}2 \amp 1 \\ 1 \amp 11 \end{bmatrix} , \begin{bmatrix}1 \amp -1 \amp 1\\ 2 \amp 0 \amp 1\\2 \amp 1 \amp -2 \end{bmatrix} , \begin{bmatrix}1 \amp 1 \amp 0 \\-1 \amp 0 \amp 1\\0 \amp 1 \amp 1\\1 \amp -1 \amp 0 \end{bmatrix} \end{equation*}
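Hand computations here can be checked against `numpy.linalg.qr` (an assumed tool, not part of the exercise). One caveat: NumPy does not guarantee positive diagonal entries in \(R\text{,}\) so the signs may need flipping to match the Gram-Schmidt convention used above. A sketch for the first matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 11.0]])
Q, R = np.linalg.qr(A)
# Flip signs so R has positive diagonal entries, matching Gram-Schmidt.
s = np.sign(np.diag(R))
Q, R = Q * s, s[:, None] * R   # scale columns of Q and rows of R
print(np.allclose(Q @ R, A), np.all(np.diag(R) > 0))   # True True
```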
- Let \(V\) be an inner product space. Then for any two vectors \(x, y\in V\text{,}\) show that\begin{equation*} \norm{x+y}^2=\norm{x}^2+\norm{y}^2+2\inner{x}{y}, \quad \norm{x-y}^2=\norm{x}^2+\norm{y}^2-2\inner{x}{y}\text{.} \end{equation*}
- Let \(x, y\) be two vectors in an inner product space \(V\) with inner product \(\inner{.}{.}\text{.}\) Show that\begin{equation*} \norm{x+y}^2+\norm{x-y}^2=2(\norm{x}^2+\norm{y}^2)\text{.} \end{equation*}This is called the parallelogram identity. Geometrically, in a parallelogram, the sum of the squares of the diagonals equals twice the sum of the squares of the side lengths.
- Let \((V, \inner{.}{.})\) be a real inner product space. Let \(x_1,x_2,\ldots, x_n\) be \(n\) orthogonal vectors in \(V\text{.}\) Then show that\begin{equation*} \norm{x_1+x_2+\cdots+x_n}^2=\norm{x_1}^2+\norm{x_2}^2+\cdots+\norm{x_n}^2\text{.} \end{equation*}This is an extension of the Pythagoras theorem.
- Let \(\beta=\{u_1,\ldots, u_n\}\) be an orthogonal basis of an inner product space \(V\text{.}\) Let \(v\in V\) and let \(\theta_1,\ldots, \theta_n\) be the angles between \(v\) and \(u_1,\ldots, u_n\text{,}\) respectively. Then\begin{equation*} \cos^2\theta_1+\cdots+\cos^2\theta_n=1\text{.} \end{equation*}Here \(\cos\theta_i\) are called the direction cosines of \(v\) corresponding to \(\beta\text{.}\)
- Find the orthogonal projection of vector \(b=\begin{bmatrix}1\\2\\3\\4 \end{bmatrix}\) onto the subspace spanned by three vectors \(\left\{\begin{bmatrix}1\\-1\\0\\1 \end{bmatrix} , \begin{bmatrix}0\\1\\1\\-1 \end{bmatrix} , \begin{bmatrix}1\\1\\-1\\0 \end{bmatrix} \right\}\text{.}\)
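The projection in this exercise can be checked numerically. A sketch using NumPy (an assumption, not part of the exercise): if \(A\) has the spanning vectors as columns, the projection of \(b\) is \(A\hat{x}\) where \(\hat{x}\) solves the normal equations \(A^TAx=A^Tb\text{;}\) the residual \(b-A\hat{x}\) must then be orthogonal to the subspace.

```python
import numpy as np

# Columns of A span the subspace W; lstsq solves the normal equations.
A = np.column_stack([(1, -1, 0, 1), (0, 1, 1, -1), (1, 1, -1, 0)]).astype(float)
b = np.array([1.0, 2.0, 3.0, 4.0])
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
proj = A @ x_hat                           # orthogonal projection of b onto W
print(np.allclose(A.T @ (b - proj), 0))    # True: residual is orthogonal to W
```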
- Consider the standard basis \(\beta=\{1,x,x^2,x^3\}\) of \({\cal P}_3(\R)\) with inner product \(\inner{f}{g}:=\int_0^1 f(x)g(x)\,dx\text{.}\) Find an orthonormal basis starting with \(\beta\) using the Gram-Schmidt orthogonalization process. (Hint: replace the dot product in the Gram-Schmidt process by the inner product.)
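The hint above can be carried out symbolically: the sketch below (using SymPy, an assumption, not part of the exercise) runs Gram-Schmidt on \(\{1,x,x^2,x^3\}\) with the dot product replaced by \(\inner{f}{g}=\int_0^1 f(x)g(x)\,dx\text{.}\)

```python
import sympy as sp

x = sp.symbols('x')
inner = lambda f, g: sp.integrate(f * g, (x, 0, 1))   # <f,g> = integral of f*g on [0,1]

basis, ortho = [1, x, x**2, x**3], []
for p in basis:
    w = p - sum(inner(p, q) * q for q in ortho)       # subtract projections
    ortho.append(sp.expand(w / sp.sqrt(inner(w, w)))) # normalize

print(sp.simplify(inner(ortho[0], ortho[1])))   # 0: orthogonal
print(sp.simplify(inner(ortho[1], ortho[1])))   # 1: unit norm
```

The same loop with the integration limits changed to \((-1, 1)\) handles the next exercise.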
- Consider the basis \(\beta=\{1-x,1+x,1+x+x^2\}\) of \({\cal P}_2(\R)\) with inner product \(\inner{f}{g}:=\int_{-1}^1 f(x)g(x)\,dx\text{.}\) Find an orthonormal basis starting with \(\beta\) using the Gram-Schmidt orthogonalization process.
- (i) Define \(\inner{u}{v}:=u^TAv\) where \(A = \begin{bmatrix}1 \amp 1 \amp 0\\1 \amp 2 \amp 0\\0 \amp 0 \amp 1 \end{bmatrix}\text{.}\) Show that \(\inner{.}{.}\) is an inner product on \(\R^3\text{.}\) (ii) Show that\begin{equation*} \left\{\begin{bmatrix}2 \\-1\\0 \end{bmatrix} , \begin{bmatrix}0 \\1\\1 \end{bmatrix} , \begin{bmatrix}0 \\-1\\2 \end{bmatrix} \right\} \end{equation*}is an orthogonal basis of \(\R^3\) with respect to this inner product. (iii) Consider the basis \(\beta=\{(1,1,1),(-1,1,1),(-1,0,1)\}\) of \(\R^3\text{.}\) Use the Gram-Schmidt orthogonalization process to find an orthonormal basis of \(\R^3\) with respect to the inner product defined in (i).
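Part (ii) reduces to checking that the three pairwise inner products vanish; a quick numerical sketch (using NumPy, an assumption, not part of the exercise):

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0], [1.0, 2.0, 0.0], [0.0, 0.0, 1.0]])
inner = lambda u, v: u @ A @ v   # <u,v> := u^T A v

u1, u2, u3 = map(np.array, [(2.0, -1.0, 0.0), (0.0, 1.0, 1.0), (0.0, -1.0, 2.0)])
# All pairwise inner products vanish, so the set is orthogonal w.r.t. <.,.>.
print(inner(u1, u2), inner(u1, u3), inner(u2, u3))   # 0.0 0.0 0.0
```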