
Section 5.3 Complex Eigenvalues

So far we have only considered eigenvalues that are real numbers. However, there are matrices whose characteristic polynomial has no real root. For example, the characteristic polynomial of \(\begin{pmatrix}0 \amp -1\\1 \amp 0\end{pmatrix}\) is \(x^2+1\text{,}\) which has no real root. In this section, we shall consider such matrices. In particular, we shall allow both real and complex numbers as eigenvalues. All the concepts that we have seen in the last two sections can be extended to complex eigenvalues as well. We assume that readers are familiar with the basic notion of complex numbers and some of their standard properties.

Example 5.3.1.

Consider the matrix \(A =\begin{pmatrix}1 \amp -1 \\1 \amp 1\end{pmatrix}\text{.}\) It is easy to see that the characteristic polynomial of \(A\) is \(\det{(x I-A)}=x^2-2x+2\text{,}\) whose roots are \(\lambda_1=1+i, \lambda_2=1-i\text{.}\)
Let us find eigenvectors corresponding to the eigenvalue \(\lambda_1=1+i\text{.}\) It is natural to expect that eigenvectors will also be vectors with complex entries. Let \(v=\begin{pmatrix}z_1\\z_2\end{pmatrix}\) be an eigenvector corresponding to \(\lambda_1=1+i\text{.}\) Then
\begin{equation*} \begin{pmatrix}1 \amp -1 \\1 \amp 1\end{pmatrix}\begin{pmatrix}z_1\\z_2\end{pmatrix} = (1+i)\begin{pmatrix}z_1\\z_2\end{pmatrix}. \end{equation*}
Solving the above equations, it is easy to see that \(z_2=-iz_1\text{.}\) Hence one solution can be taken as \(v_1 =\begin{pmatrix}1\\-i\end{pmatrix}\text{.}\)
Similarly, we can check that an eigenvector corresponding to the eigenvalue \(\lambda_2=1-i\) is \(v_2 =\begin{pmatrix}1\\i\end{pmatrix}\text{.}\)
Let us define
\begin{equation*} P = [v_1~v_2]=\begin{pmatrix} 1 \amp 1 \\-i\amp i\end{pmatrix}. \end{equation*}
Then
\begin{equation*} P^{-1} =\begin{pmatrix} 1/2 \amp i/2 \\1/2\amp -i/2\end{pmatrix} \end{equation*}
and
\begin{equation*} P^{-1}AP =\begin{pmatrix} 1+i \amp 0\\0 \amp 1-i\end{pmatrix}. \end{equation*}
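As a quick numerical check (sketched here in Python with NumPy rather than the Sage used elsewhere in this section), we can verify that conjugating \(A\) by \(P\) produces the diagonal matrix of eigenvalues:

```python
import numpy as np

# Matrix A and eigenvector matrix P from Example 5.3.1.
A = np.array([[1, -1], [1, 1]], dtype=complex)
P = np.array([[1, 1], [-1j, 1j]])

# P^{-1} A P should be the diagonal matrix diag(1+i, 1-i).
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))
```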

Example 5.3.2.

Let \(A= \left(\begin{array}{rrr} -1 \amp -2 \amp 1 \\ 3 \amp 3 \amp 1 \\ -4 \amp -4 \amp 3 \end{array}\right)\text{.}\) Let us find the eigenvalues and eigenvectors of \(A\text{.}\)
Solution.
The characteristic polynomial of \(A\) is
\begin{equation*} \det{(xI-A)}=x^{3} - 5x^{2} + 17x - 13=(x - 1) \cdot (x^{2} - 4x + 13). \end{equation*}
Hence the eigenvalues are \(\lambda_1=1,\lambda_2=2-3i, \lambda_3=2+3i\text{.}\)
Let us find an eigenvector corresponding to the eigenvalue \(\lambda_1=1\text{.}\) We shall find \({\rm ker}(A-\lambda_1 I)={\rm ker}(A-I)\text{.}\)
\begin{equation*} A-I = \left(\begin{array}{rrr} -2 \amp -2 \amp 1 \\ 3 \amp 2 \amp 1 \\ -4 \amp -4 \amp 2 \end{array}\right) \xrightarrow{ \begin{array}{c} RREF \end{array}} \left(\begin{array}{rrr} 1 \amp 0 \amp 2 \\ 0 \amp 1 \amp -\frac{5}{2} \\ 0 \amp 0 \amp 0 \end{array}\right). \end{equation*}
Clearly \({\rm ker}(A-I)=\{\alpha(1, -5/4, -1/2):\alpha \in \R\}\text{.}\) Hence we can take the eigenvector \(v_1=(1, -5/4, -1/2)\text{.}\)
Next we find an eigenvector corresponding to \(\lambda_2=2-3i\text{.}\)
\begin{align*} A-(2-3i)I \amp = \left(\begin{array}{rrr} 3 i - 3 \amp -2 \amp 1 \\ 3 \amp 3 i + 1 \amp 1 \\ -4 \amp -4 \amp 3 i + 1 \end{array}\right)\\ \amp \xrightarrow{ \begin{array}{c} RREF \end{array}} \left(\begin{array}{rrr} 1 \amp 0 \amp -\frac{1}{2} \\ 0 \amp 1 \amp -\frac{3}{4} i + \frac{1}{4} \\ 0 \amp 0 \amp 0 \end{array}\right). \end{align*}
It is easy to check that \(v_2=\left(1, - \frac{1}{2}+\frac{3}{2} i,2\right)\) is an eigenvector corresponding to \(\lambda_2\text{.}\)
Similarly, \(v_3=\left(1, - \frac{1}{2}-\frac{3}{2} i,2\right)\) is an eigenvector corresponding to the eigenvalue \(\lambda_3=2+3i\text{.}\)
Define
\begin{equation*} P=[v_1~v_2~v_3]=\left(\begin{array}{rrr} 1 \amp 1 \amp 1 \\ -\frac{5}{4} \amp \frac{3}{2} i - \frac{1}{2} \amp -\frac{3}{2} i - \frac{1}{2} \\ -\frac{1}{2} \amp 2 \amp 2 \end{array}\right). \end{equation*}
Then
\begin{equation*} P^{-1}=\left(\begin{array}{rrr} \frac{4}{5} \amp 0 \amp -\frac{2}{5} \\ -\frac{11}{30} i + \frac{1}{10} \amp -\frac{1}{3} i \amp \frac{1}{10} i + \frac{1}{5} \\ \frac{11}{30} i + \frac{1}{10} \amp \frac{1}{3} i \amp -\frac{1}{10} i + \frac{1}{5} \end{array}\right). \end{equation*}
It is easy to check that
\begin{equation*} P^{-1}AP=\left(\begin{array}{rrr} 1 \amp 0 \amp 0 \\ 0 \amp 2-3 i \amp 0 \\ 0 \amp 0 \amp 2+3 i \end{array}\right). \end{equation*}
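As a numerical check of this example (sketched in Python with NumPy rather than Sage), we can assemble \(P\) from the eigenvectors and verify that \(P^{-1}AP\) is the expected diagonal matrix:

```python
import numpy as np

# Matrix A and eigenvectors v1, v2, v3 from Example 5.3.2.
A = np.array([[-1, -2, 1], [3, 3, 1], [-4, -4, 3]], dtype=complex)
v1 = np.array([1, -5/4, -1/2])
v2 = np.array([1, -1/2 + 3j/2, 2])
v3 = np.array([1, -1/2 - 3j/2, 2])

# P = [v1 v2 v3]; then P^{-1} A P = diag(1, 2-3i, 2+3i).
P = np.column_stack([v1, v2, v3])
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))
```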

Exercises

1.

Let \(A=\left(\begin{array}{rr} 3 \amp -1 \\ 4 \amp 3 \end{array}\right)\text{.}\) Find the eigenvalues and eigenvectors of \(A\) and hence diagonalize it.

2.

Diagonalize the matrix \(A=\left(\begin{array}{rrr} -18 \amp -10 \amp 9 \\ 25 \amp 10 \amp 0 \\ -16 \amp -20 \amp 8 \end{array}\right)\text{.}\)
From the theory of equations, we know that complex roots of a polynomial with real coefficients occur in conjugate pairs. That is, if \(z=a+ib\) is a root of a polynomial \(p(x)\) with real coefficients, then \(\overline{z}=a-ib\) is also a root of \(p(x)\text{.}\) This leads to the following theorem.

Theorem 5.3.3.

Let \(A\) be a real matrix with a complex eigenvalue \(\lambda\) and corresponding eigenvector \(v\text{.}\) Then \(\overline{\lambda}\) is also an eigenvalue of \(A\text{,}\) with corresponding eigenvector \(\overline{v}\text{.}\)

Proof.

Since \(A\) is real, we have \(\overline{A}=A\text{,}\) that is, the conjugate of \(A\) is the same as \(A\text{.}\) Hence
\begin{equation*} A\overline{v}=\overline{A}\overline{v}=\overline{Av}=\overline{\lambda v}= \overline{\lambda} \overline{v}. \end{equation*}
Hence \(\overline{v}\) is an eigenvector of \(A\) corresponding to the eigenvalue \(\overline{\lambda}\text{.}\)
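This conjugate-pair behaviour is easy to observe numerically. The following Python/NumPy sketch (an assumption; the text itself uses Sage) checks the identity \(A\overline{v}=\overline{\lambda}\,\overline{v}\) for every eigenpair of a real matrix:

```python
import numpy as np

# Any real matrix works; here the matrix of Example 5.3.2.
A = np.array([[-1.0, -2, 1], [3, 3, 1], [-4, -4, 3]])
lam, V = np.linalg.eig(A)

# For each eigenpair (lambda, v) of the real matrix A, the conjugate
# pair is again an eigenpair: A v-bar = lambda-bar v-bar.
for l, v in zip(lam, V.T):
    print(np.allclose(A @ v.conj(), l.conj() * v.conj()))
```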

Theorem 5.3.4.

Let \(A\) be a real matrix. (1) If \(A\) is symmetric, then every eigenvalue of \(A\) is real. (2) If \(A\) is skew-symmetric, then every eigenvalue of \(A\) is either purely imaginary or zero.

Proof.

(1) Let \(\lambda\) be an eigenvalue of \(A\) and \(v\) a corresponding eigenvector of \(A\text{.}\) Then by definition \(Av=\lambda v\text{.}\) Multiplying both sides by \(\overline{v}^T\) (the conjugate transpose of the vector \(v\)), we get
\begin{equation*} \overline{v}^TAv=\lambda\overline{v}^Tv \Longrightarrow \lambda=\dfrac{\overline{v}^TAv}{\overline{v}^Tv}\text{.} \end{equation*}
Since \(\overline{v}^Tv=\norm{v}^2\) is a real number, the nature of \(\lambda\) is determined by \(\overline{v}^TAv\text{.}\) Now
\begin{align*} \overline{\left(\overline{v}^TAv\right)}\amp =v^T\overline{A}\overline{v}\\ \amp=v^TA\overline{v} \quad \text{ since }A \text{ is real}\\ \amp=v^TA^T\overline{v} \quad \text{ since } A^T=A\\ \amp=(Av)^T\overline{v} \\ \amp=\overline{v}^TAv. \end{align*}
This implies that \(\overline{v}^TAv\) is a real number and hence \(\lambda\) is a real number.
(2) Now if \(A\) is a skew-symmetric matrix, then it is easy to show that \(\overline{(\overline{v}^TAv)}=-(\overline{v}^TAv)\text{.}\) Hence \(\overline{v}^TAv\) is either purely imaginary or zero, which shows that \(\lambda\) is either purely imaginary or zero.
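Both parts of the theorem can be checked numerically. The Python/NumPy sketch below (with a randomly generated matrix as an assumption) builds a real symmetric and a real skew-symmetric matrix and inspects their eigenvalues:

```python
import numpy as np

# Build a real symmetric and a real skew-symmetric matrix
# from an arbitrary (randomly generated) real matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
S = M + M.T   # symmetric:      S^T =  S
K = M - M.T   # skew-symmetric: K^T = -K

# Symmetric: every eigenvalue is real.
print(np.allclose(np.linalg.eigvals(S).imag, 0))
# Skew-symmetric: every eigenvalue is purely imaginary or zero.
print(np.allclose(np.linalg.eigvals(K).real, 0))
```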

Example 5.3.5.

Consider the matrix \(A=\begin{pmatrix}a \amp -b \\b\amp a\end{pmatrix}\text{.}\) What are the eigenvalues of \(A\text{?}\) What does \(A\) do to any vector geometrically?
Solution.
The characteristic polynomial of \(A\) is \(\lambda^2-2a\lambda+(a^2+b^2)\text{.}\) Hence the eigenvalues are \(a\pm ib\text{.}\) You can check that the corresponding eigenvectors are \(\begin{pmatrix}1\\-i\end{pmatrix}\) and \(\begin{pmatrix}1\\i\end{pmatrix}\) respectively.
Define \(r=\sqrt{a^2+b^2}=\sqrt{\det{(A)}}\text{,}\) the length of each column of \(A\text{.}\) Then
\begin{equation*} \left(\frac{a}{r}\right)^2+\left(\frac{b}{r}\right)^2=1. \end{equation*}
Hence \((a/r,b/r)\) lies on the unit circle. Therefore, there exists \(\theta\) such that \((a/r,b/r)=(\cos\theta,\sin\theta)\text{.}\) That is, \((a,b)=r(\cos\theta,\sin\theta)\text{.}\) Thus we have
\begin{equation*} A = r\begin{pmatrix}\frac{a}{r} \amp -\frac{b}{r} \\\frac{b}{r}\amp \frac{a}{r}\end{pmatrix}= r\begin{pmatrix}\cos\theta \amp -\sin\theta \\\sin\theta\amp \cos\theta\end{pmatrix} =\begin{pmatrix}r \amp 0\\0\amp r\end{pmatrix} \begin{pmatrix}\cos\theta \amp -\sin\theta \\\sin\theta\amp \cos\theta\end{pmatrix}. \end{equation*}
Hence, geometrically, \(A\) is rotation by an angle \(\theta\) anticlockwise followed by scaling by \(r\text{.}\) Needless to say, we can scale first and then rotate as well.
Let us explore this using Sage. You may change the matrix \(M\) and see what happens to the image of a unit square under \(M\text{.}\)
A matrix of the form \(\begin{pmatrix}a \amp -b \\b\amp a\end{pmatrix}\) is called a rotation-scaling matrix.
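The decomposition above can be computed directly from \(a\) and \(b\text{.}\) The following Python/NumPy sketch (with hypothetical values \(a=3, b=4\)) recovers \(r\) and \(\theta\) and checks the factorization:

```python
import numpy as np

# Hypothetical entries a, b of a rotation-scaling matrix.
a, b = 3.0, 4.0
A = np.array([[a, -b], [b, a]])

r = np.hypot(a, b)        # r = sqrt(a^2 + b^2) = sqrt(det A)
theta = np.arctan2(b, a)  # angle with (cos theta, sin theta) = (a/r, b/r)

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A = scaling by r composed with rotation by theta.
print(np.allclose(A, r * R))
```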
Imaginary eigenvalues
Suppose \(A\) is a \(2\times 2\) real matrix with eigenvalue \(\lambda=a+ib\) which is imaginary, that is, \(b\neq 0\text{.}\) Let \(v=\begin{pmatrix} x+iy\\z+iw\end{pmatrix}\) be an eigenvector corresponding to \(\lambda\text{,}\) that is, \(Av=\lambda v\text{.}\) Then we can write
\begin{equation*} v=\begin{pmatrix} x+iy\\z+iw\end{pmatrix}=\begin{pmatrix} x\\z \end{pmatrix} +i \begin{pmatrix} y\\w \end{pmatrix}={\rm Re}(v)+i {\rm Im}(v). \end{equation*}
Suppose, to the contrary, that \({\rm Re}(v)\) and \({\rm Im}(v)\) are linearly dependent, say \({\rm Re}(v)=c\,{\rm Im}(v)\) for some nonzero real number \(c\text{.}\) We have
\begin{align*} Av \amp = \lambda v \\ \amp = (a+ib)({\rm Re}(v)+i\,{\rm Im}(v))\\ \amp=(a {\rm Re}(v)-b {\rm Im}(v)) + i(a {\rm Im}(v)+b {\rm Re}(v)). \end{align*}
On the other hand
\begin{equation*} Av=A({\rm Re}(v)+i {\rm Im}(v))=A{\rm Re}(v)+i A{\rm Im}(v). \end{equation*}
Comparing the real and imaginary parts we get
\begin{align} A({\rm Re}(v)) \amp= a {\rm Re}(v)-b {\rm Im}(v)\tag{5.3.1}\\ A({\rm Im}(v))\amp = a {\rm Im}(v)+b {\rm Re}(v). \tag{5.3.2} \end{align}
Now using the assumption \({\rm Re}(v)=c{\rm Im}(v)\) in (5.3.1) we get
\begin{align*} A (c {\rm Im}(v)) \amp =a c {\rm Im}(v) - b {\rm Im}(v)\\ \implies c(a {\rm Im}(v)+b {\rm Re}(v)) \amp=(ac-b) {\rm Im}(v)\\ \implies c(a {\rm Im}(v)+bc {\rm Im}(v)) \amp=(ac-b) {\rm Im}(v) \\ \implies (ac {\rm Im}(v)+bc^2 {\rm Im}(v)) \amp=(ac-b) {\rm Im}(v). \end{align*}
Since \(v\neq 0\) and \({\rm Re}(v)=c\,{\rm Im}(v)\text{,}\) we must have \({\rm Im}(v)\neq 0\text{.}\) Comparing coefficients of \({\rm Im}(v)\text{,}\) we get \(ac+bc^2=ac-b\text{,}\) that is, \(bc^2=-b\text{.}\) Further, since \(b\neq 0\text{,}\) we get \(c^2=-1\text{,}\) a contradiction, as \(c\) is real. This proves that \({\rm Re}(v), {\rm Im}(v)\) are linearly independent.
From (5.3.1), we get
\begin{equation*} A({\rm Re}(v)) = a {\rm Re}(v)-b {\rm Im}(v)=\begin{pmatrix} ax-by\\az-bw \end{pmatrix}. \end{equation*}
Similarly from (5.3.2), we get
\begin{equation*} A({\rm Im}(v)) = a {\rm Im}(v)+b {\rm Re}(v)= \begin{pmatrix} bx+ay\\bz+aw\end{pmatrix}. \end{equation*}
Hence we get
\begin{align*} A [{\rm Re}(v)~~{\rm Im}(v)] \amp = \begin{pmatrix} ax-by \amp bx+ay \\az-bw \amp bz+aw \end{pmatrix}\\ \amp = \begin{pmatrix} x \amp y \\z \amp w\end{pmatrix} \begin{pmatrix} a \amp b \\-b \amp a\end{pmatrix}. \end{align*}
Define
\begin{equation*} P: =\begin{pmatrix} x \amp y \\z \amp w\end{pmatrix} \quad\text{and}\quad B:=\begin{pmatrix} a \amp b \\-b \amp a\end{pmatrix}. \end{equation*}
Since \({\rm Re}(v), {\rm Im}(v)\) are linearly independent, \(P\) is nonsingular. Hence we have
\begin{equation} A = PBP^{-1}.\tag{5.3.3} \end{equation}
Note that the matrix \(B\) is a rotation-scaling matrix and that \(\det{(B)}=a^2+b^2=|\lambda|^2\text{.}\)
Thus we have the following result: if \(A\) is a real \(2\times 2\) matrix with an imaginary eigenvalue \(\lambda=a+ib\) (\(b\neq 0\)) and corresponding eigenvector \(v\text{,}\) then \(A=PBP^{-1}\text{,}\) where \(P=[{\rm Re}(v)~~{\rm Im}(v)]\) and \(B=\begin{pmatrix} a \amp b\\-b \amp a\end{pmatrix}\text{.}\)

Example 5.3.8.

Let \(A=\left(\begin{array}{rr} -3 \amp -8 \\ 4 \amp 5 \end{array}\right)\text{.}\) Find the eigenvalues and the corresponding eigenvector. Hence find the matrix \(P\) and \(B\) and show that \(A=PBP^{-1}\text{.}\)
Solution.
The characteristic polynomial of \(A\) is \(p(x)=x^{2} - 2 x + 17\text{.}\) Hence one of the eigenvalues is \(\lambda=a+ib=1-4i\) and the corresponding eigenvector is \(v=\left(1,-\frac{1}{2} + \frac{1}{2}i\right)\text{.}\) Hence we have
\begin{equation*} B = \begin{pmatrix} a \amp b \\-b \amp a\end{pmatrix}= \left(\begin{array}{rr} 1 \amp -4 \\ 4 \amp 1 \end{array}\right),\quad P = \left(\begin{array}{rr} 1 \amp 0 \\ -\frac{1}{2} \amp \frac{1}{2} \end{array}\right). \end{equation*}
It is easy to check that
\begin{equation*} PBP^{-1}=A. \end{equation*}
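The claim \(PBP^{-1}=A\) can be checked numerically; the Python/NumPy sketch below (an assumption; the text uses Sage) builds \(P\) and \(B\) from the eigenpair stated in the solution:

```python
import numpy as np

# Data from Example 5.3.8: eigenvalue 1 - 4i, eigenvector (1, -1/2 + i/2).
A = np.array([[-3.0, -8], [4, 5]])
lam = 1 - 4j
v = np.array([1, -1/2 + 1j/2])

a, b = lam.real, lam.imag
P = np.column_stack([v.real, v.imag])   # P = [Re(v)  Im(v)]
B = np.array([[a, b], [-b, a]])         # rotation-scaling matrix

print(np.allclose(A, P @ B @ np.linalg.inv(P)))
```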
Geometric transformation of \(A=\left(\begin{array}{rr} -3 \amp -8 \\ 4 \amp 5 \end{array}\right)\)
Let us use Sage to demonstrate how the matrix \(A=PBP^{-1}\) transforms a house-like image in \(\R^2\text{.}\)
Complex eigenvalues of \(3\times 3\) matrices.
Now let us see what happens if we take a \(3\times 3\) real matrix which has a complex eigenvalue, say \(\lambda_1=a+ib\text{.}\) We know that in this case \(\lambda_2=\overline{\lambda_1}=a-ib\) is another eigenvalue, and \(A\) also has a real eigenvalue, say \(\lambda_3\text{.}\) Let \(v_1, v_2\) and \(v_3\) be the corresponding eigenvectors of \(A\text{.}\) Since the eigenvalues are distinct, this matrix is diagonalizable. In this case it turns out that we can follow a similar procedure to show that \(A=PBP^{-1}\text{,}\) where \(P=\begin{bmatrix}{\rm Re}(v_1)\amp {\rm Im}(v_1) \amp v_3\end{bmatrix}\) and \(B = \begin{pmatrix}{\rm Re}(\lambda_1) \amp {\rm Im}(\lambda_1)\amp 0\\ -{\rm Im}(\lambda_1) \amp {\rm Re}(\lambda_1)\amp 0\\ 0 \amp 0 \amp \lambda_3\\ \end{pmatrix}\text{.}\)
Let us solve this problem in Sage.
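In place of the Sage cell, here is a Python/NumPy sketch of the same construction, using the matrix of Example 5.3.2 with \(\lambda_1=2-3i\) (an assumption; any real \(3\times 3\) matrix with a complex eigenvalue works):

```python
import numpy as np

# Matrix of Example 5.3.2: eigenvalues 1 and 2 +/- 3i.
A = np.array([[-1.0, -2, 1], [3, 3, 1], [-4, -4, 3]])
lam1 = 2 - 3j                               # complex eigenvalue
v1 = np.array([1, -1/2 + 3j/2, 2])          # eigenvector for lam1
lam3 = 1.0                                  # real eigenvalue
v3 = np.array([1, -5/4, -1/2])              # eigenvector for lam3

# P = [Re(v1)  Im(v1)  v3], B = block rotation-scaling form.
P = np.column_stack([v1.real, v1.imag, v3])
B = np.array([[lam1.real, lam1.imag, 0],
              [-lam1.imag, lam1.real, 0],
              [0, 0, lam3]])

print(np.allclose(A, P @ B @ np.linalg.inv(P)))
```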