
32.8 Computing Eigenvalues and Eigenvectors


Here we address the following questions:

How can we actually compute eigenvalues and eigenvectors of a given matrix?

An application of eigenvalues: quadratic forms, and the question of when a critical point is a local extremum and when it is a saddle point.

An eigenvector game: learn how to just look at a matrix and guess an eigenvector!

How to compute eigenvalues and eigenvectors for large matrices is an important question in numerical analysis. We will merely scratch the surface for small matrices.

There is an obvious way to look for real eigenvalues of a real matrix: you need only write out its characteristic polynomial, plot it, and find its roots. This is quite easy to do in two dimensions, not difficult in three or four dimensions, and not really difficult for a computer in many more dimensions.
It is very straightforward, and dull.

In two dimensions the characteristic equation is

x^2 - tr(M)x + det(M) = 0

This equation can be solved using the quadratic formula and the eigenvalues can be obtained by explicit formulae.
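For instance, here is a small Python sketch (not from the text; the function name eig2 is ours) that applies the quadratic formula to the trace and determinant of a 2 by 2 matrix:

    import cmath

    def eig2(M):
        # Eigenvalues of a 2 by 2 matrix [[a, b], [c, d]] from its trace and determinant.
        (a, b), (c, d) = M
        tr = a + d
        det = a * d - b * c
        disc = cmath.sqrt(tr * tr - 4 * det)   # complex if the eigenvalues are not real
        return (tr + disc) / 2, (tr - disc) / 2

    print(eig2([[2, 1], [1, 2]]))   # ((3+0j), (1+0j)): eigenvalues 3 and 1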

In three dimensions the characteristic equation is

x^3 - tr(M)x^2 + Ax - det(M) = 0

where A is the sum of the products of pairs of diagonal elements minus the products of opposite pairs of off-diagonal elements:

A = M11 * M22 + M11 * M33 + M22 * M33 - M12 * M21 - M13 * M31 - M23 * M32.

There is a cubic formula for solving this equation, but it is probably easier to find one solution, say z, numerically, whereupon the other two obey the quadratic equation x^2 - (tr(M) - z)x + det(M)/z = 0.

Since the characteristic polynomial is cubic, it takes opposite signs for large positive and large negative arguments, and so by starting at some point in between and homing in (by the divide and conquer approach of bisection) you can find a solution to any desired accuracy with relative ease.
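The following Python sketch carries out this plan for a 3 by 3 matrix; the helper names and the search interval are our own assumptions:

    import cmath

    def char_poly_coeffs(M):
        # Coefficients tr, A, det of x^3 - tr x^2 + A x - det for a 3 by 3 matrix M.
        tr = M[0][0] + M[1][1] + M[2][2]
        A = (M[0][0]*M[1][1] + M[0][0]*M[2][2] + M[1][1]*M[2][2]
             - M[0][1]*M[1][0] - M[0][2]*M[2][0] - M[1][2]*M[2][1])
        det = (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
               - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
               + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))
        return tr, A, det

    def eig3(M, lo=-1e6, hi=1e6, steps=200):
        tr, A, det = char_poly_coeffs(M)
        p = lambda x: ((x - tr) * x + A) * x - det   # the characteristic polynomial
        # Bisection: the cubic has opposite signs at the two ends of a large interval.
        for _ in range(steps):
            mid = (lo + hi) / 2
            lo, hi = (lo, mid) if p(lo) * p(mid) <= 0 else (mid, hi)
        z = (lo + hi) / 2
        # The other two roots obey x^2 - (tr - z) x + det / z = 0 (assuming z != 0).
        s, q = tr - z, det / z
        disc = cmath.sqrt(s * s - 4 * q)
        return z, (s + disc) / 2, (s - disc) / 2

    print(eig3([[2, 1, 0], [1, 2, 0], [0, 0, 3]]))   # approximately 1, 3, 3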

So how do we find an eigenvector given an eigenvalue z? There is a very simple answer that usually works. A column eigenvector can be obtained by taking the cofactors of any row of M-zI and arranging them as a column vector.
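Here is a hedged Python sketch of this recipe for a 3 by 3 matrix (the function name is ours); it takes the cofactors of one row of M - zI:

    def cofactor_eigenvector(M, z, row=0):
        # Cofactors of one row of B = M - zI, arranged as a column eigenvector.
        B = [[M[i][j] - (z if i == j else 0) for j in range(3)] for i in range(3)]
        r1, r2 = [i for i in range(3) if i != row]          # the two remaining rows
        v = []
        for col in range(3):
            c1, c2 = [j for j in range(3) if j != col]      # the two remaining columns
            minor = B[r1][c1] * B[r2][c2] - B[r1][c2] * B[r2][c1]
            v.append((-1) ** (row + col) * minor)
        return v

    # With eigenvalue z = 1 of [[2,1,0],[1,2,0],[0,0,3]] this gives [2, -2, 0],
    # and indeed M times [2, -2, 0] is 1 times [2, -2, 0].
    print(cofactor_eigenvector([[2, 1, 0], [1, 2, 0], [0, 0, 3]], 1))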

Exercises:

32.5 When will this approach fail?

32.6 Prove that if the cofactors don't all vanish they provide a column eigenvector.

32.7 Choose a random 3 by 3 matrix and find an eigenvalue and corresponding eigenvector.

There are other ways to find eigenvectors and eigenvalues that often work.
One approach is to raise the matrix to a high power. This is easier to do than it sounds. You can then notice that the high power of the matrix will tend to have rank 1, usually, and you can read off a row and a column eigenvector from it. You can easily deduce the corresponding eigenvalue by having the matrix act on the eigenvector you find.
If there is an eigenvalue that has greater magnitude than any other and it has only one eigenvector (it is not a multiple root of the characteristic equation for M), then this method will usually find it. You can apply the same approach to the inverse of M to find the eigenvalue smallest in magnitude and its eigenvector.
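A rough NumPy sketch of this repeated-squaring idea (assuming, as above, a single dominant eigenvalue):

    import numpy as np

    M = np.array([[2.0, 1.0], [1.0, 2.0]])
    P = M.copy()
    for _ in range(6):                 # six squarings give M to the 64th power
        P = P @ P
        P = P / np.abs(P).max()        # rescale to keep the entries from overflowing

    # P is now essentially rank 1: any nonzero column is a column eigenvector
    # and any nonzero row is a row eigenvector of M.
    v = P[:, 0]
    lam = (M @ v)[0] / v[0]            # let M act on v to read off the eigenvalue
    print(lam, v / np.linalg.norm(v))  # about 3.0 and the unit vector (0.707, 0.707)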

These actions are relatively easy on Excel spreadsheets, because these have functions that take the product of two matrices (called mmult), find the inverse of a matrix (minverse), and take the determinant of a matrix (mdeterm). Using mmult it is quite easy to square a matrix; copy the procedure to raise it to the fourth power, copy both procedures to raise it to the eighth and then sixteenth power, copy the whole mess to raise it to the 256th power, and so on.

For a four by four matrix, once you have two eigenvalues you can get the rest by solving quadratics, and you can usually get the largest and smallest in magnitude by raising M and M^-1 to high powers.
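One way to write that quadratic, sketched in Python (the function name is ours): the four eigenvalues sum to tr(M) and multiply to det(M), so once the largest and smallest are known the other two are determined.

    import cmath

    def middle_two(tr, det, lam1, lam4):
        # Remaining eigenvalues of a 4 by 4 matrix, given the largest (lam1) and
        # smallest (lam4): their sum is tr - lam1 - lam4, their product det/(lam1*lam4).
        s = tr - lam1 - lam4
        p = det / (lam1 * lam4)
        disc = cmath.sqrt(s * s - 4 * p)
        return (s + disc) / 2, (s - disc) / 2

    # Example: a diagonal matrix with entries 4, 3, 2, 1 has tr = 10 and det = 24;
    # given the eigenvalues 4 and 1 this recovers 3 and 2.
    print(middle_two(10, 24, 4, 1))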

Of course once you have the eigenvalue that is largest in magnitude, you could look for the second largest. This can be accomplished by projecting the columns of M to vectors normal to the first row eigenvector, and working with the resulting matrix.

Of course you run into trouble with this approach if there are two eigenvalues with the same (or nearly the same) largest magnitude.
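A minimal NumPy sketch of the projection (deflation) step described above, assuming the eigenvalues are distinct; the function name is ours:

    import numpy as np

    def deflate(M, w):
        # Project the columns of M onto the subspace orthogonal to the dominant
        # row (left) eigenvector w; the other eigenvalues of M survive unchanged.
        w = np.asarray(w, dtype=float)
        P = np.eye(len(w)) - np.outer(w, w) / (w @ w)
        return P @ M

    M = np.array([[2.0, 1.0], [1.0, 2.0]])
    # For this symmetric M the dominant row eigenvector is (1, 1) with eigenvalue 3;
    # the deflated matrix has eigenvalues 1 and 0, so power iteration on it finds 1.
    print(deflate(M, [1.0, 1.0]))      # [[ 0.5 -0.5] [-0.5  0.5]]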

How do we find the matrix A which we can use to diagonalize M?
The columns of A are normalized eigenvectors of M; then A^-1MA is a diagonal matrix whose diagonal entries are the corresponding eigenvalues.
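A quick NumPy check of this statement (numpy.linalg.eig already returns normalized eigenvectors as the columns of its second output):

    import numpy as np

    M = np.array([[2.0, 1.0], [1.0, 2.0]])
    vals, A = np.linalg.eig(M)          # columns of A are normalized eigenvectors of M
    D = np.linalg.inv(A) @ M @ A        # A^-1 M A
    print(np.round(D, 10))              # diagonal matrix with the eigenvalues 3 and 1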