We have noted that our first question has a number of variants, and we will
note how the answers change when the variants are used. When we allow complex
matrix entries, and complex vectors, we can diagonalize a wider class of matrices.

For complex vectors the dot product is defined with a complex conjugate taken on
the row vector. Thus the dot product of the column vector with entries
(a + ib, c + id) with the corresponding row vector is

    (a - ib)(a + ib) + (c - id)(c + id) = a² + b² + c² + d².

The dot product of the same column vector with the row vector formed from
(e + if, g + ih) is instead

    (e - if)(a + ib) + (g - ih)(c + id).

Notice that with this definition the dot product is no longer symmetric. However,
it does not change if you interchange row and column and also take the complex
conjugate, since the asymmetry lies in taking the complex conjugate of the row
and not the column.

Again we can ask: what matrices can be diagonalized by a unitary transformation?
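As a concrete sketch, the conjugated dot product above can be written in a few
lines of Python (the helper name cdot is my own, not from the text):

```python
def cdot(row, col):
    """Dot product with the complex conjugate taken on the row vector."""
    return sum(r.conjugate() * c for r, c in zip(row, col))

u = [1 + 2j, 3 + 4j]   # (a + ib, c + id) with a=1, b=2, c=3, d=4
v = [5 + 6j, 7 + 8j]

# Pairing u with itself gives a² + b² + c² + d² = 1 + 4 + 9 + 16 = 30.
print(cdot(u, u))      # (30+0j)

# Not symmetric: swapping the arguments conjugates the result instead.
print(cdot(u, v))      # (70-8j)
print(cdot(v, u))      # (70+8j)
```

The asymmetry is visible in the last two lines: interchanging the vectors and
conjugating the result returns the original value, exactly as claimed above.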
A preliminary question is: which matrices can be diagonalized so that their
eigenvalues, which are what appear on the diagonal when the matrix is
diagonalized, are all real? The answer now is that any matrix that is its own
conjugate transpose has this property: if such an M is n by n, then M has n real
eigenvalues and an orthonormal basis of eigenvectors. Such matrices are called
Hermitian matrices.

Again the necessity of this condition follows from the fact that Hermiticity
is preserved by unitary transformations and real diagonal matrices are Hermitian.
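A small numerical check of this answer, assuming NumPy is available (the
particular matrix is a made-up example):

```python
import numpy as np

# A matrix equal to its own conjugate transpose, i.e. Hermitian.
M = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(M, M.conj().T)

# eigh diagonalizes a Hermitian matrix by a unitary transformation:
# M = U diag(w) U†, with the eigenvalues w all real.
w, U = np.linalg.eigh(M)
print(w)                                        # [1. 4.]  (all real)
assert np.allclose(U.conj().T @ U, np.eye(2))   # columns are orthonormal
assert np.allclose(U @ np.diag(w) @ U.conj().T, M)
```

The assertions confirm both parts of the claim: the eigenvalues are real, and
the eigenvectors (the columns of U) form an orthonormal basis.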
The answer to the general question, without reference to real eigenvalues, is
that the matrix must commute with its conjugate transpose; such matrices are
called normal. This condition is again preserved under unitary transformations,
and it is a property of diagonal matrices, since all diagonal matrices commute
with one another, so it is definitely necessary.

There is an easy answer to the question of which matrices have a basis of
eigenvectors, and it too can easily be seen to be necessary. Suppose
a1, a2, ..., ak are the distinct eigenvalues
of M. Any vector can be written as a sum of basis vectors. If each basis vector
is an eigenvector of M, say corresponding to eigenvalue aj, then M - ajI acting
on it will be the zero vector. On the other hand M - ahI acting on
it merely multiplies it by aj - ah. Thus, if there is a
basis consisting of eigenvectors of M then the product over all j from 1 to k
of (M - ajI) must be the zero matrix, since it must give 0 in acting
on every basis vector. The corresponding polynomial in a variable x, the product
over all j from 1 to k of (x - aj), is called the minimal polynomial of M, and
the statement that the product of the matrices (M - ajI) is the zero matrix is
called the minimal equation for M.

Conversely, if M obeys this minimal equation, that is, if the product of the
(M - ajI) over the distinct eigenvalues is the zero matrix, then M has a basis
of eigenvectors.

By the way, an interesting and curious fact is that every matrix also obeys its
own characteristic equation: if you substitute M for the variable x in the
characteristic polynomial, you get the zero matrix. This is the Cayley-Hamilton
theorem.
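Both facts can be sketched numerically in NumPy with a made-up 2 by 2 example,
a diagonalizable matrix with distinct eigenvalues 1 and 2:

```python
import numpy as np

# M = P diag(1, 2) P⁻¹ is diagonalizable with distinct eigenvalues 1 and 2.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
M = P @ np.diag([1.0, 2.0]) @ np.linalg.inv(P)

I = np.eye(2)

# Minimal equation: the product of (M - ajI) over the distinct
# eigenvalues aj is the zero matrix.
product = (M - 1 * I) @ (M - 2 * I)
assert np.allclose(product, 0)

# Cayley-Hamilton: M obeys its characteristic equation,
# here x² - (tr M) x + (det M) = 0.
ch = M @ M - np.trace(M) * M + np.linalg.det(M) * I
assert np.allclose(ch, 0)
```

For this M both products vanish; for a 2 by 2 matrix with distinct eigenvalues
the minimal and characteristic polynomials coincide, so the two checks agree.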