One of the key reasons my class was taught Linear Algebra, including how to compute the Eigenvalues and Eigenvectors of Matrices, was so that we could Diagonalize Symmetric Matrices over the Real Numbers. Doing so meant computing the ‘Perpendicular Matrix’ of a given matrix, each column of which was one of its Eigenvectors, and which was an example of an Orthogonal Matrix. (It might be the case that what was once referred to as a Perpendicular Matrix is now referred to as an Orthogonal Basis of the given matrix?)

(Edit 07/04/2018 :

In fact, what we were taught is now referred to as the Eigendecomposition of a matrix. )

Having computed the perpendicular matrix P of M, it was known that the matrix product

P^{T} M P = D,

which gives a Diagonal Matrix ‘D’. But a key problem my Elementary Linear Algebra class was not taught to solve was what to do if ‘M’ had complex Eigenvalues. To be taught that, we would first have needed to be taught, in general, how to combine Linear Algebra with Complex Numbers. After that, the Eigenvectors could have been computed as easily as before, using Gauss-Jordan Elimination.
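As a quick sketch of the real-symmetric case (assuming NumPy is available; the matrix ‘M’ here is my own example), `np.linalg.eigh` returns the Eigenvalues together with an Orthogonal Matrix of Eigenvectors, playing the role of the ‘Perpendicular Matrix’ P described above:

```python
import numpy as np

# A small symmetric matrix, so its Eigenvalues are Real.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns Eigenvalues in ascending order, and a matrix P whose
# columns are orthonormal Eigenvectors (the "perpendicular matrix").
eigenvalues, P = np.linalg.eigh(M)

D = P.T @ M @ P                     # P^T M P = D

print(np.round(D, 10))              # a Diagonal Matrix of the Eigenvalues
print(np.allclose(P.T @ P, np.eye(2)))  # P is Orthogonal: True
```

Here the Eigenvalues come out as 1 and 3, and D is diag(1, 3).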

I have brushed up on this in my old Linear Algebra textbook, whose last chapter writes about Complex Numbers. Key facts which need to be understood about Complex Vector Spaces are:

- The Inner Product needs to be computed differently from before, in a way that borrows from the fact that complex numbers naturally have conjugates. It is now the sum, over the elements, of each element of one vector multiplied by the conjugate of the corresponding element of the other vector.
- Orthogonal and Symmetric Matrices are relatively unimportant with Complex Elements.
- A special operation is defined for matrices, called the Conjugate Transpose, A^{*}.
- A Unitary Matrix now replaces the Orthogonal Matrix, such that A^{-1} = A^{*}.
- A Hermitian Matrix now replaces the Symmetric Matrix, such that A = A^{*}, and the elements along the main diagonal are Real. Hermitian Matrices are also easy to recognize by inspection.
- Not only Hermitian Matrices can be diagonalized. They have a superset, known as Normal Matrices, such that A A^{*} = A^{*} A. Normal Matrices can be diagonalized.
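These facts can be sketched numerically (assuming NumPy; the function name `complex_inner` and the matrix ‘A’ are my own, for illustration). The complex Inner Product conjugates one of the two vectors, and a Hermitian Matrix equals its own Conjugate Transpose, which also makes it Normal:

```python
import numpy as np

def complex_inner(u, v):
    # Sum of each element of u times the conjugate of the
    # corresponding element of v.
    return np.sum(u * np.conj(v))

u = np.array([1 + 1j, 2 - 1j])
v = np.array([3 + 0j, 0 + 1j])
print(complex_inner(u, v))

A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])         # Hermitian: real main diagonal
A_star = A.conj().T                 # the Conjugate Transpose A^*

print(np.allclose(A, A_star))               # A = A^*: True
print(np.allclose(A @ A_star, A_star @ A))  # Hermitian implies Normal: True
```

NumPy's own `np.vdot(v, u)` computes the same quantity, since it conjugates its first argument.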

This could all become important in Quantum Mechanics, considering the general issue known to exist, by which the bases that define how particles can interact somehow need to be multiplied by complex numbers, in order to describe accurately how particles do interact.

Apparently, finding complex Eigenvalues for a matrix converts the entire problem into one where the Eigenvectors and the (Unitary) Perpendicular Matrices all have complex elements. But the problem can still be marched through by conventional means: finding the Characteristic Equation, determining its roots nonetheless, substituting the roots, and then finding the non-trivial parametric solutions.

The new form for computing the Inner Product needs to be used, to normalize each of the basis vectors of ‘P’.

And, the complex-numbered diagonalization becomes

P^{*} M P = D .
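The complex case can also be sketched (assuming NumPy; the Hermitian matrix ‘M’ is my own example). `np.linalg.eigh` accepts complex Hermitian input and returns a Unitary ‘P’ whose columns are already normalized under the complex Inner Product:

```python
import numpy as np

M = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])         # Hermitian, so real Eigenvalues

eigenvalues, P = np.linalg.eigh(M)  # P is Unitary

P_star = P.conj().T                 # the Conjugate Transpose P^*
D = P_star @ M @ P                  # P^* M P = D

print(np.allclose(P_star @ P, np.eye(2)))   # P is Unitary: True
print(np.allclose(D, np.diag(eigenvalues))) # D is Diagonal: True
```

For this ‘M’ the Eigenvalues are 1 and 4, both Real, even though the Eigenvectors have complex elements.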

If I may make the bold assumption that the Diagonal Matrix which results is itself a Hermitian Matrix, and will still consist of real elements along its main diagonal and zeroes elsewhere, I would propose an intuition. (This assumption does hold whenever ‘M’ is Hermitian, since the Eigenvalues of a Hermitian Matrix are Real; for a matrix that is merely Normal, the diagonal entries can themselves be complex.)

The rows and columns of the matrices being used in QM *misidentify* the real properties of the particle. The fact that they contain complex numbers suggests that the number of elements is correct, but that the real properties, hinted at by a potential Diagonal Matrix, are somehow askew from the assumed ones.

The real properties of the particles may be derivable, if the set of equations known to Quantum Mechanics could be Diagonalized.

Yet, this proposition might not get received well, because it also suggests that the *Quantum Numbers* are wrong.

Dirk
